2016 Agenda

IS Auto is a unique event, combining the automotive and image sensor communities to develop new commercial opportunities and address key technical challenges. See nearly 30 presentations from industry leaders in both sectors.

Image Sensors Auto Americas 2016

  1. Registration for Image Sensors Auto US 2016

OEM requirements for next generation sensor technology

  1. Keynote Presentation: The Present and Future of Sensors for Intelligent Vehicles

    Dr. Steve Eglash | Executive Director of Stanford AI Lab-Toyota Center for Artificial Intelligence Research

    Intelligent vehicles must execute the entire loop from sensing to perception to action.  This requires sensors and machine intelligence.  Existing sensors provide imaging, range, and other information.  In the future new technologies will enable more-capable sensors at lower cost for improved autonomy, safety, and efficiency.  This talk will describe the trajectory of sensor technology and how new sensors and machine learning algorithms will impact the automobile industry.

  2. The Connected Car and The Future Data Market Place: A Panel Featuring Ford, Volvo, HERE, and Qualcomm

    Amer Aijaz | Senior Director of Volvo Cars R&D Center

    •        Creating valuable data from connected vehicles
    •        Developing business models that work for the whole ecosystem
    •        Technical requirements from the sensor community

    Panelists include:

    • Amer Aijaz, Volvo
    • Xinzhou Wu, Qualcomm
    • Dragos Maciuca, Ford
    • Dr. Ivan Sucharski, HERE
  3. ADAS sensor architecture evolution towards autonomous cars

    Akhilesh Kona | Senior Analyst, Automotive Electronics & Semiconductor of IHS Markit Technology

    An autonomous car is the main vision of the automotive industry for the next decade. Implementing intermediate milestones through advanced driver assistance system (ADAS) functions is essential to achieving a fully autonomous car. From the industry's point of view, realizing the new technology at reasonable cost is a key requirement for deploying autonomous cars at large scale. As a result, ADAS sensor architectures are evolving to support the implementation of growing autonomous functions on car platforms. This presentation will provide a comprehensive overview of how ADAS sensor architectures have evolved across different vehicle segments during the past decade, and of the sensor architectures for the car of the future -- will they be "smart" or "dumb"? The talk will also focus on the key semiconductor technologies enabling newer ECU architectures, the challenges related to sensor fusion, and the consequent impact on the automotive supply chain.

  4. Image sensor designs with system challenge considerations

    Alvin K. Wong | Vice President, Automotive Innovation Center of Sony

    Having a strong system understanding can provide clarity on key IS specifications and determine which features work best together. With this approach, innovation in IS architectural design is a must. Sony has adopted this design philosophy and is innovating in other areas to bring the highest-performing solutions to its customers.

  5. Morning Networking Break

    Morning Refreshments

  6. Improving vehicle safety through Advanced Driver Assistance Systems

    Tim Johnson | Director of NHTSA’s Vehicle Research and Test Center

    The presentation will focus on the potential for advanced driver assistance systems to enhance safety by reducing crashes, injuries, and fatalities, and will discuss the various sensor technologies used in these systems.

  7. Next generation ADAS: current and future innovations in ADAS for next-generation cars

    Dr. Subrata Kundu | Senior Researcher & Team Leader Automotive Products Research Laboratory Global Center for Social Innovation (CSI-NA) of Hitachi America, Ltd.

    • Customer demands on next-generation ADAS
    • Managing the shift in focus from sensor-based systems to centralized controller-based systems
    • Industry and society's transition to autonomous driving
  8. Case study: Shaping the future - army robotics and autonomous systems

    Jeff Ernat | Large Platform Autonomy Team Leader for the Ground Vehicle Robotics (GVR) of U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC)

    •  Introduction to the application, integration and sensor requirements
    •  Advantages of autonomous driving
    •  Initial challenges and how they were overcome
    •  Partnership approach
    •  What next?
  9. Showcasing the most promising pre-deployment image sensor and machine vision technologies which will have a real impact in the automotive sector: Featuring TriLumina, Sirab Technologies, Princeton Lightwave and SMARTwheel Inc

    Liz Kerton | Executive Director of Autotech Council

    Panelists include:

    • David Abell, Founder and Chief Strategy Officer, TriLumina
    • Surya Satyavolu, Founder and CEO, Sirab Technologies
    • Yves Dzialowski, Chairman, Princeton Lightwave
    • Bob Fields, Vice President, Sales & Strategic Alliances, SMARTwheel Inc.


  10. Collaboration and standards to avoid conflicts along the road to fully autonomous vehicles

    Paul Drysch | Vice President of Business Development and GM of APAC Region of RideCell

    • Enabling collaboration between OEMs, suppliers, new mobility companies like Lyft and ZipCar, and cities, states and countries
    • The necessity for technology standards, adopted across the globe, to enable the transition to fully autonomous vehicles in a safe and timely manner
  11. Networking Lunch Break

IS Auto Americas | Track A

Track A: Imaging and sensors


  1. Understanding the automotive imaging data flow

    Joshua Wise | Image Signal Processor Architect of NVIDIA Corporation

     As automotive imaging systems become more complex and integrated, we have recently noted an explosion in the total number of components in the system that must communicate.  We present an overview of the dataflow in an automotive imaging and sensing system, from image sensor to system-on-chip, and make recommendations to ecosystem vendors to ensure compatibility and interoperability.

  2. Heterogeneous computing for computational imaging

    Kari Pulli | Senior Principal Engineer, Imaging and Camera Technologies Group of Intel, USA

    Modern computers pack many different types of parallel processors into a single chip. Typically, each processor has a very different programming model, and it is difficult to get the maximum performance out of the device for real-time automotive computer vision tasks. In this talk several technologies will be covered (OpenCL, OpenVX, some domain-specific languages) that make this task easier.

  3. Camera module characterisation and optimisation for automotive applications

    Dietmar Wueller | CEO of Image Engineering GmbH & Co. KG

    • Dynamic range and optoelectronic conversion function (OECF)
    • ISO speed (ISO 12232)
    • Measuring optics and resolution
    • White balancing
    • Colour reproduction
    • Spectral sensitivity and colour
    • Calibration on the production line
  4. Achieving HDR – without motion artifacts and in every single exposure

    Boyd Fowler | VP of Marketing of OmniVision Technologies

    Deep Well pixel technology - enabling an automotive image sensor with 16-bit dynamic range from a single exposure, free of any SNR drops or knee points

  5. Afternoon Networking Break & Refreshments

  6. Key sensor technologies for enabling next-generation Driver Monitoring systems

    Stephen Harris, Sr. Manager – Automotive Ecosystem, Image Sensor Group at ON Semiconductor

    • How Global Shutter technology, Near-IR and Functional Safety are key enablers for OEMs and Tier 1s to create effective driver monitoring solutions

    • One size does not fit all: what should OEMs and Tier 1s consider when evaluating the technology and portfolio of sensor suppliers in order to get the optimum combination of features for their specific driver monitoring requirements?

  7. Innovation in automotive camera technology, including mirror replacement and surround view

    Senya Pertsel, Senior Director Marketing, Ambarella

  8. Closing Remarks from the chairs

  9. Evening Drinks Reception

IS Auto Americas | Track B

Track B | Vision and sensor fusion

  1. Computer vision 2.0: where we are and where we're going

    Jeff Bier | President of BDTI

    Computer vision has rapidly transitioned from a research topic with few commercial applications to a mainstream technology with applications in virtually every sector of our economy. But what we are seeing today is just the beginning. In this presentation, Embedded Vision Alliance founder Jeff Bier presents an insider's view of the state of computer vision technology and applications today, and predictions on how the field will evolve in the next few years. Jeff explores the impact of game-changing technologies such as deep neural networks, ultra-low-power processors, and cloud-based vision services. He highlights new products and applications that illuminate what we can expect from visually intelligent devices in the near future.

  2. Utilization of a vision platform optimized for deep learning

    Jeff VanWashenova | Director of Automotive Market Segment of CEVA Inc., USA

    Automotive is seeing huge growth in vision applications that will lead the way to autonomous vehicles. Given the complexity of these systems, Tier-1 suppliers, OEMs, and the entire ecosystem are working with artificial intelligence and deep learning algorithms to identify objects, determine free space, and plan vehicle movements. These algorithms are new to automotive but are gaining a lot of momentum. As companies move from algorithmic research to the realization of low-power embedded solutions, it is important to have efficient hardware and software optimized for CNNs and other deep learning approaches.

  3. 3D depth-sensing technology to bring drivers the augmented cockpit

    Tim Droz | SVP & General Manager U.S. Operations of SoftKinetic

    There are many challenges for the automotive industry, in particular when it comes to comfort, safety and the move towards full vehicle autonomy. 3D depth-sensing applications for cars, such as natural interaction, context awareness and intelligent 3D mapping, will be an integral part of the solution, allowing vehicles to make sense of the activity inside the cockpit and the world around them.

    Building on our experience with depth-sensing and gesture recognition in the automotive market, we're embracing the challenge and will discuss how 3D depth-sensing is a key factor in creating the Augmented Cockpit experience, which allows the driver to spend more time on productive and fun activities rather than on driving. Augmenting the cockpit is a fundamental step towards a completely autonomous driving solution.

    In this talk, we will illustrate the first steps of this game-changing journey and explain how depth-sensing technology offers the perfect and most robust solution to immerse drivers in an augmented cockpit, where they can interact with their vehicle and its environment in a natural manner, with their very own hands.

  4. AHD™ : The strong bridge from SD to HD automotive camera

    Young-Jun Yoo | Director Strategic Marketing Department of Nextchip

    • Introducing the AHD™ (Analog High Definition) transmission system
    • Enabling the upgrade of camera systems from SD to HD/FHD and up to 4K (UHD), using existing infrastructure
    • Illustrating the key benefits: backward compatibility, cable-independent, digital image quality, long-reach, real-time operation, cost and simplicity
  5. Afternoon Refreshments

  6. What are the latest developments in LiDAR technology?

    Louay Eldada | CEO and co-founder of Quanergy Systems, Inc.

    • The shift from mechanical to solid-state LiDAR
    • Advantages of LiDAR over radar and cameras
    • Current and future applications for 3D LiDAR sensing
  7. Boosting signal-to-noise ratio in 3D depth sensing and LiDAR systems through proper selection and matching of the light source and optical filters

    Andre Wong, Director, Product Line Management, Lumentum and Markus Bilger, Senior Product Line Manager, Viavi Solutions

    One of the main challenges in developing 3D depth-sensing/LiDAR systems for the automotive world is the difficult lighting conditions, which require optimized light-source and detection-optics designs.

    NIR-based systems are sensitive to ambient light, so effective suppression of ambient light is critical. The choice of light source and filter, and their proper matching, can significantly improve signal-to-noise ratio and help enable deployment of 3D depth-sensing systems. The presentation will introduce novel, cost-effective designs optimized for the automotive market.

  8. Closing Remarks from the chairs

  9. Evening Drinks Reception

Image Sensors Auto Americas 2016 | Day 2

  1. Registration for Image Sensors Auto US 2016

Delivering multi-modal sensor systems

  1. How to eat an elephant: process approach to autonomous vehicle sensor development

    Bodo Seifert | Director, Advanced Engineering of Magna Electronics Inc.

    • Complexity in modern vehicles (20 million lines of code, up to 100 networked ECUs)
    • Problem statement: how to manage customer requirements, show traceability and prove that validation and verification cover everything
    • The answer: consistent application of the V-model through Automotive SPICE
  2. Surround view system for autonomous driving

    Dr. Philip Chen | System Engineer, Function Owner - Comfort and Driving Assistance of Valeo

    • Why we need surround view systems
    • How surround view systems can improve the performance of ADAS
    • How surround view systems fit into the autonomous driving landscape
  3. Augmented navigation

    Aaron Thompson, Director, ADAS Planning & Marketing, Harman

  4. Panel: Routes to and requirements for autonomous vehicles

    Panelists Include: Texas Instruments, Sony, Qualcomm Technologies and Ford

    • Fully autonomous niches vs gradually smarter vehicles
    • Are the barriers to adoption technical or cultural?
    • Current limitations in sensor technology and how they can be improved
    • What is the role of combined sensor data, deep learning, positioning and connectivity?
    • What can be done to enhance collaboration across the sensor ecosystem?

    Panelists include

    • Dr Xinzhou Wu, Director of Engineering, Qualcomm Technologies
    • Mayank Mangla, ADAS Imaging Architect, Texas Instruments
    • Alvin Wong, Vice President, Automotive Innovation Center, Sony
    • Dragos Maciuca, Technical Director, Ford Motor Company


  5. Morning Networking Break and Refreshments

Considerations for commercialising autonomous vehicles

  1. Insurance in the era of autonomous vehicles

    Jerry Albright | Principal, Advisory, Autonomous Vehicles and Insurance Task Force Co-Leader of KPMG’s Insurance Advisory

    The conversion to autonomous vehicles could bring about the most significant change to the automobile insurance industry since its inception. The core ingredients – including technology, consumer adoption and regulatory permission – are aligning to enable mass change. Who will the winners and losers be?

  2. Big traffic data analytics for smart mobility – what does the image sensor value chain need to keep abreast of?

    Ralf-Peter Schäfer | Vice President Traffic and Travel Information Product Unit and Fellow of TomTom, Germany

    The presentation will give insights into the big traffic data archive and statistical information, with examples of how the traffic and digital map database can be used in different areas of traffic analytics, such as traffic information, traffic planning, traffic management, geo-marketing, smart mobility and connected services for road travellers.

  3. Closing Address: State of the market for current V2X connectivity solutions - what is missing and what challenges remain?

    Ashok Moghe | Principal Engineer, Engineering of Cisco Systems, Inc.

    • V2V, V2I and V2P requirements
    • Connected Vehicle Pilots and ongoing deployments
    • Why other sensors and V2X should work together: some example cases