2017 Agenda | October 10-11, 2017
Get the latest on imaging and auto sensors, autonomous vehicles, AI, and more.
IS Auto Americas 2017 | Day 1
IS Auto Americas | Chair’s introductory remarks
Dragos Maciuca | Technical Director, Palo Alto Research and Innovation Center of Ford Motor Company
How the traditional auto industries and sensor industries are being disrupted: near – medium – long term
Dr. Sven A. Beiker | Founder and Managing Director of Silicon Valley Mobility
- Analysis of the auto companies that have come to Silicon Valley – Chinese / European / Japanese / Korean. What insight does it give into the momentum, movement and disruption of traditional supply chain?
- New OEMs, new T1s, new autonomous driving start-ups – how is the industry re-shaping; how are traditional business models being challenged and adapted?
- Analysing new partnerships and M&As
Euro NCAP: In pursuit of Vision Zero
Richard Schram | Technical Manager of Euro NCAP
The presentation will show the near-, medium- and long-term requirements Euro NCAP is developing, as well as an overview of the achievements so far.
Advanced sensor suite requirements for various levels of autonomous driving
Boris Shulkin | Vice President, Research and Development of Magna International Inc.
Panel discussion: Sensor requirements for autonomy
Chair: Dragos Maciuca, Technical Director, Palo Alto Research and Innovation Center, Ford Motor Company | Panelists include Kleiner Perkins Caufield & Byers and an independent engineering leader and angel investor
- Steven Hong, Partner, Kleiner Perkins Caufield & Byers
- Darren Liccardo, Engineering Leader and Angel Investor
Morning Networking Break
Camera monitor system requirements derived from human perception and observations
Dr Markus Adameck | General Manager Development of Panasonic Industrial Europe GmbH
- Requirements for indirect camera vision and passive camera sensing
Startup automotive OEMs – opportunities & risks for suppliers
Peter Hasenkamp | Director of Supply Chain of Lucid Motors
- How startup OEMs work differently vs. traditional large automotive OEMs
- What advantages automotive suppliers can gain from working with startups
- The path to autonomy
Industry Keynote panel discussion
Chair: Brendan Hermalyn, Camera Lead, Waymo | Panelists include Avis Budget Group, Lucid Motors, and Panasonic
Insight into how the automotive industry is changing and how the broader supply chain, and the sensor community in particular, might plan and respond to industry and consumer needs.
- Peter Hasenkamp, Director of Supply Chain, Lucid Motors
- Arthur Orduña, Executive Vice President and Chief Innovation Officer, Avis Budget Group
- Markus Adameck, General Manager Development, Panasonic Industrial Europe GmbH
IS Auto Americas 2017 | Track A
Ubiquitous usage of optical flow sensors
Ramkrishna Swamy | Senior Applications Engineering Manager of Samsung Semiconductor Inc.
With the sudden explosion of autonomous concepts in automobiles, technologies providing real-time motion, depth mapping, and tracking have become increasingly important. Dynamic Vision Sensor (DVS) technology has evolved to address the challenging requirements of ADAS, offering an optimal balance between resolution, speed and power consumption. The DVS supplements other sensing technologies to enable high-performance system solutions and can be extended for use in multiple automotive scenarios.
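The core behaviour of a Dynamic Vision Sensor described above can be illustrated with a minimal sketch: each pixel fires an asynchronous event only when its log-intensity change exceeds a contrast threshold, rather than streaming full frames. This is an editorial illustration of the general DVS principle, not Samsung's implementation; the threshold value and frames are assumptions.

```python
import numpy as np

def dvs_events(prev_frame, curr_frame, threshold=0.2):
    """Emit (row, col, polarity) events where the log-intensity change
    exceeds the contrast threshold, as a DVS pixel would."""
    # Work in log-intensity space; a small epsilon avoids log(0).
    dlog = np.log(curr_frame + 1e-6) - np.log(prev_frame + 1e-6)
    rows, cols = np.nonzero(np.abs(dlog) > threshold)
    polarity = np.sign(dlog[rows, cols]).astype(int)
    return list(zip(rows.tolist(), cols.tolist(), polarity.tolist()))

# A single pixel brightens sharply; only that pixel emits an event.
prev = np.full((4, 4), 100.0)
curr = prev.copy()
curr[2, 1] = 160.0            # log(1.6) ~ 0.47 > threshold
events = dvs_events(prev, curr)
print(events)                 # → [(2, 1, 1)]
```

Because only changing pixels produce output, data rate and power scale with scene activity rather than resolution, which is the balance the abstract refers to.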
In cabin monitoring systems: challenges & solutions
Dr. Petronel Bigioi | Senior Vice President Engineering & General Manager of FotoNation
- Exploring the requirements for the next generation of driver monitoring systems and the transition to in-cabin monitoring
- Looking at multi-spectral analytics and sensor fusion technologies to overcome current-generation limitations
- Evolution of in-cabin monitoring solutions in the context of driving automation levels
3D LIDAR: The key to mass-market adoption of autonomous transportation
Anand Gopalan | CTO of Velodyne LiDAR, Inc.
Three-dimensional LIDAR technology has been instrumental in ushering in the autonomous mobility revolution, with today’s 3D LIDAR sensors positioned as an integral element of the sensor suite for most self-driving cars. However, as we move from a small number of test fleets to mass deployment across multiple segments, dollar cost and energy consumption will start to come into focus as key hurdles. The presentation will discuss how these hurdles are being addressed to enable a quantum leap in cost and compute efficiency across the autonomous system.
Afternoon Networking Break
Where does LiDAR fit in the ADAS, automated driving and autonomous vehicle roadmap?
Nicholas Gagnon | Business Development – Automotive of LeddarTech
In the next 5-10 years we will see more ADAS systems and a significant ramp-up of breakthrough Highly Automated Driving (HAD) systems, using several types of sensors, implemented in passenger, recreational and commercial vehicles.
Cameras, radars, LiDAR, ultrasonic technologies, and their fusion will contribute greatly to increasing safety on the road for all users.
This presentation will review where LiDAR technology fits in this roadmap and the value it delivers. The requisite features and performance aspects of LiDAR sensors requested and envisioned by OEMs and Tier-1s will also be discussed.
Development of remote parking system using smartphone to realize driverless parking
Dr. Subrata Kundu | Senior Researcher & Team Leader Automotive Products Research Laboratory Global Center for Social Innovation (CSI-NA) of Hitachi America, Ltd.
With the advancement of autonomous driving (AD) technology, the development of remote parking systems has gained significant attention, with the aim of realizing a practical autonomous vehicle. This presentation focuses on next-generation ADAS and a remote parking system, including driverless parallel and perpendicular parking as well as exiting, all performed using a smartphone.
Automotive multi-mode MIMO cascaded radar and satellite radar data processing system
Stanley Liu | Senior ADAS Application Engineer, Radar Fusion Tech Lead of Texas Instruments
Radars play a major role in providing sensing capabilities for active safety automotive applications and autonomous cars. Multi-transmitter and receiver radar systems are becoming popular in order to detect and classify objects in complex urban driving scenarios. This presentation will describe hardware and software modules for a multi-mode cascaded radar data processing system and a satellite radar processing system. We derive system configuration and data processing requirements based on current state of the art and future MIMO radar processing requirements. The proposed system is able to meet these requirements with ≤75% AWR12x and < 60% TDA2x utilization.
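The MIMO principle behind such multi-transmitter systems can be sketched briefly: each transmitter/receiver pair acts as one virtual antenna element located at the sum of the TX and RX positions, so Ntx transmitters and Nrx receivers synthesize an Ntx × Nrx element virtual aperture. The element positions below are illustrative assumptions, not a specific AWR12x layout.

```python
import numpy as np

# Positions along the azimuth axis, in units of half-wavelength.
tx = np.array([0.0, 4.0, 8.0])        # 3 transmitters, widely spaced
rx = np.array([0.0, 1.0, 2.0, 3.0])   # 4 receivers, closely spaced

# Each TX/RX pair contributes a virtual element at (tx + rx).
virtual = (tx[:, None] + rx[None, :]).ravel()
print(sorted(virtual.tolist()))
# 3 x 4 = 12 virtual elements forming a contiguous 12-element aperture (0..11)
```

The larger virtual aperture is what improves angular resolution enough to separate and classify objects in dense urban scenes without physically adding 12 receive chains.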
Laser Diode Solutions for 3D Depth Sensing LiDAR Systems
Tomoko Ohtsuki | Product Line Manager of Lumentum
LiDAR is a required sensor for Level 3 and 4 ADAS applications. The illumination source for LiDAR systems significantly impacts not only performance, but also reliability and cost. A variety of 3D depth sensing (3DS) diode laser illuminators developed by Lumentum have been adopted and advanced in the consumer electronics market with a solid track record of reliability and quality. New high-power VCSEL array technologies are enabling the more demanding, high-volume 3DS applications. We will present the robustness of 3DS cameras based on VCSEL array illuminators against sunlight, a temperature range of -40°C to 115°C, and other environmental stresses required for automotive-qualified products.
Chair's Closing Remarks for the Day
Evening Networking Reception
IS Auto Americas 2017 | Track B
Essential tools of self-driving ecosystem
Lorant Pocsveiler | Head of US Office of AImotive
AImotive is developing a hardware-agnostic, scalable solution for enabling fully autonomous self-driving cars, relying on cameras as primary sensors to accomplish the essential tasks of object recognition and classification, localization, decision making and trajectory planning. The presentation will provide insight into the toolkit developed for training and verification of the full software stack, including but not limited to calibration, data collection, semi-supervised annotation and simulation.
Enabling the upcoming shift to low-level (a.k.a. early, raw) sensor fusion
Ben Landen | Director of Business Development of DeepScale
- High-level (a.k.a. late) sensor fusion: how the automotive industry arrived at today’s paradigm
- How low-level sensor fusion creates a system of sensors that’s better than the sum of its parts, producing higher quality perception required by autonomous vehicles
- Creating hardware flexibility and system robustness through neural networks
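The contrast between the two fusion paradigms in the bullets above can be shown with a toy sketch: when each sensor thresholds its own detections first (late fusion), weak but corroborating evidence is discarded; fusing the raw evidence before thresholding (early fusion) recovers it. All values and the 1-D grid are illustrative assumptions, not DeepScale's method.

```python
import numpy as np

THRESH = 1.0
# Per-cell detection evidence from two sensors over a 1-D spatial grid.
# Neither sensor alone crosses the detection threshold at cell 1.
camera_evidence = np.array([0.1, 0.6, 0.1, 0.0])
radar_evidence  = np.array([0.0, 0.7, 0.2, 0.1])

# High-level (late) fusion: threshold per sensor, then merge object lists.
late = np.union1d(np.nonzero(camera_evidence > THRESH)[0],
                  np.nonzero(radar_evidence > THRESH)[0])

# Low-level (early) fusion: combine raw evidence, then threshold once.
early = np.nonzero(camera_evidence + radar_evidence > THRESH)[0]

print(late.tolist(), early.tolist())   # → [] [1]
```

Late fusion misses the object entirely, while early fusion detects it at cell 1: the "better than the sum of its parts" effect, at the cost of needing aligned raw data from all sensors.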
Developing robust ADAS: deep learning at the edge
Bruno Fernandez-Ruiz | CTO / co-founder of Nexar
Robust driving policy models depend on large training datasets exposing the true diversity of the real world. Current approaches are limited to models trained using homogeneous data from a small number of vehicles running in controlled environments. This presentation will discuss the introduction of a network of connected devices building an end-to-end driving policy which can leverage the 10 trillion miles driven every year.
Afternoon Networking Break
An ecosystem approach to ADAS sensing
John Wheatley | Division Scientist, Display Materials & Systems Division of 3M
ADAS sensing challenges are multifaceted, including signal to noise, object identification, computational power, and design integration. A key to addressing these problems is an ecosystem approach taking into account the light source, sensor, the object that is being sensed, and synergistic enhancements to all. Included will be technologies that can make both the machine readable signs and the optical sensors themselves invisible to humans. This presentation will discuss the use of this technology in automotive design enabling optical codes for remote sensing, and the potential implications to training of neural networks.
Autonomous driving needs raw data fusion
Youval Nehmadi | CTO of VAYAVISION
In this session we will present how raw data sensor fusion provides the advanced perception needed for self-driving. Using both camera and LiDAR to generate a high resolution 3D RGBd environment model we overcome the limitations of each sensor separately. The session will include comparisons between the methods in real life conditions.
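The camera/LiDAR combination described above can be sketched minimally: project a 3-D LiDAR point into the image with a pinhole camera model and attach the pixel's colour, yielding an "RGBd" point that carries both appearance and depth. The intrinsics, image, and point below are illustrative assumptions, not VAYAVISION's pipeline.

```python
import numpy as np

# Pinhole intrinsics for a tiny 4x4 image (fx, fy, cx, cy are assumptions).
K = np.array([[100.0,   0.0, 2.0],
              [  0.0, 100.0, 2.0],
              [  0.0,   0.0, 1.0]])
image = np.zeros((4, 4, 3), dtype=np.uint8)
image[2, 2] = (255, 0, 0)           # a red pixel at (row=2, col=2)

point = np.array([0.0, 0.0, 10.0])  # LiDAR point on the optical axis, 10 m ahead
uvw = K @ point                     # homogeneous pixel coordinates
u, v = uvw[:2] / uvw[2]             # perspective divide -> (2.0, 2.0)

# Attach colour to depth: one fused RGBd sample.
rgb = image[int(round(v)), int(round(u))]
rgbd = (*rgb.tolist(), point[2])
print(rgbd)                         # → (255, 0, 0, 10.0)
```

Done densely over the whole point cloud (with interpolation for pixels between LiDAR returns), this produces the high-resolution 3D RGBd environment model the session refers to, with the camera supplying colour and texture and the LiDAR supplying metric depth.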
Cybersecurity considerations for autonomous vehicles sensors
Giri Venkat | Technical Marketing, Automotive and Vision of ON Semiconductor
Threats against the sensors in an autonomous vehicle represent one of several potential attack surfaces in an autonomous vehicle system. The presentation will begin with an analysis of the cybersecurity threats against the sensors in an autonomous vehicle and an examination of specific threat models. This will be followed by an overview of the attack surface of a typical autonomous vehicle sensor and an exploration of the specific attack vectors. Finally, methods to harden sensors against these attack vectors will be presented along with a review of best practices.
Chair's Closing Remarks for the Day
Evening Networking Reception
IS Auto Americas 2017 | Day 2
IS Auto Americas | Day 2 Remarks
System engineering considerations in rugged/automotive camera module design
Jerome Barczykowski | Product Line Manager – Autonomous Systems of D3 Engineering
Performance-critical imaging systems require a comprehensive approach to provide a solution that meets or exceeds multiple competing criteria. The best design methodology for a team involved in these applications is to follow a system engineering process. This presentation will outline how to apply system engineering concepts to rugged/automotive camera module design. An overview of the approach, design trade-offs, and manufacturing techniques, supported by real-world examples, will allow the audience to leave with concrete steps to take into their own designs.
360° Surround vision sharper than your own eyes
Patrice Roulet | Director of Engineering and Co-Founder of Technology of Immervision
The automobiles, trucks, buses and farm vehicles of tomorrow are evolving: getting connected, smarter, even autonomous. Yet their vision remains limited, mainly due to the use of restrictive old-generation lenses. A new generation of super wide-angle lenses capturing the surroundings in full 360° increases the performance of ADAS technologies and beyond, enabling disruptive use cases for automotive video cameras. This new generation of wide-angle lenses, customisable like different organic eyeballs, will become an essential building block in the car's evolution. In this talk, we will introduce the automotive industry's unique needs and the impact of wide field-of-view lens requirements such as Chief Ray Angle, large temperature range, and low F#. We will explore the fundamental contributions of panomorph optic technology compared to other alternatives. We will conclude with multiple applications where 360° technologies will shape the upcoming world of safety, sharing and freedom.
Optical challenges and opportunities in the auto sector - next generation lenses and more
Chair: Brendan Hermalyn, Camera Lead, Waymo | Panelists include Navitar and Theia Technologies
What are the unique requirements for sensors in the autonomous vehicle market and what are the implications for lens requirements?
- Michael Thomas, President, Navitar
- Jeff Gohman, Co-Founder and President, Theia Technologies
Morning Networking Break
Viewing Safety: An Integrator’s Perspective
Patrick Shehane | Senior Director of Imaging and Software of Intel Corporation
AI and self-driving cars
Tim Wong | Technical Marketing for Autonomous Vehicles of NVIDIA Automotive
NVIDIA will discuss the technology behind self-driving cars, specifically the different neural networks that allow the car to recognize its environment, determine where it is, detect traffic signs and signals, and decide how best to manage its path forward, all while keeping everyone safe.
Investing in the building blocks to optimise range, resolution, intelligence and cost for next generation sensors. Where to place your bets in the components stack?
Panelists include: Kleiner Perkins Caufield & Byers
- Steven Hong, Partner, Kleiner Perkins Caufield & Byers
Presentation followed by:
- Showcasing the most promising sensor and machine vision technologies which will have a real impact on the auto sector
Afternoon Networking Lunch
How should sensor hardware and software suppliers navigate the C.A.S.E. trends – Connected, Autonomous, Shared mobility, Electrified?
Alexandre Marian | Director, Automotive Practice of AlixPartners
- C.A.S.E. trends are completely revolutionizing the automotive industry and suppliers should adapt to survive
- Connected: suppliers have a role to play in supporting OEMs getting to the next level and helping mitigate cybersecurity threats
- Autonomous: biggest disruption in our lifetime and only a few players will prevail
- Electrification: China is already the largest producer of electric vehicles and is the true “Giga Factory”
- Quality and warranty risks: new technologies account for 50% of quality issues
- Analysing strategic M&A transactions driven by C.A.S.E.
- Supplier takeaways: how to navigate this transition
A scalable approach to providing cognition for self-driving cars
Sravan Puttagunta | CEO and Co-founder of Civil Maps
Extracting context from the vehicle's environment is one of the major challenges to achieving autonomous driving. While this can be accomplished in highly controlled scenarios today, scalable solutions are not yet deployed. In this talk, we explore the crucial role of HD Semantic Maps in efficiently providing cognition to autonomous vehicles. We look at innovations in “marrying” the HD map space (long term memory) with the sensor space (sensor memory) and how to approach this with advanced localization in 6 degrees of freedom (6DoF), machine vision, and smart compression technology.
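The 6-degrees-of-freedom localization mentioned above can be sketched with a single homogeneous transform: a 6DoF pose (roll, pitch, yaw plus x, y, z) maps between the map frame and the vehicle frame, letting a landmark stored in the HD map be expressed in the vehicle's sensor space. The poses and landmark below are illustrative assumptions, not Civil Maps data.

```python
import numpy as np

def pose_matrix(roll, pitch, yaw, x, y, z):
    """Build a 4x4 vehicle-to-map transform from a 6DoF pose (ZYX convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = (x, y, z)
    return T

# Vehicle at (10, 5, 0) in the map, heading rotated 90 degrees left (yaw).
map_T_vehicle = pose_matrix(0, 0, np.pi / 2, 10.0, 5.0, 0.0)

# A landmark from the HD map (long-term memory), in homogeneous coordinates.
landmark_map = np.array([10.0, 15.0, 0.0, 1.0])

# Inverting the pose moves the landmark into the vehicle/sensor frame.
landmark_vehicle = np.linalg.inv(map_T_vehicle) @ landmark_map
print(np.round(landmark_vehicle[:3], 6))   # → [10. 0. 0.]: 10 m straight ahead
```

Once map landmarks land in the sensor frame this way, they can be matched against live detections, which is the "marrying" of long-term map memory with short-term sensor memory the talk describes.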
Autonomous vehicles - extending the sensory horizon
Anshuman Saxena | Director, Product Management and Product Line Manager of Qualcomm Technologies, Inc.
Critical components of full autonomy include accurate characterization of drivable space, precise localization in the “road world,” real-time path planning and driving behavioural analysis. Advances in computer vision and sensor fusion algorithms, as well as optimized embedded implementations of deep neural networks are needed to achieve the stringent requirements of full autonomy with inexpensive automotive grade sensors and on-board computation within the power budget / thermal envelope. Low latency and reliable vehicle-to-vehicle (V2V), and vehicle-to-infrastructure communication (V2X), enhances perception by extending the vehicle’s sensory horizon, resulting in improved safety.
Afternoon Networking Break
Building Trustworthy Automotive Image Sensors for Autonomous Vehicles (AV)
Munir Haque | Director, Automotive Functional Safety, Automotive Innovation Centre of Sony Electronics Inc.
Image sensors are the eyes of a highly autonomous vehicle and the more we move towards level 5 autonomy (self-driving cars), the more the ECUs will rely on sensing components including image sensors to make self-driving decisions. Undetected faulty sensing may lead to wrong driving decisions and may cause significant hazard to human life. Furthermore, since driving decisions will be made by intelligent systems, and not by humans, unauthorized images acquired by the system may also lead to significant hazard to human life. This presentation describes the key challenges and requirements for an automotive image sensor to be considered trustworthy and provides possible recommendations for each requirement to be fulfilled.
From ADAS to automated driving: LIDARs and sensor fusion advancing ADAS architectures. Projections to 2023
Akhilesh Kona | Senior Analyst, Automotive Electronics & Semiconductor of IHS Markit Technology
- OEMs' adoption of ADAS architectures in 2017
- Key requirements for next-generation architectures (L3 and above)
- Technologies enabling low-cost automated driving architectures
- Forecast scenario for automated driving electronics
Closing Keynote Presentation
Huei Peng | Roger L. McCarthy Professor of Mechanical Engineering and Director of Mcity of University of Michigan
Despite continued progress in technology, driver education and enforcement, the number of fatalities and injuries caused by ground vehicles remains high. In the US, about 35,000 people were killed in 2015, and worldwide the number is more than a million. Technologies supporting the development of automated and connected vehicles have the potential to improve motor vehicle safety and dramatically impact congestion, energy consumption, and mobility. In this talk, key recent developments will be summarized, including activities at Mcity.
Closing panel: Routes to and requirements for autonomous vehicles - what can be done to enhance collaboration through the sensor and broader, interconnected ecosystem?
- Alvin K. Wong, Vice President, Automotive Innovation Center, Sony Corporation
- Joshua Wise, Image Signal Processor Architect, NVIDIA
- Mayank Mangla, ADAS Imaging Architect, Texas Instruments
- Alexandre Marian, Director, Automotive Practice, AlixPartners