Top 5 Myths in Automotive Vision: Designing Embedded Vision Systems Is Easier Than You Think


Vision has always occupied a special place in information science and popular culture. One need not be an engineer to appreciate the vast bandwidth of normal human vision. Most people understand that the saying "a picture is worth a thousand words" is shorthand for the rapid assimilation and interpretation of huge amounts of raw data. Accordingly, the myth arises that building a vision system must be too complex to contemplate seriously.

Indeed, engineers, product planners and company executives are often too quick to accept any number of reasons for avoiding development of any kind of embedded vision application, much less those associated with mission-critical automotive applications. Despite rapid advancements in embedded vision markets and technology, as well as the ready availability of supporting products and services for automotive vision solutions, some myths still persist.

Myth 1: It's just for driver rear-view vision

Rear-view cameras might be the most familiar application of video systems in vehicles, but opportunities for vision systems abound for enhancing vehicle safety and informatics capabilities (Fig. 1). Embedded vision systems integrate high-resolution image sensors with powerful processing hardware and sophisticated software capable of object detection, recognition and tracking. This combination of imaging hardware and software provides the underlying foundation for high-speed detection and recognition of pedestrians, other vehicles, traffic signs, lane obstructions, lane departure and any number of related applications.


Fig. 1: Automotive vision applications extend well beyond familiar rear-view monitors. (Source: Xilinx)
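To make the tracking stage of such a pipeline concrete, the sketch below associates detections between video frames by nearest centroid. This is a minimal illustration only; the function name and parameters are invented for this example, and production systems use far more robust association methods such as Kalman filtering.

```python
import math

def match_detections(tracks, detections, max_dist=50.0):
    """Greedy nearest-centroid association of new detections to known tracks.

    tracks:     dict of track_id -> (x, y) last-known centroid
    detections: list of (x, y) centroids found in the current frame
    Returns a dict mapping track_id -> detection index, or None if no
    detection lies within max_dist pixels of that track.
    """
    assignments = {}
    unused = set(range(len(detections)))
    for tid, (tx, ty) in tracks.items():
        best, best_d = None, max_dist
        for i in unused:
            d = math.hypot(detections[i][0] - tx, detections[i][1] - ty)
            if d < best_d:
                best, best_d = i, d
        assignments[tid] = best
        if best is not None:
            unused.discard(best)
    return assignments
```

For example, two tracks last seen at (10, 10) and (100, 100) would be matched to new detections at (12, 11) and (98, 103) respectively, since each lies within a few pixels of its track.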

Myth 2: The technology's not there yet

For automotive safety applications, embedded vision requires real-time image processing capable of detecting, recognizing, classifying and tracking hazards or potential hazards, whether road obstructions, lane departures, other vehicles or pedestrians. Furthermore, to activate vehicle safety capabilities, vision data must be processed and distributed to high-level vehicle control systems such as those responsible for steering, braking and acceleration.

Despite the complexity of these systems, the individual elements required to build these solutions are readily available or even already in place. Existing standards such as CAN and Ethernet AVB (Audio/Video Bridging) provide the communications backbone needed to distribute video, data and control operations across different vehicle subsystems.
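As a rough illustration of the CAN side of that backbone, the sketch below packs a classic CAN 2.0 frame into the 16-byte layout used by Linux SocketCAN (32-bit identifier, length byte, padding, 8 data bytes). The layout assumption is based on Linux's `struct can_frame`; a real node would write these bytes to an AF_CAN socket, and the message ID and payload here are purely hypothetical.

```python
import struct

def pack_can_frame(can_id, data):
    """Pack a classic CAN 2.0 frame into Linux SocketCAN's 16-byte format:
    32-bit CAN ID, 8-bit data length, 3 padding bytes, 8 data bytes."""
    if len(data) > 8:
        raise ValueError("classic CAN payload is at most 8 bytes")
    # "<IB3x8s": little-endian u32 ID, u8 length, 3 pad bytes,
    # payload null-padded to 8 bytes
    return struct.pack("<IB3x8s", can_id, len(data), bytes(data))
```

A two-byte status message on a hypothetical ID 0x123 thus packs to exactly 16 bytes, ready to be handed to the socket layer.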

Similarly, it is a myth that automotive vision systems require some sort of satellite surveillance-quality imaging capability. In fact, automotive applications dictate a broad range of image sensor requirements that lie well within the capabilities of commonly available devices (Fig. 2).


Fig. 2: Automotive vision applications present a broad range of image sensor requirements that lie well within the capabilities of commonly available devices.

 

For the underlying computing power, developers can find high-performance processors that combine multiple types of cores in heterogeneous architectures able to handle not only general purpose applications software but also the real-time processing required for this environment. For more demanding video requirements, specialized video processors leverage internal processing pipelines designed to outperform general purpose processors. In fact, devices such as the Analog Devices BF609 Blackfin® processor combine DSP cores with Analog Devices' specialized PVP (pipelined vision processor) to accelerate image processing algorithm execution. Furthermore, hybrid devices such as the Xilinx Zynq® 7000 All Programmable SoC combine general purpose ARM Cortex™-A9 cores with an FPGA fabric designed to support very high speed, hardware-based custom data processing pipelines required for specialized vision applications.

At the same time, development environments offer built-in support designed to accelerate design and enhance productivity. For example, the Xilinx Vivado™ tool chain provides developers with the capabilities needed to rapidly deploy high performance embedded vision systems that combine the Xilinx Zynq 7000 All Programmable SoC, third-party IP and their own proprietary algorithms. High-level synthesis tools in the Vivado tool chain allow engineers to quickly implement their performance critical C-based algorithms or OpenCV functions as hardware in the Zynq 7000's FPGA fabric.

Myth 3: You have to have a Ph.D. in image processing

The myth that algorithm complexity will eventually subvert any automotive vision development project dates back to when real-time object detection and recognition lay strictly in the research domain. Worse, the lack of embedded processing power available at that time forced researchers to work around computational limitations that simply do not exist in today's high-performance embedded hardware platforms.

The notion of extreme complexity often persists today despite the wide availability of software for image processing. For example, MATLAB's Computer Vision System Toolbox and the open-source OpenCV library provide pre-built, tested functions covering capabilities ranging from basic image manipulation to object recognition and tracking.
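For a feel of what such libraries pre-package, the sketch below computes a Sobel gradient magnitude by hand on a grayscale image stored as nested lists. Library routines such as OpenCV's `cv2.Sobel` implement the same convolution, but optimized and hardware-aware; this naive version exists only to show how little mathematics is involved.

```python
def sobel_magnitude(img):
    """Naive Sobel gradient magnitude on a 2D grayscale image (list of lists).
    Border pixels are left at zero for simplicity."""
    h, w = len(img), len(img[0])
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(gx_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(gy_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

Run on an image with a sharp vertical edge (a column of 0s next to a column of 255s), the interior pixels along the edge light up with large gradient magnitudes while flat regions stay at zero.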

Myth 4: It's too expensive to get started

The flip side of the software myth is the notion that embedded vision hardware subsystems are simply too costly. In fact, developers can find complete low-cost systems such as Avnet’s Blackfin® Embedded Vision Starter Kit (Fig. 3). Priced at $299, the kit combines the FinBoard development board with a full complement of software development tools and accessories required to build sophisticated vision applications. Based on the Analog Devices Blackfin BF609, the kit enables developers to explore sophisticated imaging applications, relying on the BF609's integrated PVP to accelerate execution of image processing algorithms. Included with the kit, the CrossCore™ Embedded Studio development suite and ICE-100B In-Circuit Emulator help speed design and debug of these systems.


Fig. 3: Avnet’s Blackfin Embedded Vision Starter Kit includes a full complement of hardware and software needed to build sophisticated embedded vision applications.

Myth 5: There aren't enough resources to help

For developers and companies looking to explore automotive image processing applications, perhaps the most important fact is the breadth and depth of resources available to help them design and optimize these systems. Automotive vision is a strategic market for a growing group of IC and board manufacturers, and each of the leading manufacturers offers specialized assistance in specifying and designing these systems. Furthermore, developers can find assistance with embedded vision design through OpenCV, with its community of some 47,000 developers, and through the FinBoard community, which supports the Blackfin Embedded Vision Starter Kit. Engineers beginning to explore vision systems can also take advantage of a growing number of workshops and seminars on the topic. Along with presentations at professional conferences such as the Embedded Vision Summit, engineers can find local presentations such as the Smarter Vision Design Seminar and Workshop.
