How All Programmable technology revolutionizes embedded vision


Autonomous driving is just the start of EV solutions

In 1982, Knight Rider brought us KITT, an artificially intelligent car that fought crime with high-tech features like embedded vision. More than 35 years later, automakers’ aspirations are racing past driver-only and assisted automation features such as blind-spot monitoring and adaptive cruise control. They have their eyes on the prize: fully autonomous driving.

In fact, IHS Automotive predicts autonomous car sales will hit 21 million by 2035.

Embedded vision (EV) replicates the human ability to take in visual information and process it in order to make decisions, except it does so with cameras, cables and CPUs, allowing machines like cars to absorb information and act on it as well.

That creates a host of design challenges:

  • High-performance demands: Enabling an embedded vision system to perform analytics in real time is a complex task. The higher the image resolution and frame rate, the more computational power is required to process the data and extract meaningful information from it. The growing challenge for designers is that this must be done faster, and with less power, than ever before, and the advent of machine learning algorithms will only exacerbate these demands (a back-of-the-envelope throughput sketch follows this list).
  • Complex programming environment: Building a design that is differentiated and responsive, yet able to immediately adapt to the latest algorithms and image sensors, creates exponential complexity. You’re left with weighty decisions, such as which tools and emerging techniques can help you build a design without compromising quality.
  • Shortened design cycles: Systems must be highly differentiated, extremely responsive and able to immediately adapt to the latest algorithms and image sensors. They must also hit the market faster than their competitors. With shortened design cycles, designers are often forced to choose between creating next-generation architectures and getting their IP to market on deadline.
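To put the performance point in perspective, here is a minimal back-of-the-envelope sketch of how raw pixel throughput scales with resolution and frame rate. The resolutions, frame rates and bytes-per-pixel figures are illustrative assumptions, not numbers from this article.

```cpp
#include <cstdint>
#include <iostream>

// Raw bytes per second a vision pipeline must ingest before any analytics run.
// All figures below are assumptions chosen only to show how the load scales.
static double raw_bandwidth_gbps(uint32_t width, uint32_t height,
                                 uint32_t fps, uint32_t bytes_per_pixel) {
    const double bytes_per_sec =
        static_cast<double>(width) * height * fps * bytes_per_pixel;
    return bytes_per_sec * 8.0 / 1e9;  // convert to gigabits per second
}

int main() {
    // 1080p at 30 fps vs. 4K at 60 fps, both at 2 bytes/pixel (e.g., YUV 4:2:2)
    std::cout << "1080p30: " << raw_bandwidth_gbps(1920, 1080, 30, 2) << " Gb/s\n";
    std::cout << "4K60:    " << raw_bandwidth_gbps(3840, 2160, 60, 2) << " Gb/s\n";
    return 0;
}
```

Even at the same pixel depth, moving from 1080p30 to 4K60 multiplies the raw data rate roughly eightfold before a single analytic operation has run, which is why fixed CPU-only pipelines struggle to keep up.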

Let’s take our autonomous driving example. This EV application, which promises to simplify a common task for people around the globe, is a deeply complex system with multiple interactions between all of its parts (a simplified code sketch of these hand-offs follows the list):

  • Sensing: processing raw frame-by-frame data via in-vehicle sensors
  • Perception: using that data for object detection, classification and positioning
  • Mapping: identifying safe driving areas and objects within a mapped area
  • Localizing: pairing information with the vehicle’s high-accuracy GPS
  • Route/path planning: determining short and long-term driving paths—including incident reaction
  • Motion planning: navigating vehicle control strategies appropriate for selected routes
  • Vehicle control: issuing braking, throttling, steering and suspension commands while driving
  • Driver interaction: providing feedback to the driver, sensing driver intent and handing off control
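One way to picture how these stages hand data to one another is as a simple processing chain. The sketch below is purely illustrative: the types and function names are assumptions invented for this article, not a real automotive API, and each stage body is a trivial stub.

```cpp
#include <vector>

// Illustrative, invented types: one per hand-off in the list above.
struct RawFrame  { /* raw pixel data from one in-vehicle sensor */ };
struct Detection { /* a classified object with an estimated position */ };
struct DriveCmd  { double brake = 0, throttle = 0, steering = 0; };

// Perception: turn raw frames into detected, classified, positioned objects.
std::vector<Detection> perceive(const std::vector<RawFrame>& frames) {
    return {};  // stub
}

// Mapping, localization, route/path and motion planning collapsed into one call.
DriveCmd plan(const std::vector<Detection>& objects) {
    return {};  // stub
}

// One tick of the loop: sensing feeds perception, perception feeds planning,
// and planning feeds vehicle control (braking, throttle, steering commands).
DriveCmd drive_tick(const std::vector<RawFrame>& sensor_frames) {
    return plan(perceive(sensor_frames));
}

int main() {
    std::vector<RawFrame> frames(4);  // e.g., one frame from each corner camera
    DriveCmd cmd = drive_tick(frames);
    (void)cmd;                        // vehicle control and driver interaction omitted
    return 0;
}
```

Every hand-off in that chain has to complete within a frame period, which is what makes the hardware/software partitioning decisions described below so consequential.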

This used to be quite a challenge: whether the vehicle is a tiny sports car or a large truck, truly autonomous driving requires a network of cameras at every corner of the vehicle.

But All Programmable SoCs have brought more clarity to this complex process.

Previous-generation ADAS systems required an external processor to implement the algorithms for image processing and analytics. Such ASSP-based architectures required proprietary interface protocols and were more challenging to customize for feature differentiation.

With the advent of All Programmable MPSoCs, software bottlenecks can be accelerated in high-performance programmable logic while retaining the reconfigurability required for rapid upgrades. Designers can choose a software-defined development flow within a familiar, Eclipse-based environment, using the C and C++ languages and leveraging hardware-optimized image processing libraries such as OpenCV to reach an optimal partition of embedded vision algorithms between software and hardware.
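As a concrete illustration of that flow, the short C++ fragment below uses standard OpenCV calls to build the kind of edge-detection front end that is a typical candidate for offload to programmable logic. It is a generic sketch rather than vendor-specific code, and the input file name is a placeholder.

```cpp
#include <opencv2/opencv.hpp>

// A minimal OpenCV pipeline: grayscale conversion, blur, then Canny edges.
// In a software-defined flow, a hot function like this is profiled in C/C++
// first and then marked for acceleration in programmable logic.
cv::Mat detect_edges(const cv::Mat& frame) {
    cv::Mat gray, blurred, edges;
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    cv::GaussianBlur(gray, blurred, cv::Size(5, 5), 1.5);
    cv::Canny(blurred, edges, 50, 150);
    return edges;
}

int main() {
    cv::Mat frame = cv::imread("camera_frame.png");  // placeholder input image
    if (frame.empty()) return 1;
    cv::imwrite("edges.png", detect_edges(frame));
    return 0;
}
```

The appeal of this flow is that the same source can stay in software on the processing system or move into programmable logic as performance demands grow, without rewriting the algorithm from scratch.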

As the auto industry transitions from ADAS to autonomous driving, ever greater degrees of sensor fusion will combine visible-light cameras, radar and LIDAR systems distributed across the vehicle and connected over high-speed serial links. Combining multiple sensor interfaces, analytics and vehicle control into one system helps designers create lower-power, higher-efficiency data paths, whether that means preventing a break-in before the first window pane is shattered or stopping a self-driving car in its tracks to avoid a collision with an impending obstacle.
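To make the fusion idea concrete, here is a minimal sketch, under assumed types, of aligning the latest camera, radar and LIDAR samples by timestamp before they feed a single analytics and control path. None of these structures come from a real automotive stack.

```cpp
#include <cstdint>
#include <vector>

// Illustrative, invented types: one timestamped sample per sensor modality.
struct CameraFrame { uint64_t t_us; /* pixels */ };
struct RadarScan   { uint64_t t_us; /* range and velocity returns */ };
struct LidarSweep  { uint64_t t_us; /* point cloud */ };

// A fused snapshot the analytics and vehicle-control stages consume together.
struct FusedSample {
    const CameraFrame* cam;
    const RadarScan*   radar;
    const LidarSweep*  lidar;
};

static uint64_t abs_diff(uint64_t a, uint64_t b) { return a > b ? a - b : b - a; }

// Pair the newest sample from each sensor only if their timestamps fall within
// a small window, so downstream analytics see one coherent view of the scene.
bool fuse_latest(const std::vector<CameraFrame>& cams,
                 const std::vector<RadarScan>& radars,
                 const std::vector<LidarSweep>& lidars,
                 uint64_t window_us, FusedSample& out) {
    if (cams.empty() || radars.empty() || lidars.empty()) return false;
    out = { &cams.back(), &radars.back(), &lidars.back() };
    return abs_diff(out.cam->t_us, out.radar->t_us) <= window_us &&
           abs_diff(out.cam->t_us, out.lidar->t_us) <= window_us;
}

int main() {
    std::vector<CameraFrame> cams  = { {1000} };
    std::vector<RadarScan>   radar = { {1003} };
    std::vector<LidarSweep>  lidar = { {998} };
    FusedSample fused{};
    return fuse_latest(cams, radar, lidar, 10, fused) ? 0 : 1;  // fused within 10 µs
}
```

The point of the sketch is simply that all three streams end up in one data structure on one device, which is where a single-chip data path pays off in latency and power.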

But All Programmable technology does more than simplify design. It also solves problems for end customers.

Right now, most EV in cars reaches Level 0 or Level 1 on the autonomous driving spectrum, giving drivers features like blind-spot monitoring or lane-keeping assistance. Fully driverless cars (Level 5) are the hardest form of EV to pull off. But considering that 80% of accidents are a result of distracted driving, according to the NHTSA, self-driving cars are also the key to safer roads for us all. And these high-efficiency, low-power solutions make EV available at accessible price points.

We’re driving toward new innovations in embedded vision – and the future of driving is only the beginning.
