Here we grow again

I’m sure you’ve heard the saying, “two heads are better than one.” The truth of this statement is evident in the gains in efficiency and productivity when a task is shared between two people. Wouldn’t you agree that a team of two is much better than one, whether you’re moving a large couch up three flights of stairs or transporting passengers by airplane?

Now imagine what can be accomplished on complicated tasks when that team of two expands to many individuals, each with a diverse set of skills and strengths. Staying with the aviation example, air traffic controllers manage the flow of inbound and outbound traffic, flight attendants look after passenger safety and comfort, and baggage handlers make sure passenger luggage arrives at the right destination. A variety of other teams contribute to managing the complexities of the airline industry.

Now reflect on the technical complexity that OEMs must resolve to enable advanced driver assistance system (ADAS) and autonomous driving solutions in vehicles. Such applications include lane departure warning, object detection, driver monitoring and automatic emergency braking. The underlying technologies entail capturing data from multiple sensor modalities (cameras, radar and lidar), processing and analyzing that sensor data, and controlling the vehicle. And that is only a high-level perspective.

Let’s consider the fundamental electronics and software necessary to enable even a single ADAS application: a camera-based lane departure warning solution. First, it requires the design of a ruggedized miniature camera. Next, the designer needs image processing software that enhances the raw camera data, improving image quality across a wide range of lighting conditions. Finally, analytics processing must detect the lane markers.
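To make those last two stages more concrete, here is a minimal, illustrative sketch of an enhancement-plus-lane-detection flow written in Python with OpenCV. This is not TI’s production software: the specific functions, thresholds, region of interest and input file name are assumptions for illustration, and a real TDAx implementation would run optimized vision kernels on the SoC’s accelerators rather than OpenCV on a CPU.

```python
# Illustrative only: contrast enhancement followed by lane-marker detection.
import cv2
import numpy as np

def enhance_frame(frame_bgr):
    """Approximate the 'image enhancement' stage: normalize contrast so lane
    markers stay visible across a range of lighting conditions."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(gray)

def detect_lane_markers(gray):
    """Approximate the 'analytics' stage: edge detection plus a Hough
    transform to find candidate lane-marker line segments."""
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)
    # Restrict the search to a trapezoidal region in front of the vehicle.
    h, w = edges.shape
    roi = np.zeros_like(edges)
    polygon = np.array([[(0, h), (w, h), (int(0.55 * w), int(0.6 * h)),
                         (int(0.45 * w), int(0.6 * h))]], dtype=np.int32)
    cv2.fillPoly(roi, polygon, 255)
    masked = cv2.bitwise_and(edges, roi)
    return cv2.HoughLinesP(masked, 1, np.pi / 180, threshold=50,
                           minLineLength=40, maxLineGap=20)

frame = cv2.imread("road_frame.png")  # hypothetical input frame
if frame is not None:
    lines = detect_lane_markers(enhance_frame(frame))
    print(f"candidate lane segments: {0 if lines is None else len(lines)}")
```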

As a provider of processors dedicated to enabling the development of ADAS and autonomous driving solutions, Texas Instruments (TI) understands the correlation between the capabilities of our TDAx processors and the product specifications required to implement ADAS applications. TI’s TDAx systems-on-chip (SoCs) are differentiated by a heterogeneous architecture that integrates purpose-built hardware accelerators (HWAs) such as image signal processors (ISPs), embedded vision engine (EVE) vector processors and digital signal processors (DSPs), all optimized to perform highly complex imaging and signal processing with low power consumption.
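To illustrate the idea of a heterogeneous pipeline, the sketch below tags each stage of a hypothetical lane departure warning flow with the class of TDAx compute unit that would typically handle it (ISP for raw-image enhancement, EVE for dense pixel processing, DSP for higher-level analytics and decision logic). The stage names, the mapping and the Python harness are illustrative assumptions only; they are not TI’s Vision SDK or any real TDAx programming interface.

```python
# Conceptual sketch of partitioning a vision pipeline across heterogeneous
# compute units. Stage and target names are hypothetical placeholders.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Stage:
    name: str
    target: str                      # "ISP", "EVE" or "DSP" hardware class
    run: Callable[[object], object]  # per-stage processing (stubbed here)

def build_ldw_pipeline() -> list:
    return [
        Stage("wdr_tone_mapping", "ISP", lambda frame: frame),            # raw-sensor enhancement
        Stage("lane_edge_filtering", "EVE", lambda frame: frame),         # dense pixel processing
        Stage("lane_model_fit_and_warning", "DSP", lambda frame: frame),  # analytics / decision
    ]

def run_pipeline(frame, stages):
    for stage in stages:
        print(f"dispatching {stage.name} to {stage.target}")
        frame = stage.run(frame)
    return frame

run_pipeline(object(), build_ldw_pipeline())
```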

Equally important as the SoC are complementary development platforms and software that allow OEMs and Tier-1 suppliers to implement full solutions themselves. TI has formed strong relationships with companies that have expertise in specific ADAS applications or system design services to offset the complexity Tier-1s and OEMs may encounter when working alone.

Recently, the TI Automotive Processors team announced the continued expansion of our ADAS ecosystem with the availability of RT-RK’s autonomous driving development board, “Alpha.” RT-RK, an embedded services and product development company globally recognized for its digital signal processing software development, worked with TI to release a reference design board featuring three of TI’s TDA2x processors. The board was developed to help customers meet the high performance requirements of autonomous driving solutions while minimizing the effort needed to evaluate the vision analytics capabilities of the TDA2x SoC.

The partnership with RT-RK is yet another example of the benefits of a diverse team of domain experts. The strength of this collaborative ecosystem was realized when RT-RK released the board and began integrating partner solutions to demonstrate its capabilities at the Electronica trade fair in Munich, Germany. RT-RK was able to leverage software solutions provided by other TI partners: MMS’s image tuning services, FotoNation’s Driver Monitoring Solution and KPIT’s front camera lane departure warning, along with TI’s Surround View demo software. Each of these partner solutions was already optimized to execute on TI’s TDAx devices, reducing RT-RK’s work to an integration effort within its software framework.

TI’s ecosystem network covers a range of technical expertise, including:

  • Hardware: multi-sensor imaging, fusion and autonomous development boards
  • Production-ready algorithms: front camera and surround view
  • Production-ready imaging support: image pipeline, wide dynamic range (WDR) and flicker mitigation

If you are a supplier of ADAS or autonomous driving solutions, or an OEM equipping vehicles with these capabilities, you don’t have to go it alone. When developing solutions based on TI’s TDAx family of devices, you’ll benefit from the strength of a team of industry leaders across a diverse range of application and service domains.

For more information about our ecosystem partners, please visit our Third Party Ecosystem landing page.