Gene Frantz, TI Principal Fellow, Futurist and Business Development Manager, DSP
When I originally posted this blog early last year, I started some discussions about where technology would be in 2020. Since then, I have worked with some of the top technical minds here at TI to create other viewpoints on architecture, multicore, development tools and more, which I will be sharing here over the next two months. I am reposting my original blog to kick this off, with the other viewpoints following shortly. I hope you enjoy the series.
I have come to the conclusion that too many of us have no clue where we are going with technology. Rather, we are busily moving forward without knowing whether we are even moving in the right direction. It would seem that, with our extensive experience in traveling, we would understand a basic concept. To travel to a distant place requires two points: where you are and where you want to go.
The same goes for technology: we need to know where we are going in order to move in the right direction. So, I have challenged several of our senior technologists to think about what the state of the art will be in the year 2020. You might say that we need 20/20 vision for the year 2020. I have invited a number of technologists to provide their points of view (POVs) on what the state of the art in IC technology will be in 2020, and I'm interested to hear what you have to say on the topic. But, since this is my blog, I will have the first and last word on what the year 2020 will hold for us.
So, here are my first thoughts on the topic.
Unfortunately, we use the first definition more than the second. Small design teams with short schedules will require us to use the latter. And, yes, there are companies already adopting this concept of reuse.
So, this is a sketch of how I see 2020. After a couple of POV papers from others at TI, I will come back with a conclusion. My colleagues will dive into topics such as programmability, tools and SoCs in the next few blogs. If you would like to share your view of 2020 with me, please comment or send me a private note.
Yet another thought provoking piece!
I must say I can hardly wait for any of these to come true! However, point No. 5 puts some fear in me, so I will focus on that.
In all my years using DSP and FPGA design tools, I have been amazed by what can be done with high-level descriptions and automated optimization tools. I have to say there are very few things more empowering than watching an FPGA design tool synthesize, optimize and implement a logic design worth hundreds of thousands of gates (now you are all surely thinking that I should get out more...!).
In fact, the following trend has slowly but clearly been developing in the industry over the last 10 to 20 years: in a race to deliver the advances that the market demands, silicon manufacturers produce platforms and architectures of ever-increasing complexity, and offset this by increasing the power, complexity and automation of high-level development tools and languages. In short, the difficulty of programming and designing with "complexier and complexier" platforms is masked by complexity added to the development tools.
Naturally, observing the success of this design philosophy over the last 20 years, the tendency is to assume that it will continue in the future, but here comes the fear:
The promise of high-level development tools is that almost anyone would be able to output very complex designs very quickly, even on extremely complex platforms. However, finely verifying and validating the design, making sure that it works reliably in every corner of the envelope, and knowing where to look when things do not work as expected is where good design engineers earn their money. Unfortunately, for these tasks, layering dev-tool complexity on top of platform complexity does not help; quite the opposite.
As platforms and their development tools become more and more difficult to grasp, design engineers and teams are pushed to adopt one of the following two postures:
- Either trust the tools blindly and forget about fine verification, not to mention trying to understand the details of what the dev-tools actually do! In this paradigm, design validation is understood as throwing the switch and making sure it does not break... immediately. Sadly, this attitude is slowly but surely becoming the norm, simply because each day only has 24 hours, the design must get out, and design engineers cannot know everything...
When things go wrong here, it is usually not by half measures!
- Or specialize the team members so that some of them thoroughly master each target platform and its dev-tools. This is required because where 10 years ago you needed weeks to completely master a platform, you now need months or even years. The investment in skill required has become very large and is not easy to reuse. In this paradigm you need some team members fluent in TI DSPs, others in FPGAs, still others in SoCs, etc. Beyond the difficulty of maintaining and training such highly specialized and compartmentalized teams, this design environment does not promote the team's agility, or generally its creativity.
One argument that I often hear is that one day processors will be fast enough that you will be able to design at a high level without caring much about timing or control over "what is executed when". No pipeline or cache worries then, because the cores will be so overdesigned with respect to the application that it simply won't matter anymore. By the way, the dies will also be so small and inexpensive, and consume so little power, that no one will care much about optimizing these factors either! Of course, the problem with this view is that it assumes there is a finite pool of applications to be served. In reality, new applications are born just as soon as they can be pushed onto the highest-performance hardware and architectures, and satisfying this leading edge of development often requires stretching hardware and software architectures to the limit of what they are capable of. Attention to fine execution control, and the design difficulties that come with it, are not going away anytime soon, I am afraid! My worry is that, even taking into account the advances in development-tool technologies over the years, these difficulties have not been easing with time. They have been increasing.
All this, and I am not even sure that greater complexity is absolutely required to bring about the advances that the industry demands.
I know that I may be going against the wind here, but I submit that the industry would be well served by the realization that complexity does matter and that layering dev-tool complexity over platform complexity does not solve everything.
I think that in one of these blogs you commented on some of the reasons why the TMS320C30 was such a success. To me, one clear reason was that (if you repeat it I will deny it!) it was such a simple and elegant architecture that I could easily grasp it inside and out in a few weeks. That was good (invaluable, actually), because in those days I also had difficult signal processing concepts to grasp!