Dedicated System Design: Context and Challenges
Dedicated systems have undeniable performance advantages (and sometimes the
performance requirements of the applications are such that there is no
alternative), but these advantages come at a major price, namely the design
cost. We briefly describe the principal (often interrelated) problems of
designing dedicated systems, and the research challenges that they imply.
This document was written in early 1997, and some of the specific figures may
no longer be accurate (however, the mere fact that they have already been superseded
serves to underline the points that we are making here).
Rapid evolution of technology
In 15 years, microprocessor complexity has grown from 40,000 transistors to
about ten million transistors, so we can put some 250 (circa 1982)
microprocessors on a single chip today. There is no sign of a slowdown in this
exponential growth: the stabilization of copper technology for VLSI,
multi-level logic, and other developments are just around the corner. Within
a few years we will see billion-transistor chips. This has an immediate and
directly visible effect on the "classical" computation systems (personal
computers, workstations, servers, etc.). The effect on embedded systems and
"information appliances" (mobile telephones, modems, personal assistants,
organizers, etc.) that rely on a combination of electronics and dedicated
computation is not so immediate. All research in this domain must anticipate
this evolution, and the primary challenge is to be able to adapt continuously
to evolving technology.
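
As a rough, purely illustrative check of this growth rate (the 1997 figure is
approximate, and the calculation assumes nothing more than a simple exponential
model), one can back out the implied doubling period:

    import math

    # Back-of-the-envelope check of the growth quoted above; the figures are
    # approximate and the model is a plain exponential, nothing more.
    transistors_1982 = 40_000         # circa-1982 microprocessor
    transistors_1997 = 10_000_000     # circa-1997 microprocessor (approximate)
    years = 15

    growth = transistors_1997 / transistors_1982    # ~250x
    doubling_period = years / math.log2(growth)     # ~1.9 years
    print(f"growth: {growth:.0f}x, doubling every {doubling_period:.1f} years")

A doubling period of just under two years is consistent with the exponential
trend commonly attributed to Moore's law.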
A vast choice of target technology
The dedicated systems designer has at hand a design palette which is
constantly expanding. This includes fabrication technology (for example, the
number of layers of metal on a circuit has grown from 1 in 1982 to 7 today),
the implementation methods (ASICs, sea of gates, FPGAs, PLDs, etc.) and the
architectural choices (RISC/DSP cores, SIMD arrays, microcontrollers, DSPs,
heterogeneous processors, etc.). It is not an exaggeration to say that
embedded system (as opposed to general purpose system) technology becomes
obsolete rather slowly, ironic as it may sound. For example, many appliances
still contain 4- and 8-bit microcontrollers. The best designers have a good
knowledge of all of these choices. Any solution is acceptable if it contributes to
the final system performance: a combination of raw performance, cost per unit,
and design cost (including the cost of design modification in the context of
evolving application specifications).
As a result, the designer must constantly adapt and improve his/her solution
using all the available choices; otherwise competing solutions, be they
in software or in hardware, will soon (within a few months in some application
domains) offer better performance (as defined above). The major research
challenge is to develop design frameworks that are independent of the target
technology for as long as possible in the design cycle.
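
The following is a minimal sketch of this tradeoff, with entirely hypothetical
costs for an ASIC and an FPGA implementation; the point is only that the best
choice depends on how the (non-recurring) design cost is amortized over the
production volume:

    # Toy comparison of two implementation choices.  Every figure is
    # hypothetical; real decisions also weigh raw performance, power, and the
    # cost of later design modifications.
    def total_cost(design_cost, unit_cost, units):
        return design_cost + unit_cost * units

    asic = {"design_cost": 500_000.0, "unit_cost": 5.0}   # high design cost, cheap units
    fpga = {"design_cost": 50_000.0, "unit_cost": 50.0}   # low design cost, costly units

    for units in (1_000, 5_000, 100_000):
        ca = total_cost(**asic, units=units)
        cf = total_cost(**fpga, units=units)
        print(f"{units:>7} units: ASIC {ca:>10,.0f}  FPGA {cf:>10,.0f}"
              f"  -> {'ASIC' if ca < cf else 'FPGA'}")

At low volumes the lower design cost wins; at high volumes the lower per-unit
cost wins, and the performance requirements may force the choice regardless.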
Evolving Applications
For a number of reasons, the applications themselves are constantly
changing. In a highly competitive market, one way to distinguish oneself from
the competition is to improve the (raw) performance. The evolving technology
cited above offers a means to achieve this without sacrificing (per unit)
cost, and it is imperative to take advantage of this, otherwise the designer
will lose competitive advantage. Indeed, the applications themselves become
rapidly obsolete, leading to the increased importance of time to
market. As an example, the lifetime of a standard cellular phone is
estimated to be one year (this does not necessarily mean that the customer will
buy a new one every year, just that the manufacturer must produce a new and
upgraded version every year). The research challenge is therefore to develop
design methods and tools that are themselves adaptable and have short design times.
Increasing System Complexity
A direct consequence of the technological evolution towards higher levels of
miniaturization is that the systems that the designer seeks to implement on a
single chip are becoming increasingly complex. However, existing CAD tools
have limits to how far they scale: their performance (running time and
memory usage) becomes a critical factor. For example, it is impossible to
simulate and exhaustively test a complete system at all levels (functional,
behavioral, gate level and switch level). As a result there is an increasing
effort towards verification and formal methods such as correctness-preserving
transformations, certified designs, etc., which were till very recently
considered too academic.
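
The following sketch gives an order of magnitude for why exhaustive simulation
is hopeless; the simulator throughput is an assumed round figure, used only
for illustration:

    # Exhaustively testing a block with n independent input bits requires 2**n
    # test vectors.  The simulation throughput is an assumed, round figure.
    vectors_per_second = 1_000_000.0

    for n in (20, 32, 64, 128):
        vectors = 2 ** n
        years = vectors / vectors_per_second / (3600 * 24 * 365)
        print(f"{n:>3} input bits: {vectors:.2e} vectors, about {years:.2e} years")

And this ignores internal state; for a sequential system the reachable state
space is far larger still, which is precisely what pushes the effort towards
formal arguments rather than brute-force simulation.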
Software/Hardware Codesign
Embedded systems now contain an increasing software component. Recent
estimates indicate that two thirds of the design and development time for a
typical product such as a cellular phone is spent on the software. Often, the
software runs on highly specialized instruction-set processors (ASIPs), and it is
necessary to design not only the processor architecture, but also its compiler
(or at least its assembler), in addition to the program itself. Software/hardware
codesign has been an extremely active research field in the past decade.
Since, to a very crude first approximation, software is flexible and adaptable
but slow, while hardware is fixed and unchangeable but fast, the typical tradeoff
is to decide which part of the application to implement in which medium.
The classic problems in codesign are therefore the partitioning problem and
that of automatically generating the interfaces of the corresponding system,
once the partitioning has been chosen. It is usually the latter problem that
is addressed, since the system designer often has a good idea of the parts of
the application that are liable to evolve and the parts that will remain fixed
from one version to the next.
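
A minimal sketch of the partitioning decision is given below; the tasks,
timings, areas and the greedy heuristic are all hypothetical, and real codesign
tools use much richer cost models and search strategies:

    # Toy hardware/software partitioning: each task runs slowly in software or
    # quickly in hardware at some cost in silicon area; choose a mapping that
    # meets the deadline within the area budget.  All numbers are hypothetical.
    tasks = [
        # (name, software time, hardware time, hardware area)
        ("filter",   9.0, 1.0, 40),
        ("protocol", 4.0, 2.0, 25),
        ("ui",       2.0, 1.5, 30),
    ]
    deadline, area_budget = 8.0, 50

    mapping = {name: "SW" for name, *_ in tasks}
    time = sum(sw for _, sw, _, _ in tasks)
    area = 0
    # Greedy: move to hardware the tasks with the best time-saved-per-area ratio.
    for name, sw, hw, a in sorted(tasks, key=lambda t: (t[1] - t[2]) / t[3],
                                  reverse=True):
        if time <= deadline:
            break
        if area + a <= area_budget:
            mapping[name] = "HW"
            time -= sw - hw
            area += a

    print(mapping, f"time={time}", f"area={area}")

Once such a partition is fixed, the remaining work of generating the
hardware/software interfaces (bus logic, drivers, interrupt handling) is the
part that tools typically automate, as noted above.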
Multiple Levels of Abstraction
Designing a hardware system usually involves using a number of CAD tools whose
complexity is commensurate with the design goals (for example, one may not
need critical path optimizations for a simple functional prototype). The
complexity is inherent in the very nature of hardware, which represents a
junction between the world of symbolic manipulation and that of electronic
circuits (whereas in software, essentially the same reasoning is required to
analyze a piece of application code, part of a GUI, or even some internal
operating system code). This implies that the designer (or the design team)
must be comfortable with many views of the same system (behavioral, structural,
physical, etc.) and multiple levels of abstraction (register transfer level,
gate level, switch level, etc.). Today there are very few (two) major CAD
tools, and their use requires more and more specialization. It is not very
likely that CAD tools will become simpler in the near future, since the
continual miniaturization of circuits means that effects that used to be
negligible (such as transmission-line effects in long wires, inductive
coupling, etc.) are no longer so, and newer generations of tools will tend to
take these effects into account. All (academic) research, especially on CAD
tools, needs to take these facts into account.
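
To make the notion of views and abstraction levels concrete, here is a small
sketch (in plain Python, standing in for a real HDL) of a one-bit full adder
described behaviorally and again as a gate-level structure; a switch-level view
would go further and model individual transistors:

    # One-bit full adder at two abstraction levels.  A sketch only; a real
    # design would be written in an HDL and refined down to layout.
    def adder_behavioral(a, b, cin):
        """Behavioral view: only the arithmetic result matters."""
        s = a + b + cin
        return s % 2, s // 2                 # (sum, carry-out)

    def adder_gate_level(a, b, cin):
        """Structural/gate-level view: an explicit network of gates."""
        p = a ^ b                            # propagate
        return p ^ cin, (a & b) | (p & cin)  # (sum, carry-out)

    # The two views must agree on every input combination (cheap to check
    # here, impossible for a complete system, as discussed earlier).
    for a in (0, 1):
        for b in (0, 1):
            for cin in (0, 1):
                assert adder_behavioral(a, b, cin) == adder_gate_level(a, b, cin)
    print("behavioral and gate-level views agree")

Each step down the hierarchy adds physical detail (delays, loading, parasitic
effects), which is exactly where the newer generations of tools mentioned above
spend their effort.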