Embedded Software

Edward A. Lee

Technical Memorandum UCB/ERL M01/26
University of California,
Berkeley, CA 94720, July 12, 2001.



Deep in the intellectual roots of computation is the notion that software is the realization of mathematical functions as procedures. A body of input data is mapped by the software into a body of output data. The mechanism that is used to carry out the procedure is not nearly as important as the abstract properties of the procedure itself. In fact, we can reduce the mechanism to seven operations on a machine (the famous Turing machine) with an infinite tape capable of storing zeros and ones [1]. This mechanism is, in theory, as good as any other mechanism. The significance of the software is not affected by the mechanism. Abelson and Sussman say that software gives us a procedural epistemology [2]. Knowledge as procedure.
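The mechanism described above can be made concrete with a toy table-driven Turing machine: a finite control that reads and writes a tape of symbols and moves left or right. Everything in this sketch is illustrative and not from the memorandum itself; the sample transition table complements a string of bits and then halts, a (trivial) mathematical function realized as a procedure.

```c
/* A toy table-driven Turing machine: a finite control reading and
 * writing a tape and moving over it.  The single-state table below
 * complements a string of bits, then halts on the first blank cell. */
#include <string.h>

enum { HALT = -1 };

struct rule {
    char write;   /* symbol to write to the current cell */
    int  move;    /* +1 = move head right, -1 = left     */
    int  next;    /* next state, or HALT                 */
};

/* transition[state][symbol]: one state; symbols '0', '1', blank ' ' */
static const struct rule flip_table[1][3] = {
    { { '1', +1, 0 },      /* read '0': write '1', move right */
      { '0', +1, 0 },      /* read '1': write '0', move right */
      { ' ', +1, HALT } }  /* read blank: halt                */
};

static int symbol_index(char c) { return c == '0' ? 0 : c == '1' ? 1 : 2; }

/* Run the machine on a blank-terminated tape, head starting at cell 0. */
void run_machine(char *tape)
{
    int state = 0, head = 0;
    while (state != HALT) {
        const struct rule *r = &flip_table[state][symbol_index(tape[head])];
        tape[head] = r->write;
        head += r->move;
        state = r->next;
    }
}
```

The point of the sketch is the one the text makes: the interesting object is the table (the procedure), not the machinery that steps through it.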

Embedded software is not like that. Its principal role is not the transformation of data, but rather the interaction with the physical world. It executes on machines that are not, first and foremost, computers. They are cars, airplanes, telephones, audio equipment, robots, appliances, toys, security systems, pacemakers, heart monitors, weapons, television sets, printers, scanners, climate control systems, manufacturing systems, etc.

Software with a principal role of interacting with the physical world must, of necessity, acquire some properties of the physical world. It takes time. It consumes power. It does not terminate (unless it fails). It is not the idealized procedure of Alan Turing.

Computer science has tended to view this physicality of embedded software as messy. Design of embedded software, consequently, has not benefited from the richly developed abstractions of the twentieth century. Instead of using object modeling, polymorphic type systems, and automated memory management, engineers write assembly code for idiosyncratic DSP processors that can do finite impulse response filtering in one (deterministic) instruction cycle per tap.
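The finite impulse response (FIR) filter mentioned above is worth seeing in code: each output sample is a dot product of the N most recent input samples with N fixed coefficients, one multiply-accumulate per tap. The plain C below is only illustrative; on a DSP processor, a dedicated MAC instruction and circular addressing collapse the inner loop body to the single deterministic cycle per tap that the text describes.

```c
/* FIR filtering: y[n] = sum over k of h[k] * x[n-k], with x[j] = 0
 * for j < 0.  Plain C for illustration; a DSP's MAC instruction
 * executes each inner-loop iteration in one cycle. */
#include <stddef.h>

void fir_filter(const double *h, size_t ntaps,
                const double *x, double *y, size_t nsamples)
{
    for (size_t n = 0; n < nsamples; n++) {
        double acc = 0.0;
        for (size_t k = 0; k < ntaps && k <= n; k++)
            acc += h[k] * x[n - k];  /* one multiply-accumulate per tap */
        y[n] = acc;
    }
}
```

The appeal of hand-written assembly here is precisely the determinism: the cycle count per output sample is known exactly, which is what the engineers in question are unwilling to give up.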

The engineers who write embedded software are rarely computer scientists. They are experts in the application domain with a good understanding of the target architectures they work with. This is probably appropriate. The principal role of embedded software is interaction with the physical world. Consequently, the designer of that software should be the person who best understands that physical world. The challenge to computer scientists, should they choose to accept it, is to invent better abstractions for that domain expert to do her job.

Today's domain experts may resist such help. In fact, their skepticism is well warranted. They see Java programs stalling for one third of a second to perform garbage collection and update the user interface, and they envision airplanes falling out of the sky. The fact is that the best-of-class methods offered by computer scientists today are, for the most part, a poor match to the requirements of embedded systems.

At the same time, however, these domain experts face a serious challenge. The complexity of their applications (and the consequent size of their programs) is growing rapidly. Their devices now often sit on a network, wireless or wired. Even some programmable DSPs now run a TCP/IP protocol stack. And the applications are getting much more dynamic, with downloadable customization and migrating code. Meanwhile, reliability standards for embedded software remain very high, unlike those for general-purpose software. At a minimum, the methods used for general-purpose software require considerable adaptation for embedded software. At a maximum, entirely new abstractions are needed that embrace physicality and deliver robustness.

[1] A. M. Turing, "On Computable Numbers, with an Application to the Entscheidungsproblem," Proc. London Math. Soc., Series 2, Vol. 42, pp. 230-265, 1936.
[2] H. Abelson and G. J. Sussman, Structure and Interpretation of Computer Programs, The MIT Press, Cambridge, MA, 1985.