The twentieth century witnessed the birth of revolutionary ideas in the phys-
ical sciences. These ideas began to shake the traditional view of the universe
dating back to the days of Newton, even to the days of Galileo. Albert Ein-
stein is usually identified as the creator of the relativity theory, a theory that
is used to model the behavior of the huge macrosystems of astronomy. An-
other new view of the physical world was supplied by quantum physics, which
turned out to be successful in describing phenomena in the microworld: the
behavior of particles of atomic size.
Even though the first ideas of automatic information processing are quite
old, I feel justified in saying that the twentieth century also witnessed the
birth of computer science. As a mathematician, by the term "computer sci-
ence" , I mean the more theoretical parts of this vast research area, such as
the theory of formal languages, automata theory, complexity theory, and al-
gorithm design. I hope that readers who are used to a more flexible concept of
"computer science" will forgive me. The idea of a computational device was
crystallized into a mathematical form as a Turing machine by Alan Turing
in the 1930s. Since then, the growth of computer science has been immense,
but many problems in newer areas such as complexity theory are still waiting
for a solution.
Since the very first electronic computers were built, computer technology
has grown rapidly. An observation by Gordon Moore in 1965 laid the founda-
tions for what became known as "Moore's Law": that computer processing
power doubles every eighteen months. How far can this technological progress go?
How efficient can we make computers? In light of the present knowledge, it
seems unfair even to attempt to give an answer to these questions, but some
estimate can be given. By naively extrapolating Moore's law to the future,
we learn that sooner or later, each bit of information should be encoded by
a physical system of subatomic size! Several decades ago such an idea would
have seemed somewhat absurd, but it does not seem so anymore. In fact, a
system of seven bits encoded subatomically has already been implemented
[37]. Such small systems can no longer be described by classical physics;
instead, quantum physical effects must be taken into consideration.
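The naive extrapolation mentioned above can be made concrete with a back-of-the-envelope calculation. All numerical inputs in the sketch below are illustrative assumptions, not figures from the text: a density doubling every 18 months (so the linear feature size shrinks by a factor of the square root of two per period), a starting feature size of 100 nanometers, and an "atomic" length scale of 0.1 nanometers.

```python
import math

# Hedged sketch of a naive Moore's-law extrapolation.
# Assumed inputs (purely illustrative):
start_size_nm = 100.0       # starting feature size, in nanometers
atomic_nm = 0.1             # rough atomic length scale (one angstrom)
shrink = math.sqrt(2)       # linear shrink factor per doubling period
period_years = 1.5          # one doubling period = 18 months

# Number of 18-month periods needed for features to shrink
# from the starting size down to atomic scale.
periods = math.log(start_size_nm / atomic_nm, shrink)
years = periods * period_years

print(f"About {years:.0f} years until features reach atomic size")
```

Under these assumed numbers the answer comes out to roughly three decades, which is consistent with the text's point that the extrapolation bottoms out at the atomic scale within a foreseeable time span.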