
Quantum computing

Quantum computing is the area of study focused on developing computer technology based on the
principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum
(atomic and subatomic) level. Development of a quantum computer, if practical, would mark a leap
forward in computing capability far greater than that from the abacus to a modern-day supercomputer, with
performance gains in the billion-fold realm and beyond. The quantum computer, following the laws of
quantum physics, would gain enormous processing power through the ability to be in multiple states, and to
perform tasks using all possible permutations simultaneously. Current centers of research in quantum
computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory.

The essential elements of quantum computing originated with Paul Benioff, working at Argonne National Laboratory, in 1981. He theorized a classical computer operating with some quantum mechanical principles. But
it is generally accepted that David Deutsch of Oxford University provided the critical impetus for quantum
computing research. In 1984, he was at a computation theory conference and began to wonder about the
possibility of designing a computer that was based exclusively on quantum rules, then published his
breakthrough paper a few months later. With this, the race began to exploit his ideas. However, before we
delve into what he started, it is beneficial to have a look at the background of the quantum world.

Quantum

Quantum is the Latin word for amount and, in modern understanding, means the smallest possible discrete unit of any physical property, such as energy or matter. Quantum came into the latter usage in 1900, when the physicist Max Planck used it in a presentation to the German Physical Society. Planck had sought to discover the reason that radiation from a glowing body changes in color from red to orange and, finally, to blue as its temperature rises. He found the answer by assuming that radiation existed in discrete, quantifiable units, in the same way that matter does, rather than just as a continuous electromagnetic wave, as had formerly been assumed.

Planck wrote a mathematical equation involving a figure to represent individual units of energy. He called the units quanta. Planck assumed there was a theory yet to emerge from the discovery of quanta, but, in fact, their very existence defined a completely new and fundamental law of nature. Einstein's theory of relativity and quantum theory, together, explain the nature and behavior of all matter and energy on earth and form the basis for modern physics. However, conflicts remain between the two. For much of his life, Einstein sought what he called a unified field theory - one that would reconcile the theories' incompatibilities. Subsequently, superstring theory and M-theory have been proposed as candidates to fill that role.

Quantum is sometimes used loosely, in an adjectival form, to mean on such an infinitesimal level as to be infinite, as, for example, you might say "Waiting for pages to load is quantumly boring."



Quantum Theory

Quantum theory's development began in 1900 with a presentation by Max Planck to the German Physical
Society, in which he introduced the idea that energy exists in individual units (which he called "quanta"), as
does matter. Further developments by a number of scientists over the following thirty years led to the
modern understanding of quantum theory. The essential elements of quantum theory:

- Energy, like matter, consists of discrete units, rather than existing solely as a continuous wave.
- Elementary particles of both energy and matter may, depending on the conditions, behave like either particles or waves.
- The movement of elementary particles is inherently random and, thus, unpredictable.
- The simultaneous measurement of two complementary values, such as the position and momentum of an elementary particle, is inescapably flawed; the more precisely one value is measured, the more flawed the measurement of the other value becomes.

Further Developments of Quantum Theory
Niels Bohr proposed the Copenhagen interpretation of quantum theory, which asserts that a particle is
whatever it is measured to be (for example, a wave or a particle) but that it cannot be assumed to have
specific properties, or even to exist, until it is measured. In short, Bohr was saying that objective reality
does not exist. This translates to a principle called superposition that claims that while we do not know
what the state of any object is, it is actually in all possible states simultaneously, as long as we don't look to
check.

To illustrate this theory, we can use the famous and somewhat cruel analogy of Schrödinger's cat. First, we
have a living cat and place it in a thick lead box. At this stage, there is no question that the cat is alive. We
then throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the
cyanide capsule and died. Since we do not know, the cat is both dead and alive, according to quantum law -
in a superposition of states. It is only when we break open the box and see what condition the cat is in that
the superposition is lost, and the cat must be either alive or dead.

The second interpretation of quantum theory is the multiverse or many-worlds theory. It holds that as soon
as a potential exists for any object to be in any state, the universe of that object transmutes into a series of
parallel universes equal to the number of possible states in which the object can exist, with each
universe containing a unique single possible state of that object. Furthermore, there is a mechanism for
interaction between these universes that somehow permits all states to be accessible in some way and for all
possible states to be affected in some manner. Stephen Hawking and the late Richard Feynman are among
the scientists who have expressed a preference for the many-worlds theory.

Whichever argument one chooses, the principle that, in some way, one particle can exist in numerous
states opens up profound implications for computing.



A Comparison of Classical and Quantum Computing
Classical computing relies, at its ultimate level, on principles expressed by Boolean algebra, operating with a (usually) 7-mode logic gate principle, though it is possible to make do with only three modes (AND, NOT, and COPY). Data must be processed in an exclusive binary state at any point in time - that is, either 0 (off / false) or 1 (on / true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. While the time that each transistor or capacitor needs to remain in its 0 or 1 state before switching is now measurable in billionths of a second, there is still a limit as to how quickly these devices can be made to switch state. As we progress to
smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical
laws of physics to apply. Beyond this, the quantum world takes over, which opens a potential as great as
the challenges that are presented.

The quantum computer, by contrast, can work with a two-mode logic gate: XOR and a mode we'll call QO1 (the ability to change 0 into a superposition of 0 and 1, a logic gate which cannot exist in classical computing). In a quantum computer, a number of elementary particles such as electrons or photons can be used (in practice, success has also been achieved with ions), with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a quantum bit, or qubit; the nature and behavior of these particles form the basis of quantum computing. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.

Superposition
Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the
field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state.
Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from
a laser - let's say that we use 1 unit of laser energy. But what if we only use half a unit of laser energy and
completely isolate the particle from all external influences? According to quantum law, the particle then
enters a superposition of states, in which it behaves as if it were in both states simultaneously. Each qubit
utilized could take a superposition of both 0 and 1. Thus, the number of computations that a quantum computer could undertake is 2^n, where n is the number of qubits used. A quantum computer comprising 500 qubits would have the potential to do 2^500 calculations in a single step. This is an awesome number - 2^500 is more than the estimated number of atoms in the known universe (this is true parallel processing - classical computers today, even so-called parallel processors, still only truly do one thing at a time: there are just two or more of them doing it). But how will these particles interact with each other? They would do so via quantum entanglement.
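
To make the 2^n figure concrete, here is a minimal sketch in Python with numpy (a classical simulation, added purely for illustration; the Hadamard gate and the name equal_superposition are textbook conventions and our own naming, not anything from the text above). It puts n qubits into an equal superposition and counts the amplitudes being carried along:

    import numpy as np

    # Single-qubit Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
    H = np.array([[1, 1],
                  [1, -1]]) / np.sqrt(2)

    def equal_superposition(n):
        """State vector of n qubits after a Hadamard gate on each one."""
        state = np.zeros(2 ** n)
        state[0] = 1.0                  # start in |00...0>
        gate = H
        for _ in range(n - 1):
            gate = np.kron(gate, H)     # tensor a Hadamard onto each extra qubit
        return gate @ state

    for n in (1, 2, 10):
        amps = equal_superposition(n)
        print(n, "qubit(s) ->", amps.size, "simultaneous amplitudes")
    # prints 2, 4 and 1024: the 2^n growth described above

Simulating 500 qubits this way would require a vector of 2^500 amplitudes, which is precisely why a classical machine cannot keep up.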

Entanglement
Particles (such as photons or electrons) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle - up or down - allows one to know that the spin of its mate is in the opposite direction. Even more amazing is the knowledge that, due to the phenomenon of superposition, the measured particle has no single spin direction before being measured, but is simultaneously in both a spin-up and spin-down state. The spin state of the particle being measured is decided at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction to that of the measured particle. This is a real phenomenon (Einstein called it "spooky action at a distance"), the mechanism of which cannot, as yet, be explained by any theory - it simply must be taken as given. Quantum entanglement allows qubits that are separated by incredible distances to exhibit these correlations instantaneously (although no usable information can be sent this way faster than light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.
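
A small numpy sketch of the anti-correlation just described (again a classical simulation for illustration; the singlet Bell state used here is the standard textbook example of a pair with opposite spins, not something specified in the text):

    import numpy as np

    rng = np.random.default_rng(0)

    # Singlet Bell state (|01> - |10>)/sqrt(2): the two spins always come out opposite.
    state = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)  # basis order: 00, 01, 10, 11
    probs = np.abs(state) ** 2                             # Born rule probabilities

    for _ in range(5):
        outcome = rng.choice(["00", "01", "10", "11"], p=probs)
        print("particle A:", outcome[0], " particle B:", outcome[1])
    # only "01" and "10" ever occur: knowing one spin fixes the other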

Taken together, quantum superposition and entanglement create an enormously enhanced computing
power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00,
01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers
simultaneously, because each qubit represents two values. With each qubit added, the capacity expands exponentially.

Quantum Programming
Perhaps even more intriguing than the sheer power of quantum computing is the ability that it offers to
write programs in a completely new way. For example, a quantum computer could incorporate a
programming sequence that would be along the lines of "take all the superpositions of all the prior
computations" - something which is meaningless with a classical computer - which would permit extremely
fast ways of solving certain mathematical problems, such as factorization of large numbers, one example of
which we discuss below.

There have been two notable successes thus far with quantum programming. The first came in 1994, when Peter Shor (now at AT&T Labs) developed a quantum algorithm that could efficiently factor large numbers. It centers on a system that uses number theory to estimate the periodicity of a large number sequence. The other major breakthrough came from Lov Grover of Bell Labs in 1996, with a very fast algorithm that is proven to be the fastest possible for searching through unstructured databases. The algorithm is so efficient that it requires, on average, only roughly √N searches (where N is the total number of elements) to find the desired result, as opposed to a search in classical computing, which on average needs N/2 searches.
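
The gap between N/2 and √N queries is easy to tabulate; a quick sketch (the ⌊π/4·√N⌋ iteration count is the standard result for Grover's algorithm, quoted here for illustration rather than taken from the text):

    import math

    # Average classical searches (N/2) versus Grover iterations (~ pi/4 * sqrt(N))
    for N in (100, 10_000, 1_000_000):
        classical = N // 2
        grover = math.floor(math.pi / 4 * math.sqrt(N))
        print(f"N = {N:>9}: classical ~ {classical:>7}, Grover ~ {grover:>4}")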

The Problems - And Some Solutions
The above sounds promising, but there are tremendous obstacles still to be overcome. Some of the problems with quantum computing are as follows:

- Interference - During the computation phase of a quantum calculation, the slightest disturbance in a quantum system (say a stray photon or wave of EM radiation) causes the quantum computation to collapse, a process known as decoherence. A quantum computer must be totally isolated from all external interference during the computation phase. Some success has been achieved using ions held in intense magnetic fields as qubits.
- Error correction - Because truly isolating a quantum system has proven so difficult, error correction systems for quantum computations have been developed. Qubits are not digital bits of data, so they cannot use conventional (and very effective) error correction, such as the triple redundant method (sketched below, after this list). Given the nature of quantum computing, error correction is critical - even a single error in a calculation can cause the validity of the entire computation to collapse. There has been considerable progress in this area, with an error correction algorithm developed that utilizes 9 qubits (1 computational and 8 correctional). More recently, there was a breakthrough by IBM that makes do with a total of 5 qubits (1 computational and 4 correctional).
- Output observance - Closely related to the above two problems, retrieving output data after a quantum calculation is complete risks corrupting the data. In an example of a quantum computer with 500 qubits, we have a 1 in 2^500 chance of observing the right output if we simply measure the output. Thus, what is needed is a method to ensure that, as soon as all calculations are made and the act of observation takes place, the observed value will correspond to the correct answer. How can this be done? It has been achieved by Grover with his database search algorithm, which relies on the special "wave" shape of the probability curve inherent in quantum computers. This ensures that, once all calculations are done, the act of measurement will see the quantum state decohere into the correct answer.
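
For contrast, here is the classical triple redundant method referred to in the error correction item above (a minimal majority-vote sketch in Python; quantum codes such as the 9-qubit scheme must achieve a similar effect without ever reading out the protected value, since reading a qubit destroys its superposition):

    from collections import Counter

    def encode(bit):
        """Triple redundancy: store three copies of the bit."""
        return [bit, bit, bit]

    def decode(copies):
        """Majority vote recovers the bit if at most one copy was flipped."""
        return Counter(copies).most_common(1)[0][0]

    stored = encode(1)
    stored[0] ^= 1            # a single bit-flip error corrupts one copy
    print(decode(stored))     # still prints 1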

Even though there are many problems to overcome, the breakthroughs in the last 15 years, and especially in the last 3, have made some form of practical quantum computing seem feasible, but there is much debate as to whether this is less than a decade away or a hundred years into the future. However, the potential that this technology offers is attracting tremendous interest from both the government and the private sector. Military applications include the ability to break encryption keys via brute force searches, while civilian applications range from DNA modeling to complex material science analysis. It is this potential that is rapidly breaking down the barriers to this technology, but whether all barriers can be broken, and when, is very much an open question.
Quantum computing

First proposed in the 1970s, quantum computing relies on quantum physics, taking advantage of certain quantum properties of atoms or nuclei that allow them to work together as quantum bits, or qubits, serving as the computer's processor and memory. By interacting with each other while being isolated from the external environment, qubits can perform certain calculations exponentially faster than conventional computers.

Qubits do not rely on the traditional binary nature of computing. While traditional computers encode information into bits using binary numbers, either a 0 or 1, and can only do calculations on one set of numbers at once, quantum computers encode information as a series of quantum-mechanical states, such as spin directions of electrons or polarization orientations of a photon. These states might represent a 1 or a 0, might represent a combination of the two, or might represent a number expressing that the state of the qubit is somewhere between 1 and 0 - a superposition of many different numbers at once. A quantum computer can do an arbitrary reversible classical computation on all the numbers simultaneously, which a binary system cannot do, and also has some ability to produce interference between the various numbers. By doing a computation on many different numbers at once, then interfering the results to get a single answer, a quantum computer has the potential to be much more powerful than a classical computer of the same size. In using only a single processing unit, a quantum computer can naturally perform myriad operations in parallel.

Quantum computing is not well suited for tasks such as word processing and email, but it is ideal
for tasks such as cryptography and modeling and indexing very large databases.

Quantum computing - a whole new concept in parallelism!

What is quantum computing? It's something that could have been thought up a long time ago - an idea whose time has come. For any physical theory one can ask: what sort of machines will do useful computation? Or: what sort of processes will count as useful computational acts? Alan Turing thought about this in 1936 with regard (implicitly) to classical mechanics, and gave the world the paradigm classical computer: the Turing machine.

But even in 1936 classical mechanics was known to be false. Work is now under way - mostly theoretical,
but tentatively, hesitantly groping towards the practical - in seeing what quantum mechanics means for
computers and computing.

In a trivial sense, everything is a quantum computer. (A pebble is a quantum computer for calculating the
constant-position function - you get the idea.) And of course, today's computers exploit quantum effects
(like electrons tunneling through barriers) to help do the right thing and do it fast. For that matter, both the
computer and the pebble exploit a quantum effect - the "Pauli exclusion principle", which holds up ordinary
matter against collapse by bringing about the kind of degeneracy we call chemistry - just to remain stable
solid objects. But quantum computing is much more than that.

The most exciting really new feature of quantum computing is quantum parallelism. A quantum system is
in general not in one "classical state", but in a "quantum state" consisting (crudely speaking) of a
superposition of many classical or classical-like states. This superposition is not just a figure of speech,
covering up our ignorance of which classical-like state it's "really" in. If that were all the superposition
meant, you could drop all but one of the classical-like states (maybe only later, after you deduced
retrospectively which one was "the right one") and still get the time evolution right. But actually you need
the whole superposition to get the time evolution right. The system really is in some sense in all the
classical-like states at once! If the superposition can be protected from unwanted entanglement with its
environment (known as decoherence), a quantum computer can output results dependent on details of all its
classical-like states. This is quantum parallelism - parallelism on a serial machine. And if that wasn't
enough, machines that would already, in architectural terms, qualify as parallel can benefit from quantum
parallelism too - at which point the mind begins to seriously boggle!




What is Quantum Computing?

Quantum computing is a new method of computing with a hypothetical computer, capable of processing speeds impossible for traditional computers. Though only the earliest quantum computers have been built, practical quantum computing machines, when they hit the market, will revolutionize an entire industry. However, significant progress must be made before quantum computers have a mainstream use.

Quantum computing works by making multiple calculations at one time. Traditional computing works by making only one calculation at a time. While traditional machines do these calculations at an impressive speed, doing only one at a time does limit their capabilities. Quantum computers have no such limitation and can do each of their multiple calculations as fast as or faster than traditional computers.

Though this may not sound like a major advancement, the ability to make multiple calculations at once makes a big difference in quantum computing. Quantum computers could make today's supercomputers look like children's toys; quantum computing has the potential to make computers using its technology millions of times more powerful than today's most powerful machines.

The key to quantum computing is the qubit. Qubits are different from traditional bits, which can only hold a value of 0 or 1, commonly known as binary to computer users. Instead of being one or the other, qubits can hold a value of both 0 and 1, as well as all values between 0 and 1. Qubits are realized in very small physical systems: atoms, ions, photons, or electrons.

Is There a Quantum Computer in Your Future?

The overriding imperative of computing is "go faster, get smaller". The number of transistors that
can be manufactured on a standard silicon wafer has doubled roughly every two years, as
Moore's Law predicts. That means transistors keep growing smaller. The smaller the distance
between transistors, the faster computations happen. If Moore's Law continues to be an accurate
predictor, then around 2020 or 2030 we should see transistors the size of individual atoms. That's
when quantum computing will come to fruition.

Quantum computing is based upon physics completely different from that observed in the
electronic devices of today. In today's computing paradigm, a transistor can be in only one of two
states called bits - 0 or 1, on or off. But in the realm of quantum computing, a transistor can be in a state of 0, 1, or a "superposition" of 0 and 1. And there can be many superpositions. These quantum bits are called "qubits." Physically, qubits are encoded in atoms, photons, ions, or electrons.

Whereas a standard transistor can perform only one operation at a time, a qubit can perform
many simultaneously. Therefore a quantum computer containing the same number of transistors
as an ordinary computer of today can be a million times faster. A 30-qubit quantum computer
could perform as many as 10 teraflops - 10 trillion floating-point operations per second! Today's
desktop computers perform gigaflops - billions of operations per second.

So obviously, that's where the interest in quantum computing comes from - speed. A personal
computer a million times faster than the one currently on your desk boggles the mind. After all,
how fast can you type? But there are applications that would benefit from that type of speed, such
as image recognition, cryptography, and other problems that require enormous computing power.
Personally, I'd be happy with a computer that's ready to go as soon as you turn it on. I don't anticipate being able to type a million times faster than I already do. :-)

One problem with quantum computing is that if you observe the quantum state of a qubit, it
changes. So scientists must devise an indirect method of determining the state of a qubit. To do
this, they are trying to take advantage of another quantum property called "entanglement." At the
quantum level, if you apply a force to two particles, they become "entangled"; a change in the state of one particle is instantly reflected in the other particle's change to the opposite state. So
by observing the state of the second particle, physicists hope to determine the state of the first.

Yes, quantum mechanics is rather confusing. But from a layman's perspective, it's enough to
know that quantum computing is based on a new type of transistor that is represented by the
changing states of atomic particles. And the promise of quantum computing is a HUGE
breakthrough in speed.
Are Quantum Computers Available Today?

There is at least one firm that claims to have created a rudimentary, working quantum computer.
Canada-based D-Wave Systems has demonstrated a 16-qubit quantum computer that solved
sudoku puzzles and other pattern-matching problems. Some in the scientific community are
skeptical about D-Wave's claims, but there is definite progress on the quantum computing front
every day.

Quantum computers need at least a few dozen qubits in order to solve real-world problems
usefully. It may be several years, even a couple of decades, before a practical quantum computer
is put into production. But just as world records fell more rapidly after the first sub-four-minute
mile was run, the breakthrough of the first commercial quantum computer will undoubtedly be
followed by very rapid increases in quantum computing capabilities; reductions in costs; and
shrinkage in size. In a decade or so, we can expect to find old-school transistors and simple on-
off bit technology joining analog video tape in the dustbin of technology history.




Molecular electronics

Molecular electronics involves the study and application of molecular building blocks for the fabrication of electronic components. This includes both bulk applications of conductive polymers, and single-molecule electronic components for nanotechnology.

Conductive polymers or, more precisely, intrinsically conducting polymers (ICPs) are organic polymers that conduct electricity.[1] Such compounds may have metallic conductivity or can be semiconductors. The biggest advantage of conductive polymers is their processability, mainly by dispersion. Conductive polymers are generally not plastics, i.e., they are not thermoformable. But, like insulating polymers, they are organic materials. They can offer high electrical conductivity but do not show the mechanical properties of other commercially used polymers. The electrical properties can be fine-tuned using the methods of organic synthesis[2] and by advanced dispersion techniques.

A polymer is a large molecule (macromolecule) composed of repeating structural units. These subunits are
typically connected by covalent chemical bonds. Although the term polymer is sometimes taken to refer to
plastics, it actually encompasses a large class of natural and synthetic materials with a wide variety of
properties.

Because of the extraordinary range of properties of polymeric materials, [2] they play an essential and
ubiquitous role in everyday life.[3] This role ranges from familiar synthetic plastics and elastomers to
natural biopolymers such as nucleic acids and proteins that are essential for life.

A dispersion is a system in which particles are dispersed in a continuous phase of a different composition
(or state). See also emulsion. A dispersion is classified in a number of different ways, including how large
the particles are in relation to the particles of the continuous phase, whether or not precipitation occurs, and
the presence of Brownian motion.

There are three main types of dispersions:

- Suspension
- Colloid
- Solution

Organic synthesis is a special branch of chemical synthesis and is concerned with the construction of organic compounds via organic reactions. Organic molecules can often contain a higher level of complexity than purely inorganic compounds, so the synthesis of organic compounds has developed into one of the most important branches of organic chemistry. There are two main research fields within the general area of organic synthesis: total synthesis and methodology.

Methodology can be:

1. "the analysis of the principles of methods, rules, and postulates employed by a discipline";[1]
2. "the systematic study of methods that are, can be, or have been applied within a discipline";[1]
3. a documented process for management of projects that contains procedures, definitions and explanations of techniques used to collect, store, analyze and present information as part of a research process in a given discipline; or
4. the study or description of methods.[2]




In chemistry, chemical synthesis is purposeful execution of chemical reactions to get a product, or several
products. This happens by physical and chemical manipulations usually involving one or more reactions. In
modern laboratory usage, this tends to imply that the process is reproducible, reliable, and established to
work in multiple laboratories.

An interdisciplinary pursuit, molecular electronics spans physics, chemistry, and materials science. The
unifying feature is the use of molecular building blocks for the fabrication of electronic components. This
includes both passive (e.g. resistive wires) and active components such as transistors and molecular-scale
switches. Due to the prospect of size reduction in electronics offered by molecular-level control of
properties, molecular electronics has aroused much excitement both in science fiction and among scientists.
Molecular electronics provides means to extend Moore's Law beyond the foreseen limits of small-scale
conventional silicon integrated circuits.

Molecular electronics is split into two related but separate subdisciplines: molecular materials for electronics utilizes the properties of the molecules to affect the bulk properties of a material, while molecular scale electronics focuses on single-molecule applications.

Molecular scale electronics


Molecular scale electronics, also called single-molecule electronics, is a branch of nanotechnology that uses single molecules, or nanoscale collections of single molecules, as electronic components. Because single molecules constitute the smallest stable structures imaginable, this miniaturization is the ultimate goal for shrinking electrical circuits.

Conventional electronics have traditionally been made from bulk materials. With the bulk approach having inherent limitations, in addition to becoming increasingly demanding and expensive, the idea was born that the components could instead be built up atom by atom in a chemistry lab (bottom up), as opposed to carving them out of bulk material (top down). In single-molecule electronics, the bulk material is replaced by single molecules. That is, instead of creating structures by removing or applying material after a pattern scaffold, the atoms are put together in a chemistry lab. The molecules utilized have properties that resemble traditional electronic components such as a wire, transistor, or rectifier.

Single-molecule electronics is an emerging field, and entire electronic circuits consisting exclusively of molecular-sized compounds are still very far from being realized. However, the continuous demand for more computing power, together with the inherent limitations of present-day lithographic methods, makes the transition seem unavoidable. Currently, the focus is on discovering molecules with interesting properties and on finding ways of obtaining reliable and reproducible contacts between the molecular components and the bulk material of the electrodes.

Molecular electronics operates in the quantum realm of distances less than 100 nanometers. The
miniaturization down to single molecules brings the scale down to a regime where quantum effects are
important. As opposed to the case in conventional electronic components, where electrons can be filled in
or drawn out more or less like a continuous flow of charge, the transfer of a single electron alters the
system significantly. The significant amount of energy due to charging has to be taken into account when
making calculations about the electronic properties of the setup and is highly sensitive to distances to
conducting surfaces nearby.




[Figure: graphical representation of a rotaxane, useful as a molecular switch.]

One of the biggest problems with measuring single molecules is establishing reproducible electrical contact with only one molecule, and doing so without short-circuiting the electrodes. Because current photolithographic technology is unable to produce electrode gaps small enough to contact both ends of the molecules tested (on the order of nanometers), alternative strategies are put to use. These include molecular-sized gaps called break junctions, in which a thin electrode is stretched until it breaks. Another method is to use the tip of a scanning tunneling microscope (STM) to contact molecules adhered at the other end to a metal substrate.[3] Another popular way to anchor molecules to the electrodes is to make use of sulfur's high affinity for gold; though useful, the anchoring is non-specific and thus anchors the molecules randomly to all gold surfaces, and the contact resistance is highly dependent on the precise atomic geometry around the site of anchoring, which inherently compromises the reproducibility of the connection. To circumvent the latter issue, experiments have shown that fullerenes could be a good candidate for use instead of sulfur, because their large conjugated π-systems can electrically contact many more atoms at once than a single atom of sulfur.[4]

One of the biggest hindrances to the commercial exploitation of single-molecule electronics is the lack of techniques to connect a molecular-sized circuit to bulk electrodes in a way that gives reproducible results. Also problematic is the fact that some measurements on single molecules must be carried out at cryogenic temperatures (close to absolute zero), which is very energy-consuming.

Molecular materials for electronics

Conductive polymers or, more precisely, intrinsically conducting polymers (ICPs) are organic polymers that conduct electricity in their bulk state.[6] Such compounds may have metallic conductivity or can be semiconductors. The biggest advantage of conductive polymers is their processability, mainly by dispersion. Conductive polymers are not plastics, i.e., they are not thermoformable, but, like insulating polymers, they are organic materials. They can offer high electrical conductivity but do not show the mechanical properties of other commercially used polymers. The electrical properties can be fine-tuned using the methods of organic synthesis[7] and by advanced dispersion techniques.[8]

The linear-backbone "polymer blacks" (polyacetylene, polypyrrole, and polyaniline) and their copolymers are the main class of conductive polymers. Historically, these are known as melanins. PPV and its soluble derivatives have emerged as the prototypical electroluminescent semiconducting polymers. Today, poly(3-alkylthiophenes) are the archetypical materials for solar cells and transistors.[7]

Conducting polymers have backbones of contiguous sp2-hybridized carbon centers. One valence electron on each center resides in a pz orbital, which is orthogonal to the other three sigma bonds. The electrons in these delocalized orbitals have high mobility when the material is "doped" by oxidation, which removes some of these delocalized electrons. Thus the conjugated p-orbitals form a one-dimensional electronic band, and the electrons within this band become mobile when it is partially emptied. Despite intensive research, the relationship between morphology, chain structure, and conductivity is still poorly understood.[9]

Conductive polymers enjoy few large-scale applications due to their poor processability. They have shown promise in antistatic materials[7] and have been incorporated into commercial displays and batteries, but these uses have had limitations due to manufacturing costs, material inconsistencies, toxicity, poor solubility in solvents, and the inability to be melt-processed directly. Nevertheless, conducting polymers are rapidly gaining traction in new applications, with increasingly processable materials offering better electrical and physical properties and lower costs. With the availability of stable and reproducible dispersions, PEDOT and polyaniline have gained some large-scale applications. While PEDOT (poly(3,4-ethylenedioxythiophene)) is mainly used in antistatic applications and as a transparent conductive layer in the form of PEDOT:PSS dispersions (PSS = polystyrene sulfonic acid), polyaniline is widely used in printed circuit board manufacturing - in the final finish, for protecting copper from corrosion and preserving its solderability.[8] The new nanostructured forms of conducting polymers, in particular, provide fresh impetus to this field, with their higher surface area and better dispersability.
Quantum mechanics

Quantum mechanics, also known as quantum physics or quantum theory, is a branch of physics
providing a mathematical description of the dual particle-like and wave-like behaviour and interaction of
matter and energy.

Quantum mechanics departs from classical mechanics primarily at the atomic and sub-atomic scales, the
so-called quantum realm. In special cases some quantum mechanical processes are macroscopic, but these
emerge only at extremely low or extremely high energies or temperatures.

The term was coined by Max Planck, and derives from the observation that some physical quantities can be
changed only by discrete amounts, or quanta, as multiples of the Planck constant, rather than being capable
of varying continuously or by any arbitrary amount. For example, the angular momentum, or more
generally the action, of an electron bound into an atom or molecule is quantized. Although an unbound
electron does not exhibit quantized energy levels, one which is bound in an atomic orbital has quantized
values of angular momentum. In the context of quantum mechanics, the wave–particle duality of energy
and matter and the uncertainty principle provide a unified view of the behavior of photons, electrons and
other atomic-scale objects.
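
In symbols (standard relations, added here for concreteness, not taken from the text): the energy carried by radiation of frequency ν comes in multiples involving the Planck constant h, and in the original Bohr picture the angular momentum of a bound electron is quantized in units of ħ:

    E = h\nu, \qquad L_n = n\hbar = n\,\frac{h}{2\pi}, \quad n = 1, 2, 3, \ldots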

The mathematical formulations of quantum mechanics are abstract. Similarly, the implications are often
counter-intuitive in terms of classical physics. The centerpiece of the mathematical formulation is the
wavefunction (defined by Schrödinger's wave equation), which describes the probability amplitude of the
position and momentum of a particle. Mathematical manipulations of the wavefunction usually involve the
bra-ket notation, which requires an understanding of complex numbers and linear functionals. The
wavefunction treats the object as a quantum harmonic oscillator and the mathematics is akin to that of
acoustic resonance.

Many of the results of quantum mechanics do not have models that are easily visualized in terms of
classical mechanics; for instance, the ground state in the quantum mechanical model is a non-zero energy
state that is the lowest permitted energy state of a system, rather than a traditional classical system that is
thought of as simply being at rest with zero kinetic energy.

Fundamentally, it attempts to explain the peculiar behaviour of matter and energy at the subatomic level - an attempt which has produced more accurate results than classical physics in predicting how individual particles behave. But many unexplained anomalies remain.

Historically, the earliest versions of quantum mechanics were formulated in the first decade of the 20th
Century, around the time that atomic theory and the corpuscular theory of light as interpreted by Einstein
first came to be widely accepted as scientific fact; these latter theories can be viewed as quantum theories
of matter and electromagnetic radiation.

Following Schrödinger's breakthrough in deriving his wave equation in the mid-1920s, quantum theory was
significantly reformulated away from the old quantum theory, towards the quantum mechanics of Werner
Heisenberg, Max Born, Wolfgang Pauli and their associates, becoming a science of probabilities based
upon the Copenhagen interpretation of Niels Bohr. By 1930, the reformulated theory had been further
unified and formalized by the work of Paul Dirac and John von Neumann, with a greater emphasis placed
on measurement, the statistical nature of our knowledge of reality, and philosophical speculations about the
role of the observer.

The Copenhagen interpretation quickly became (and remains) the orthodox interpretation. However, due to the absence of conclusive experimental evidence, there are also many competing interpretations.
Quantum mechanics has since branched out into almost every aspect of physics, and into other disciplines
such as quantum chemistry, quantum electronics, quantum optics and quantum information science. Much
19th Century physics has been re-evaluated as the classical limit of quantum mechanics and its more
advanced developments in terms of quantum field theory, string theory, and speculative quantum gravity
theories.

Applications

Quantum mechanics has had enormous success in explaining many of the features of our world. The individual
behaviour of the subatomic particles that make up all forms of matter—electrons, protons, neutrons,
photons and others—can often only be satisfactorily described using quantum mechanics. Quantum
mechanics has strongly influenced string theory, a candidate for a theory of everything (see reductionism)
and the multiverse hypothesis.

Quantum mechanics is important for understanding how individual atoms combine covalently to form
chemicals or molecules. The application of quantum mechanics to chemistry is known as quantum
chemistry. (Relativistic) quantum mechanics can in principle mathematically describe most of chemistry.
Quantum mechanics can provide quantitative insight into ionic and covalent bonding processes by
explicitly showing which molecules are energetically favorable to which others, and by approximately how
much.[40] Most of the calculations performed in computational chemistry rely on quantum mechanics.[41]




[Figure: the working mechanism of a resonant tunneling diode device, based on the phenomenon of quantum tunneling through potential barriers.]

Much of modern technology operates at a scale where quantum effects are significant. Examples include
the laser, the transistor (and thus the microchip), the electron microscope, and magnetic resonance imaging.
The study of semiconductors led to the invention of the diode and the transistor, which are indispensable
for modern electronics.

Researchers are currently seeking robust methods of directly manipulating quantum states. Efforts are
being made to develop quantum cryptography, which will allow guaranteed secure transmission of
information. A more distant goal is the development of quantum computers, which are expected to perform
certain computational tasks exponentially faster than classical computers. Another active research topic is
quantum teleportation, which deals with techniques to transmit quantum information over arbitrary
distances.

Quantum tunneling is vital in many devices, even in the simple light switch, as otherwise the electrons in
the electric current could not penetrate the potential barrier made up of a layer of oxide. Flash memory
chips found in USB drives use quantum tunneling to erase their memory cells.

Quantum mechanics primarily applies to the atomic regimes of matter and energy, but some systems
exhibit quantum mechanical effects on a large scale; superfluidity (the frictionless flow of a liquid at
temperatures near absolute zero) is one well-known example. Quantum theory also provides accurate
descriptions for many previously unexplained phenomena such as black body radiation and the stability of
electron orbitals. It has also given insight into the workings of many different biological systems, including
smell receptors and protein structures.[42] Recent work on photosynthesis has provided evidence that
quantum correlations play an essential role in this most fundamental process of the plant kingdom.[43] Even
so, classical physics often can be a good approximation to results otherwise obtained by quantum physics,
typically in circumstances with large numbers of particles or large quantum numbers. (However, some
open questions remain in the field of quantum chaos.)


Introduction to quantum mechanics


Quantum mechanics is the body of scientific principles which tries to explain the behaviour of matter and
its interactions with energy on the scale of atoms and atomic particles.

Just before 1900, it became clear that classical physics was unable to explain certain phenomena. Coming
to terms with these limitations led to the development of quantum mechanics, a major revolution in
physics. This article describes how the limitations of classical physics were discovered, and describes the
main concepts of the quantum theory which replaced it in the early decades of the 20th Century.[note 1] These
concepts are described in roughly the order they were first discovered; for a more complete history of the
subject see History of quantum mechanics.

Some aspects of quantum mechanics can seem counter-intuitive, because they describe behaviour quite
different to that seen at larger length scales, where classical physics is an excellent approximation. In the
words of Richard Feynman, quantum mechanics deals with "nature as She is—absurd."[1]

Many types of energy, such as photons (discrete units of light), behave in some respects like particles and
in other respects like waves. Radiators of photons - such as neon lights - have emission spectra which are
discontinuous, in that only certain frequencies of light are present. Quantum mechanics predicts the
energies, the colours, and the spectral intensities of all forms of electromagnetic radiation.

But quantum theory ordains that the more closely one pins down one measure (such as the position of a particle), the less precise another measurement pertaining to the same particle (such as its momentum) must become. Put another way, measuring position first and then measuring momentum does not have the same outcome as measuring momentum first and then measuring position; the act of measuring the first property necessarily introduces additional energy into the micro-system being studied, thereby perturbing that system.
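
Quantitatively, this trade-off is Heisenberg's uncertainty relation (quoted in its standard form for concreteness): the product of the uncertainties in position and momentum can never drop below a fixed bound,

    \sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}

so squeezing σ_x toward zero necessarily inflates σ_p, and vice versa.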

Even more disconcerting, pairs of particles can be created as entangled twins — which means that a
measurement which pins down one property of one of the particles will instantaneously pin down the same
or another property of its entangled twin, regardless of the distance separating them — though this may be
regarded as merely a mathematical, rather than a real, anomaly.

The first quantum theory: Max Planck and black body radiation




[Figure: hot metalwork from a blacksmith. The yellow-orange glow is the visible part of the thermal radiation emitted due to the high temperature. Everything else in the picture is glowing with thermal radiation as well, but less brightly and at longer wavelengths that the human eye cannot see. A far-infrared camera will show this radiation.]

Thermal radiation is electromagnetic radiation emitted from the surface of an object due to the object's
temperature. If an object is heated sufficiently, it starts to emit light at the red end of the spectrum — it is
"red hot". Heating it further causes the colour to change, as light at shorter wavelengths (higher
frequencies) begins to be emitted. It turns out that a perfect emitter is also a perfect absorber. When it is
cold, such an object looks perfectly black, as it emits practically no visible light, because it absorbs all the
light that falls on it. Consequently, an ideal thermal emitter is known as a black body, and the radiation it
emits is called black body radiation.

In the late 19th century, thermal radiation had been fairly well characterized experimentally. The
wavelength at which the radiation is strongest is given by Wien's displacement law, and the overall power
emitted per unit area is given by the Stefan–Boltzmann law. So, as temperature increases, the glow colour
changes from red to yellow to white to blue. Even as the peak wavelength moves into the ultra-violet,
enough radiation continues to be emitted in the blue wavelengths that the body continues to appear blue. It
never becomes invisible—indeed, the radiation of visible light increases monotonically with temperature.[2]
Physicists were searching for a theoretical explanation for these experimental results.
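
The two experimental laws just mentioned have compact forms (standard expressions and constants, quoted here for reference):

    \lambda_{\mathrm{peak}} = \frac{b}{T}, \quad b \approx 2.898 \times 10^{-3}\ \mathrm{m\,K}
    \qquad\qquad
    j^{\ast} = \sigma T^{4}, \quad \sigma \approx 5.67 \times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}}

Wien's displacement law gives the wavelength at which the emission peaks, and the Stefan-Boltzmann law gives the total power radiated per unit area.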




Quantum mechanics is a mathematical theory that can describe the behavior of objects that are
roughly 10,000,000,000 times smaller than a typical human being. Quantum particles move from
one point to another as if they are waves. However, at a detector they always appear as discrete
lumps of matter. There is no counterpart to this behavior in the world that we perceive with our
own senses. One cannot rely on every-day experience to form some kind of "intuition" of how
these objects move. The intuition or "understanding" formed by the study of basic elements of
quantum mechanics is essential to grasp the behavior of more complicated quantum systems.

The approach adopted in all textbooks on quantum mechanics is that the mathematical solution
of model problems brings insight in the physics of quantum phenomena. The mathematical
prerequisites to work through these model problems are considerable. Moreover, only a few of
them can actually be solved analytically. Furthermore, the mathematical structure of the solution
is often complicated and presents an additional obstacle for building intuition.

This presentation introduces the basic concepts and fundamental phenomena of quantum physics through a combination of computer simulation and animation. The primary tool for presenting the simulation results is computer animation. Watching a quantum system evolve in time is a very effective method to get acquainted with the basic features and peculiarities of quantum mechanics. The images used to produce the computer-animated movies shown in this presentation are not created by hand but are obtained by visualization of the simulation data. The process of generating the simulation data for the movies requires the use of computers that are far more powerful than Pentium III-based PCs. At the time that these simulations were carried out (1994), most of them required the use of a supercomputer. Consequently, within this presentation, it is not possible to change the model parameters and repeat a simulation in real time.

This presentation is intended for all those who are interested in learning about the fundamentals of quantum mechanics. Some knowledge of mathematics will help but is not required to understand the basics. This presentation is not a substitute for a textbook. The presentation begins by showing the simplest examples, such as the motion of a free particle, a particle in an electric field, etc. Then the examples become more sophisticated, in the sense that one can no longer rely on one's familiarity with classical physics to describe some of the qualitative features seen in the animations. Classical notions are of no use at all for the last set of examples. However, once all the other examples have been "understood", it should be possible to "explain" the behavior of these systems as well. Instead of using a comprehensive mathematical apparatus to obtain and analyze solutions of model problems, a computer simulation technique is employed to solve these problems, including those that would otherwise prove intractable.
The introduction of the quantum
The quantum mechanical era commenced in 1900, when Max Planck postulated that everything is made up of little bits he called quanta (one quantum; two quanta). Matter had its quanta, but so did the forces that kept material objects together. Forces could only come in little steps at a time; there was no longer such a thing as the infinitely small.

Albert Einstein took matters further when he successfully described how light interacts with electrons, but it wasn't until the 1920s that things began to fall together and some fundamental rules about the world of the small were wrought almost by pure thought. The men who mined these rules were the arch beginners of Quantum Mechanics, the Breakfast Club of the modern era. Names like Pauli, Heisenberg, Schrödinger, Born, Rutherford and Bohr still put butterflies in the bellies of those of us who know what incredible work these boys - as most of them were in their twenties; they were rebels, most of them not even taken seriously - achieved. They were Europeans, struck by the depression, huddled together in tiny attics peeking into a strange new world as once the twelve spies checked out the Promised Land. Let all due kudos abound.

Believing the unbelievable

One of the toughest obstacles the early Quantum Mechanics explorers had to overcome was their own belief in determinism. Because the world of the small is so different, people had to virtually reformat the system of logic that had brought them thus far. In order to understand nature they had to let go of their intuition and embrace a completely new way of thinking. The things they discovered were fundamental rules that just were, and couldn't really be explained in terms of the large-scale world. Just as water is wet and fire is hot, quantum particles display behaviors that are inherent to them alone and can't be compared with any material object we can observe with the naked eye.

One of those fundamental rules is that everything is made up of little bits. Material objects are made up of particles, and so are the forces that keep those objects together. Light, for instance, is, besides the bright stuff that makes things visible, also a force (the so-called electromagnetic force) that keeps electrons tied to the nuclei of atoms, and atoms tied together to make molecules and, finally, objects. In Scripture Jesus is often referred to as light, and most exegetes focus on the metaphorical value of these statements. But as we realize that all forms of matter are in fact 'solidified' light (energy, as in E=mc^2) and the electromagnetic force holds all atoms together, the literal value of Paul's statement "and He is before all things, and in Him all things hold together (Col 1:17)" becomes quite compelling.

Particles are either so-called real particles, also known as fermions, or they are force particles, also known as bosons.

Quarks, which are fermions, are bound together by gluons, which are bosons. Quarks and gluons form nucleons, and nucleons bound together form the nuclei of atoms.

The electron, which is a fermion, is bound to the nucleus by photons, which are bosons. The
whole shebang together forms atoms. Atoms form molecules. Molecules form objects.

Everything that we can see, from the most distant stars to the girl next door, or this computer you are staring at, and yourself as well, is made up of a mere 3 fermions and 9 bosons. The 3 fermions are the up-quark, the down-quark and the electron. The 9 bosons are 8 gluons and 1 photon.

Like so:

Quanta → Atoms → Molecules → Objects
But the 3 fermions that make up our entire universe are not all there is. These 3 are the survivors
of a large family of elementary particles and this family is now known as the Standard Model.
What happened to the rest? Will they ever be revived?
We will learn more about the Standard Model a little further on. First we will take a look at what
quantum particles are and in which weird world they live.

(If you plan to research these matters more, we have written out the most common quantum phrases in a table for your convenience. Have a quick look at it so that you know where to find it in case you decide you need it.)
Quantum Gates and Circuits

Quantum gates would be the building blocks of quantum computers, and they could
theoretically calculate much faster than an ordinary computer for certain types of
arithmetic problems.

Crudely put, the idea is that a quantum system can achieve calculation with the
wavefunction in the "wave" mode rather than the "particle" mode. Wave phenomena
are inherently "parallel" when used as a computational tool, so you'd be doing lots
of "work" in a single computational step.

Imagine that you have a computer with a large number in it. You want to divide that number by all the numbers from 1 to 1e100 to find the one that divides into it evenly. In an ordinary computer, you would essentially try each divisor one after the next.

A quantum computer could in principle try all the divisors simultaneously. You would make a quantum "measurement" of the result that had no remainder, forcing the one calculation you wanted to see to become the manifested value.
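
The classical baseline described here is simple to write down (a toy Python sketch with a small example number; real factoring uses far better methods, and a range like 1e100 is of course not literally searchable):

    def find_divisor(n, limit):
        """Classical search: try each candidate divisor one after the next."""
        for d in range(2, limit + 1):
            if n % d == 0:
                return d              # first divisor that divides n evenly
        return None

    print(find_divisor(5959, 100))    # 59, but only after trying 2..59 in turn

The quantum proposal above amounts to replacing this sequential loop with one superposed evaluation.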

It is much like an analog computer that solves a fluid dynamics problem by direct simulation, but in the quantum computing case you still employ the methods of digital computers and gain parallelism from the indeterminacy of the quantum system.

Well, this is not really accurate. The biggest difference between a qubit and an ordinary bit is the fact that a bit is either 1 or 0. The qubit is a SUPERPOSITION of 1 and 0. So the qubit really is the 'combination' of the two possible bit-states.

The key to QM-related calculations is the fact that you don't measure one specific qubit, because all the information in this massive quantum parallelism would be gone (the superposition is broken). For example, you can 'calculate' a thousand values of any f(x) in just one step. Classically you would need 1000 calculations. Of course, you cannot just measure what outcome 926 is (i.e. the term in the 926th position in the superposition of |x>|f(x)>).

Well, you can, but then all the other terms are lost and you have no benefit from the QM approach compared to the classical one. What you can do is try to figure out mutual connections between the different terms in the superposition, like phase differences or something like that.

Further info can be found on John Preskill's webpage; just google his name. Also, look up the problem of Deutsch (a sketch follows below).
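
Since Deutsch's problem comes up here, a hedged numpy sketch of the textbook algorithm (our own toy simulation, not anything from the post): it uses exactly the kind of interference between the terms of a superposition described above to decide, with a single query, whether a one-bit function is constant or balanced:

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    def oracle(f):
        """U_f |x>|y> = |x>|y XOR f(x)> as a 4x4 permutation matrix."""
        U = np.zeros((4, 4))
        for x in (0, 1):
            for y in (0, 1):
                U[2 * x + (y ^ f(x)), 2 * x + y] = 1
        return U

    def deutsch(f):
        state = np.zeros(4)
        state[1] = 1.0                            # start in |0>|1>
        state = np.kron(H, H) @ state             # superpose both inputs
        state = oracle(f) @ state                 # a single query to f
        state = np.kron(H, np.eye(2)) @ state     # interfere the two branches
        prob_one = state[2] ** 2 + state[3] ** 2  # first qubit measured as 1
        return "balanced" if prob_one > 0.5 else "constant"

    print(deutsch(lambda x: 0))   # constant
    print(deutsch(lambda x: x))   # balanced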

marlon

								