
Sunday 17 January 2016

ARM-based processor set to ‘catalyse’ datacentre innovation, says AMD







AMD has launched the Opteron A1100 processor, which it is targeting at datacentre applications. The 64-bit device, manufactured on what the company calls an optimised 28nm process, features up to eight ARM Cortex-A57 cores, as well as twin 10Gbit Ethernet ports.
There are currently three members of the family planned – two of these will feature eight A57 cores, while the other will be a quad-core variant. One of the eight-core devices will run at 2GHz, whilst the other two parts are designed for use at 1.7GHz.

Dan Bounds, AMD’s senior director of datacentre products, described the products as the ‘first enterprise class SoCs from AMD based on ARM technology’. “They will be a turning point and a catalyst for datacentre innovation,” he contended. “The devices provide a legitimate choice when it comes to optimising workloads in datacentres and, by driving new thinking, we can help the industry innovate.”
Other features of the A1100 range include a shared 4Mbyte level 2 cache, a shared 8Mbyte level 3 cache and twin 64-bit DDR3/DDR4 channels, supporting data rates of up to 1866MHz with ECC. There will also be eight PCI-Express Gen 3 lanes and 14 SATA-3 ports.

Author
Graham Pitcher

Military is the New Industrial








In my 30-plus years of engineering, I have noticed a change that has gradually crept into the design metrics of military hardware: “Cost is important!” Where the mantra was once “it must meet these requirements at all cost,” that rule now applies only in certain circumstances.
For several years, there’s been a push to migrate ceramic (expensive) devices to plastic industrial devices. In applications such as guided munitions, radios, military vehicles, and other non-strategic applications, industrial-grade devices are now commonplace. In general, the concept is excellent, since many benefits derive from this migration. The major advantage, of course, is cost.


Plastic industrial devices are less expensive than ceramic 38510- or 883-grade standard military components. In many applications, hermetic integrity is no more a requirement than it would be for a commercial device in a consumer product. Also, the standard temperature range for industrial devices, –40 to +85°C, is adequate in many military applications.
However, one thing that can jeopardize this philosophy has nothing to do with device performance. Rather, it is procurement, and it centers on the problem of obsolescence, which for military applications, where incremental orders are placed over a period of 20 years, can be a grave problem for the supplier. Once a weapon system is certified by the U.S. Department of Defense (DoD), any changes may require recertification, and that can be extremely expensive. It’s interesting to note that industrial applications have the same issue, especially medical applications with hardware certified by the U.S. Food and Drug Administration (FDA) or another governing body.

To fix this issue, you either need to use recently released devices in your design, or get some guarantee from the chip supplier that the device will be around long enough for you to retire before having to redesign your system. I favor the latter, since I only want to design something once… I’m funny that way. The good news is that some programs in the industry can provide help.
One program is the automotive AEC-Q100 qualification for devices. Automotive-qualified devices typically have tighter specifications and some guarantee that supply will be available for at least seven years. They come at a premium, but typically it’s a slight increase in cost for the benefit of a known life cycle.

Another program, Texas Instruments’ Enhanced Products (EP) portfolio, has similar advantages to the Q100 program. However, it adds enhancements for obsolescence mitigation, which extend the lifespan of the products. This can really help prevent the redesign of systems that will be in service over extremely long life cycles.
So if you want your next military (or industrial) design to stay in production for many years, search out Q100 or EP devices. Then you can look forward to your retirement without getting a call back to redesign your equipment! Till next time…




by Richard F. Zarr

Funny Inventions with Images


Foot Powered Bike



Pizza Scissors



Mix Sticks



Water Gun Umbrella



Din-Ink Pen Cap Eating Utensils



Corner Frames



Baby Shower Cap



Piano Doorbell




Full Body Umbrella


Shower Mic



Forget-me-not Kid Mittens




LED Slippers


5 new inventions




The quantum source of space-time





Warner Bros. Entertainment/Paramount Pictures

Black holes such as the one depicted in Interstellar (2014) can be connected by wormholes, which might have quantum origins.
In early 2009, determined to make the most of his first sabbatical from teaching, Mark Van Raamsdonk decided to tackle one of the deepest mysteries in physics: the relationship between quantum mechanics and gravity. After a year of work and consultation with colleagues, he submitted a paper on the topic to the Journal of High Energy Physics.
In April 2010, the journal sent him a rejection — with a referee’s report implying that Van Raamsdonk, a physicist at the University of British Columbia in Vancouver, was a crackpot.
His next submission, to General Relativity and Gravitation, fared little better: the referee’s report was scathing, and the journal’s editor asked for a complete rewrite.
But by then, Van Raamsdonk had entered a shorter version of the paper into a prestigious annual essay contest run by the Gravity Research Foundation in Wellesley, Massachusetts. Not only did he win first prize, but he also got to savour a particularly satisfying irony: the honour included guaranteed publication in General Relativity and Gravitation. The journal published the shorter essay in June 2010.
Still, the editors had good reason to be cautious. A successful unification of quantum mechanics and gravity has eluded physicists for nearly a century. Quantum mechanics governs the world of the small — the weird realm in which an atom or particle can be in many places at the same time, and can simultaneously spin both clockwise and anticlockwise. Gravity governs the Universe at large — from the fall of an apple to the motion of planets, stars and galaxies — and is described by Albert Einstein’s general theory of relativity, announced 100 years ago this month. The theory holds that gravity is geometry: particles are deflected when they pass near a massive object not because they feel a force, said Einstein, but because space and time around the object are curved.
Both theories have been abundantly verified through experiment, yet the realities they describe seem utterly incompatible. And from the editors’ standpoint, Van Raamsdonk’s approach to resolving this incompatibility was strange. All that’s needed, he asserted, is ‘entanglement’: the phenomenon that many physicists believe to be the ultimate in quantum weirdness. Entanglement lets the measurement of one particle instantaneously determine the state of a partner particle, no matter how far away it may be — even on the other side of the Milky Way.
Einstein loathed the idea of entanglement, and famously derided it as “spooky action at a distance”. But it is central to quantum theory. And Van Raamsdonk, drawing on work by like-minded physicists going back more than a decade, argued for the ultimate irony — that, despite Einstein’s objections, entanglement might be the basis of geometry, and thus of Einstein’s geometric theory of gravity. “Space-time,” he says, “is just a geometrical picture of how stuff in the quantum system is entangled.”
“I had understood something that no one had understood before.”
This idea is a long way from being proved, and is hardly a complete theory of quantum gravity. But independent studies have reached much the same conclusion, drawing intense interest from major theorists. A small industry of physicists is now working to expand the geometry–entanglement relationship, using all the modern tools developed for quantum computing and quantum information theory.

“I would not hesitate for a minute,” says physicist Bartłomiej Czech of Stanford University in California, “to call the connections between quantum theory and gravity that have emerged in the last ten years revolutionary.”

Gravity without gravity

Much of this work rests on a discovery announced in 1997 by physicist Juan Maldacena, now at the Institute for Advanced Study in Princeton, New Jersey. Maldacena’s research had led him to consider the relationship between two seemingly different model universes. One is a cosmos similar to our own. Although it neither expands nor contracts, it has three dimensions, is filled with quantum particles and obeys Einstein’s equations of gravity. Known as anti-de Sitter space (AdS), it is commonly referred to as the bulk. The other model is also filled with elementary particles, but it has one dimension fewer and doesn’t recognize gravity. Commonly known as the boundary, it is a mathematically defined membrane that lies an infinite distance from any given point in the bulk, yet completely encloses it, much like the 2D surface of a balloon enclosing a 3D volume of air. The boundary particles obey the equations of a quantum system known as conformal field theory (CFT).
Maldacena discovered that the boundary and the bulk are completely equivalent. Like the 2D circuitry of a computer chip that encodes the 3D imagery of a computer game, the relatively simple, gravity-free equations that prevail on the boundary contain the same information and describe the same physics as the more complex equations that rule the bulk.
“It’s kind of a miraculous thing,” says Van Raamsdonk. Suddenly, he says, Maldacena’s duality gave physicists a way to think about quantum gravity in the bulk without thinking about gravity at all: they just had to look at the equivalent quantum state on the boundary. And in the years since, so many have rushed to explore this idea that Maldacena’s paper is now one of the most highly cited articles in physics.

Among the enthusiasts was Van Raamsdonk, who started his sabbatical by pondering one of the central unsolved questions posed by Maldacena’s discovery: exactly how does a quantum field on the boundary produce gravity in the bulk? There had already been hints that the answer might involve some sort of relation between geometry and entanglement. But it was unclear how significant these hints were: all the earlier work on this idea had dealt with special cases, such as a bulk universe that contained a black hole. So Van Raamsdonk decided to settle the matter, and work out whether the relationship was true in general, or was just a mathematical oddity.
He first considered an empty bulk universe, which corresponded to a single quantum field on the boundary. This field, and the quantum relationships that tied various parts of it together, contained the only entanglement in the system. But now, Van Raamsdonk wondered, what would happen to the bulk universe if that boundary entanglement were removed?
He was able to answer that question using mathematical tools introduced in 2006 by Shinsei Ryu, now at the University of Illinois at Urbana–Champaign, and Tadashi Takayanagi, now at the Yukawa Institute for Theoretical Physics at Kyoto University in Japan. Their equations allowed him to model a slow and methodical reduction in the boundary field’s entanglement, and to watch the response in the bulk, where he saw space-time steadily elongating and pulling apart (see ‘The entanglement connection’). Ultimately, he found, reducing the entanglement to zero would break the space-time into disjointed chunks, like chewing gum stretched too far.
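Ryu and Takayanagi’s prescription can be stated compactly. Their 2006 formula (a standard result, quoted here for reference; it is not spelled out in the article) says that the entanglement entropy S_A of a region A of the boundary equals the area of the minimal surface γ_A hanging into the bulk from A’s edge, measured in units of Newton’s constant:

$$ S_A = \frac{\mathrm{Area}(\gamma_A)}{4 G_N} $$

Dialling the boundary entanglement down shrinks and eventually severs these surfaces, which is the elongation and pinching-off Van Raamsdonk observed.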

(Figure: ‘The entanglement connection’ — Nik Spencer/Nature)
The geometry–entanglement relationship was general, Van Raamsdonk realized. Entanglement is the essential ingredient that knits space-time together into a smooth whole — not just in exotic cases with black holes, but always.
“I felt that I had understood something about a fundamental question that perhaps nobody had understood before,” he recalls: “Essentially, what is space-time?”

Entanglement and Einstein


Quantum entanglement as geometric glue — this was the essence of Van Raamsdonk’s rejected paper and winning essay, and an idea that has increasingly resonated among physicists. No one has yet found a rigorous proof, so the idea still ranks as a conjecture. But many independent lines of reasoning support it.
In 2013, for example, Maldacena and Leonard Susskind of Stanford published a related conjecture that they dubbed ER = EPR, in honour of two landmark papers from 1935. ER, by Einstein and American-Israeli physicist Nathan Rosen, introduced what is now called a wormhole: a tunnel through space-time connecting two black holes. (No real particle could actually travel through such a wormhole, science-fiction films notwithstanding: that would require moving faster than light, which is impossible.) EPR, by Einstein, Rosen and American physicist Boris Podolsky, was the first paper to clearly articulate what is now called entanglement.
Maldacena and Susskind’s conjecture was that these two concepts are related by more than a common publication date. If any two particles are connected by entanglement, the physicists suggested, then they are effectively joined by a wormhole. And vice versa: the connection that physicists call a wormhole is equivalent to entanglement. They are different ways of describing the same underlying reality.
No one has a clear idea of what this underlying reality is. But physicists are increasingly convinced that it must exist. Maldacena, Susskind and others have been testing the ER = EPR hypothesis to see if it is mathematically consistent with everything else that is known about entanglement and wormholes — and so far, the answer is yes.

Hidden connections


Other lines of support for the geometry–entanglement relationship have come from condensed-matter physics and quantum information theory: fields in which entanglement already plays a central part. This has allowed researchers from these disciplines to attack quantum gravity with a whole array of fresh concepts and mathematical tools.
Tensor networks, for example, are a technique developed by condensed-matter physicists to track the quantum states of huge numbers of subatomic particles. Brian Swingle was using them in this way in 2007, when he was a graduate student at the Massachusetts Institute of Technology (MIT) in Cambridge, calculating how groups of electrons interact in a solid material. He found that the most useful network for this purpose started by linking adjacent pairs of electrons, which are most likely to interact with each other, then linking larger and larger groups in a pattern that resembled the hierarchy of a family tree. But then, during a course in quantum field theory, Swingle learned about Maldacena’s bulk–boundary correspondence and noticed an intriguing pattern: the mapping between the bulk and the boundary showed exactly the same tree-like network.
“You can think of space as being built from entanglement.”
Swingle wondered whether this resemblance might be more than just coincidence. And in 2012, he published calculations showing that it was: he had independently reached much the same conclusion as Van Raamsdonk, thereby adding strong support to the geometry–entanglement idea. “You can think of space as being built from entanglement in this very precise way using the tensors,” says Swingle, who is now at Stanford and has seen tensor networks become a frequently used tool to explore the geometry–entanglement correspondence.
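To make the tree picture concrete, here is a minimal numpy sketch (the tensors, bond dimension and names are illustrative inventions, not Swingle’s actual construction): a tiny tree tensor network over four qubits, where the geometry of the network caps how much entanglement can cross a cut through it.

```python
import numpy as np

rng = np.random.default_rng(0)
chi = 2  # bond dimension: the 'thickness' of each link in the tree

# A tree over four qubits: two branch tensors joined by a top link
top   = rng.normal(size=(chi, chi))    # links the two branches
left  = rng.normal(size=(chi, 2, 2))   # branch covering qubits 0,1
right = rng.normal(size=(chi, 2, 2))   # branch covering qubits 2,3

# Contract the network into a full 4-qubit state psi[i,j,k,l]
psi = np.einsum('ab,aij,bkl->ijkl', top, left, right)
psi /= np.linalg.norm(psi)

# Entanglement across the middle cut is limited by the single link
# (dimension chi) that the cut severs: entropy <= log2(chi) = 1 bit
m = psi.reshape(4, 4)                  # split qubits (0,1) | (2,3)
p = np.linalg.svd(m, compute_uv=False) ** 2
entropy = -np.sum(p * np.log2(p + 1e-12))
print(f"entropy across the cut: {entropy:.3f} bits (bound: {np.log2(chi):.1f})")
```

The point of the toy model is the bound: how much entanglement a region can carry is set by how many network links its boundary cuts, which is exactly the area-law flavour of the Ryu–Takayanagi formula above.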
Another prime example of cross-fertilization is the theory of quantum error-correcting codes, which physicists invented to aid the construction of quantum computers. These machines encode information not in bits but in ‘qubits’: quantum states, such as the up or down spin of an electron, that can take on values of 1 and 0 simultaneously. In principle, when the qubits interact and become entangled in the right way, such a device could perform calculations that an ordinary computer could not finish in the lifetime of the Universe. But in practice, the process can be incredibly fragile: the slightest disturbance from the outside world will disrupt the qubits’ delicate entanglement and destroy any possibility of quantum computation.
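As a concrete picture of entangled qubits, here is a short numpy sketch (illustrative only) building the standard Bell pair and showing the perfectly correlated measurement statistics the paragraph alludes to:

```python
import numpy as np

# Single-qubit basis states |0> and |1>
zero = np.array([1.0, 0.0])
one  = np.array([0.0, 1.0])

# Bell state (|00> + |11>)/sqrt(2): the textbook entangled pair
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Born-rule probabilities of the four joint outcomes
probs = np.abs(bell) ** 2
print(dict(zip(["00", "01", "10", "11"], probs.round(2))))
# {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}
# Each qubit alone is a 50/50 coin, yet the two always agree:
# measuring one instantly fixes what the other will show.
```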
That need inspired quantum error-correcting codes, numerical strategies that repair corrupted correlations between the qubits and make the computation more robust. One hallmark of these codes is that they are always ‘non-local’: the information needed to restore any given qubit has to be spread out over a wide region of space. Otherwise, damage in a single spot could destroy any hope of recovery. And that non-locality, in turn, accounts for the fascination that many quantum information theorists feel when they first encounter Maldacena’s bulk–boundary correspondence: it shows a very similar kind of non-locality. The information that corresponds to a small region of the bulk is spread over a vast region of the boundary.
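A toy version of such a code is easy to simulate. The sketch below (a minimal illustration, not any production scheme) uses the three-qubit repetition code: the logical bit lives in the correlations among all three qubits, so flipping any single qubit can be undone by majority vote — the non-locality the paragraph describes. A real quantum code measures error syndromes without reading out the data; the majority vote here is the classical shadow of that.

```python
import numpy as np

# Three-qubit repetition code: |0>_L = |000>, |1>_L = |111>
alpha, beta = 0.6, 0.8                      # logical qubit alpha|0>_L + beta|1>_L
logical = np.zeros(8, dtype=complex)
logical[0b000], logical[0b111] = alpha, beta

# Bit-flip error on the middle qubit: XOR each basis label with 010
flipped = np.array([logical[i ^ 0b010] for i in range(8)])

# Recovery by majority vote over each basis label's three bits;
# no single qubit had to carry the logical information alone
recovered = np.zeros_like(logical)
for i in range(8):
    ones = bin(i).count("1")
    recovered[0b111 if ones >= 2 else 0b000] += flipped[i]

print(np.allclose(recovered, logical))      # True: the logical state survives
```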

“Anyone could look at AdS–CFT and say that it’s sort of vaguely analogous to a quantum error-correcting code,” says Scott Aaronson, a computer scientist at MIT. But in work published in June, physicists led by Daniel Harlow at Harvard University in Cambridge and John Preskill of the California Institute of Technology in Pasadena argue for something stronger: that the Maldacena duality is itself a quantum error-correcting code. They have demonstrated that this is mathematically correct in a simple model, and are now trying to show that the assertion holds more generally.
“People have been saying for years that entanglement is somehow important for the emergence of the bulk,” says Harlow. “But for the first time, I think we are really getting a glimpse of how and why.”

Beyond entanglement

That prospect seems to be enticing for the Simons Foundation, a philanthropic organization in New York City that announced in August that it would provide US$2.5 million per year for at least 4 years to help researchers to move forward on the gravity–quantum information connection. “Information theory provides a powerful way to structure our thinking about fundamental physics,” says Patrick Hayden, the Stanford physicist who is directing the programme. He adds that the Simons sponsorship will support 16 main researchers at 14 institutions worldwide, along with students, postdocs and a series of workshops and schools. Ultimately, one major goal is to build up a comprehensive dictionary for translating geometric concepts into quantum language, and vice versa. This will hopefully help physicists to find their way to the complete theory of quantum gravity.
Still, researchers face several challenges. One is that the bulk–boundary correspondence does not apply in our Universe, which is neither static nor bounded; it is expanding and apparently infinite. Most researchers in the field do think that calculations using Maldacena’s correspondence are telling them something true about the real Universe, but there is little agreement as yet on exactly how to translate results from one regime to the other.
Another challenge is that the standard definition of entanglement refers to particles only at a given moment. A complete theory of quantum gravity will have to add time to that picture. “Entanglement is a big piece of the story, but it’s not the whole story,” says Susskind.
He thinks physicists may have to embrace another concept from quantum information theory: computational complexity, the number of logical steps, or operations, needed to construct the quantum state of a system. A system with low complexity is analogous to a quantum computer with almost all the qubits on zero: it is easy to define and to build. One with high complexity is analogous to a set of qubits encoding a number that would take aeons to compute.
Susskind’s road to computational complexity began about a decade ago, when he noticed that a solution to Einstein’s equations of general relativity allowed a wormhole in AdS space to get longer and longer as time went on. What, he wondered, did that correspond to on the boundary? What was changing there? Susskind knew that it couldn’t be entanglement, because the correlations that produce entanglement between different particles on the boundary reach their maximum in less than a second. In an article last year, however, he and Douglas Stanford, now at the Institute for Advanced Study, showed that as time progressed, the quantum state on the boundary would vary in exactly the way expected from computational complexity.

“It appears more and more that the growth of the interior of a black hole is exactly the growth of computational complexity,” says Susskind. If quantum entanglement knits together pieces of space, he says, then computational complexity may drive the growth of space — and thus bring in the elusive element of time. One potential consequence, which he is just beginning to explore, could be a link between the growth of computational complexity and the expansion of the Universe. Another is that, because the insides of black holes are the very regions where quantum gravity is thought to dominate, computational complexity may have a key role in a complete theory of quantum gravity.
Despite the remaining challenges, there is a sense among the practitioners of this field that they have begun to glimpse something real and very important. “I didn’t know what space was made of before,” says Swingle. “It wasn’t clear that question even had meaning.” But now, he says, it is becoming increasingly apparent that the question does make sense. “And the answer is something that we understand,” says Swingle. “It’s made of entanglement.”

Quantum Relativity

Quantum Relativity: Space, Time, and Gravity in a Quantum Universe, by Mark Lawrence

(Portraits: Sir Isaac Newton, 1689; Albert Einstein, about 1947)




Quantum Relativity is part of the on-going story of man's attempt to understand the rules of the universe, particularly the laws of gravity.

Gravity

In 1686, Sir Isaac Newton published his great work, Philosophiae Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy). In this book, Newton presented his theory of gravity, the first mathematical theory of gravity ever. To create it, Newton first had to invent a new form of mathematics: calculus.
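For reference (the formula is not in the original text, but it is the standard statement of Newton’s law), the attraction between two masses m1 and m2 a distance r apart is

$$ F = \frac{G\, m_1 m_2}{r^2}, \qquad G \approx 6.674 \times 10^{-11}\ \mathrm{N\,m^2\,kg^{-2}} $$

so doubling the distance cuts the force to a quarter.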
Newton was never satisfied with his theory. The idea that the Earth pulls on the Moon with no visible or mediating agent is called action at a distance. Newton never thought this idea was credible, but he was unable to find any alternative.
Newton's theory of gravity is quite good - NASA uses it almost exclusively for all orbital calculations, and it works just fine. However, very small numerical problems with his theory were found over the years. Also, Newton's theory was heavily criticised on philosophical grounds. Newton's theory presumed there were special observers, called "inertial observers," who were the only ones to see the laws of physics in their pure form.

Quantum Mechanics

In 1899, Max Planck first introduced the quantum hypothesis. Up until Planck, it was thought that anything could be divided into smaller and smaller segments without limit, but still retaining the basic characteristics of the substance. It's hard to understand this from today's viewpoint, over 100 years later, but in 1899 the idea of atoms was still controversial. Planck introduced the notion that the electro-magnetic field could only be made up of small indivisible units.
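Planck’s hypothesis can be stated in one line (added here for reference): radiation of frequency ν exchanges energy only in whole lumps of

$$ E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \mathrm{J\,s} $$

where h, Planck’s constant, sets the size of the indivisible unit.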
Quantum mechanics is a very strange theory, based on the observation that atomic particles often can be in one place or another, but not in between. Quantum mechanics was developed by several people, culminating in a comprehensive theory released by Werner Heisenberg in 1925 and Erwin Schroedinger in 1926. However, almost immediately it was realized that quantum mechanics was an incomplete theory. Quantum mechanics could not accurately account for electro-magnetic forces, and quantum mechanics violated the laws of special relativity.
In 1948, quantum mechanics was replaced by a new theory simultaneously developed by Richard Feynman, Sin-Itiro Tomonaga, and Julian Schwinger, called Quantum Field Theory. Quantum Field Theory remains today our best theory of electro-magnetic forces, and is our current theory for the nuclear force and the radioactive force, more often called the strong and weak forces.

Relativity

In 1905, Albert Einstein shocked the world with three papers. Before he published these papers, Einstein was a clerk in the Swiss Patent office - he had graduated from college with a bachelor's (4-year) degree, but his professors considered him a rather indifferent student, not talented enough to warrant a position in graduate school to pursue a doctorate. Einstein's three papers were:
  • Brownian Motion - after this paper was published, everyone agreed that matter was made up of atoms. The atomic theory is perhaps the most fundamental part of quantum mechanics.
  • The Photo-Electric effect - in this paper, Einstein proposed that light itself comes in discrete quanta, the particles we now call "photons," and put us firmly on the road to quantum mechanics.
  • Special Relativity - in this paper, Einstein explained that the speed of light is an absolute constant. Everyone who measures the speed of light will get the same number, regardless of how fast they are moving and how fast the light source is moving, and nothing can go faster than light. Special relativity tells us that space and time do not exist as separate entities, as Newton thought, but rather as one union, which we call space-time (made concrete in the interval formula after this list).
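The "one union" of space and time has a precise meaning (the formula below is the standard one, added for reference): observers moving at different speeds disagree about distances and durations separately, but all of them compute the same space-time interval between two events,

$$ s^2 = c^2\,\Delta t^2 - \Delta x^2 - \Delta y^2 - \Delta z^2 $$

and it is this shared quantity, not space or time alone, that is physically meaningful.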
Any one of these three papers would have been enough to ensure that Einstein became known as a superb physicist. The three papers published in one summer were enough to set him apart as someone special. However, Einstein was by no means done creating.
Einstein realized almost immediately that his theory of Special Relativity had a serious flaw: gravity could not co-exist with his new theory of space and time. So, almost immediately Einstein set out to find a new theory of gravity, a theory to replace Newton's. In 1916, 11 years after special relativity and 230 years after Newton, Einstein published his theory of gravity, the General Theory of Relativity. In order to create this theory of gravity, Einstein had to change our notions of space and time yet again. Einstein had to postulate that we lived in a curved space-time, just as we live on the curved surface of the Earth. Einstein showed that there were no such things as Newton's inertial observers. Also, General Relativity is what is called a field theory, so Newton's spooky action at a distance was also gone.
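The heart of general relativity fits in one equation (quoted here for reference; it is not spelled out in the text). Einstein's field equations relate the curvature of space-time on the left to the matter and energy filling it on the right:

$$ G_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu} $$

In John Wheeler's famous summary: matter tells space-time how to curve, and curved space-time tells matter how to move.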
Just as Newton was never satisfied with his theory of gravity, Einstein was never satisfied with General Relativity. Einstein was disturbed by two problems: he believed that there should be just one theory to account for both gravity and electro-magnetism, and he believed that this "unified field" theory should get rid of quantum mechanics. Although Einstein himself helped create quantum mechanics, he hated quantum mechanics until his death. One interpretation of quantum mechanics is that everything is uncertain, and everything is fundamentally governed by the laws of probability. Einstein particularly despised this notion, frequently asserting "God does not throw dice!"

Quantum Relativity

Gravity as we currently understand it cannot be reconciled with the laws of quantum mechanics. Since 1930, people have tried to invent a theory of quantum gravity. I believe Enrico Fermi was the first to propose a theory of quantum gravity, in 1931. However, Fermi's theory predicted that all forces were infinite, and therefore the universe could not exist. Most physicists think the universe does in fact exist, so it was thought that the theory of quantum gravity had some serious problems.
Shortly after quantum field theory was invented, people started trying to invent a quantum field theory of gravity. Very quickly, it was shown that this is impossible: there can be no theory of gravity which obeys the rules of quantum field theory. The quantum theory of fields simply will not work for a force with the properties of gravity. It was recognized that a completely new type of theory was required. Since this theory does not currently exist, no one is certain exactly what it looks like. However, most people presume we need a new theory of space and time which will be compatible with the laws of quantum mechanics as we know them, and somehow allow a theory of quantum gravity to exist. This new theory of space and time is often called Quantum Relativity.
This web site is all about these theories.

Saturday 16 January 2016

Artificial Intelligence

 


Artificial Intelligence, or AI, is the ability of a computer to act like a human being. It has several applications, including software simulations and robotics. However, artificial intelligence is most commonly used in video games, where the computer is made to act as another player.

Nearly all video games include some level of artificial intelligence. The most basic type of AI produces characters that move in standard formations and perform predictable actions. More advanced artificial intelligence enables computer characters to act unpredictably and make different decisions based on a player's actions. For example, in a first-person shooter (FPS), an AI opponent may hide behind a wall while the player is facing him. When the player turns away, the AI opponent may attack. In modern video games, multiple AI opponents can even work together, making the gameplay even more challenging.
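A sketch of how such an opponent might be coded, as a simple state machine. Everything here (the state names, the difficulty table, the update function) is invented for illustration and not taken from any particular game engine:

```python
import random
from enum import Enum, auto

class State(Enum):
    PATROL = auto()
    HIDE = auto()
    ATTACK = auto()

# Hypothetical difficulty knob: how often the opponent exploits
# the player's turned back (higher = harder)
AGGRESSION = {"easy": 0.3, "medium": 0.6, "hard": 0.9}

def update(state, player_visible, player_facing_me, difficulty="medium"):
    """One decision tick of a minimal FPS opponent."""
    if not player_visible:
        return State.PATROL                 # nothing to react to
    if player_facing_me:
        return State.HIDE                   # take cover while watched
    if random.random() < AGGRESSION[difficulty]:
        return State.ATTACK                 # strike when the player turns away
    return state                            # otherwise keep current behaviour

# Example: a 'medium' opponent sees the player look away
print(update(State.HIDE, player_visible=True, player_facing_me=False))
```

The randomness is what keeps the behaviour from being fully predictable, and tuning a knob like AGGRESSION is one simple way the Easy/Medium/Hard balancing discussed below could be implemented.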

Artificial intelligence is used in a wide range of video games, including board games, side-scrollers, and 3D action games. AI also plays a large role in sports games, such as football, soccer, and basketball games. Since the competition is only as good as the computer's artificial intelligence, the AI is a crucial aspect of a game's playability. Games that lack a sophisticated and dynamic AI are easy to beat and therefore are less fun to play. If the artificial intelligence is too good, a game might be impossible to beat, which would be discouraging for players. Therefore, video game developers often spend a long time creating the perfect balance of artificial intelligence to make the games both challenging and fun to play. Most games also include different difficulty levels, such as Easy, Medium, and Hard, which allow players to select an appropriate level of artificial intelligence to play against.

The History Of Electronics

           




The history of electronics is a story of the twentieth century and three key components—the vacuum tube, the transistor, and the integrated circuit. In 1883, Thomas Alva Edison discovered that electrons will flow from one metal conductor to another through a vacuum. This discovery of conduction became known as the Edison effect. In 1904, John Fleming applied the Edison effect in inventing a two-element electron tube called a diode, and Lee De Forest followed in 1906 with the three-element tube, the triode. These vacuum tubes made it possible to manipulate electrical energy so that it could be amplified and transmitted.


The first applications of electron tubes were in radio communications. Guglielmo Marconi pioneered the development of the wireless telegraph in 1896 and long-distance radio communication in 1901. Early radio consisted of either radio telegraphy (the transmission of Morse code signals) or radio telephony (voice messages). Both relied on the triode and made rapid advances thanks to armed forces communications during World War I. Early radio transmitters, telephones, and telegraphs used high-voltage sparks to make waves and sound. Vacuum tubes strengthened weak audio signals and allowed these signals to be superimposed on radio waves. In 1918, Edwin Armstrong invented the "super-heterodyne receiver" that could select among radio signals or stations and could receive distant signals. Radio broadcasting grew astronomically in the 1920s as a direct result. Armstrong also invented wide-band frequency modulation (FM) in 1935; only AM or amplitude modulation had been used from 1920 to 1935.
Communications technology was able to make huge advances before World War II as more specialized tubes were made for many applications. Radio as the primary form of education and entertainment was soon challenged by television, which was invented in the 1920s but didn't become widely available until 1947. Bell Laboratories publicly unveiled the television in 1927, and its first forms were electromechanical. When an electronic system was proved superior, Bell Labs engineers introduced the cathode ray picture tube and color television. But Vladimir Zworykin, an engineer with the Radio Corporation of America (RCA), is considered the "father of the television" because of his inventions, the picture tube and the iconoscope camera tube.
Development of the television as an electronic device benefitted from many improvements made to radar during World War II. Radar was the product of studies by a number of scientists in Britain of the reflection of radio waves. An acronym for RAdio Detection And Ranging, radar measures the distance and direction to an object using echoes of radio microwaves. It is used for aircraft and ship detection, control of weapons firing, navigation, and other forms of surveillance. Circuitry, video, pulse technology, and microwave transmission improved in the wartime effort and were adopted immediately by the television industry. By the mid-1950s, television had surpassed radio for home use and entertainment.
After the war, electron tubes were used to develop the first computers, but they were impractical because of the sizes of the electronic components. In 1947, the transistor was invented by a team of engineers from Bell Laboratories. John Bardeen, Walter Brattain, and William Shockley received a Nobel prize for their creation, but few could envision how quickly and dramatically the transistor would change the world. The transistor functions like the vacuum tube, but it is tiny by comparison, weighs less, consumes less power, is much more reliable, and is cheaper to manufacture with its combination of metal contacts and semiconductor materials.
The concept of the integrated circuit was proposed in 1952 by Geoffrey W. A. Dummer, a British electronics expert with the Royal Radar Establishment. Throughout the 1950s, transistors were mass produced on single wafers and cut apart. The total semiconductor circuit was a simple step away from this; it combined transistors and diodes (active devices) and capacitors and resistors (passive devices) on a planar unit or chip. The semiconductor industry and the silicon integrated circuit (SIC) evolved simultaneously at Texas Instruments and Fairchild Semiconductor Company. By 1961, integrated circuits were in full production at a number of firms, and designs of equipment changed rapidly and in several directions to adapt to the technology. Bipolar transistors and digital integrated circuits were made first, but analog ICs, large-scale integration (LSI), and very-large-scale integration (VLSI) followed by the mid-1970s. VLSI consists of thousands of circuits with on-and-off switches or gates between them on a single chip. Microcomputers, medical equipment, video cameras, and communication satellites are only examples of devices made possible by integrated circuits.
History of Computers

Victorian Steam Powered Computer
The computer was born not for entertainment or email but out of a need to solve a serious number-crunching crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card based computers that took up entire rooms.
Today, we carry more computing power on our smartphones than was available in these early models. The following brief history of computing is a timeline of how computers evolved from their humble beginnings to the machines of today that surf the Internet, play games and stream multimedia in addition to crunching numbers.
 

1801: In France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1822: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. The project, funded by the English government, is a failure. More than a century later, however, the world’s first computer was actually built.
1890: Herman Hollerith designs a punch card system to calculate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.
1936: Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.
1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts or shafts.
1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information on its main memory.
1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Calculator (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.
1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.
1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discovered how to make an electric switch with solid materials and no need for a vacuum. 
1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.
1954: The FORTRAN programming language is born.
1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.
1964: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public.
1969: A group of developers at Bell Labs produce UNIX, an operating system that addressed compatibility issues. Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities. Due to the slow nature of the system, it never quite gained traction among home PC users.
1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.
1971: Alan Shugart leads a team of IBM engineers who invent the “floppy disk,” allowing data to be shared among computers.
1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.
1974-1977: A number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, RadioShack's TRS-80 — affectionately known as the "Trash 80" — and the Commodore PET.
1975: The January issue of Popular Electronics magazine features the Altair 8080, described as the "world's first minicomputer kit to rival commercial models." Two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft. 
1976: Steve Jobs and Steve Wozniak start Apple Computer on April Fools' Day and roll out the Apple I, the first computer with a single-circuit board.

TRS-80

1977: Radio Shack's initial production run of the TRS-80 was just 3,000. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wished.
1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.
1978: Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program.
1979: Word processing becomes a reality as MicroPro International releases WordStar.


1981: The first IBM personal computer, code-named “Acorn,” is introduced. It uses Microsoft’s MS-DOS operating system. It has an Intel chip, two floppy disks and an optional color monitor. Sears & Roebuck and Computerland sell the machines, marking the first time a computer is available through outside distributors. It also popularizes the term PC.
1983: Apple’s Lisa is the first personal computer with a GUI. It also features a drop-down menu and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor and the first to be marketed as a “laptop.”
1985: Microsoft announces Windows, its response to Apple’s GUI. Commodore unveils the Amiga 1000, which features advanced audio and video capabilities.
1985: The first dot-com domain name is registered on March 15, years before the World Wide Web would mark the formal beginning of Internet history. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered.
1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides speed comparable to that of mainframes.
1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web.
1993: The Pentium microprocessor advances the use of graphics and music on PCs.
1994: PCs become gaming machines as "Command & Conquer," "Alone in the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure" are among the games to hit the market.
1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.
1997: Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple’s court case against Microsoft in which it alleged that Microsoft copied the “look and feel” of its operating system.
1999: The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires.
2001: Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned GUI.
2003: The first 64-bit processor, AMD’s Athlon 64, becomes available to the consumer market.
2004: Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches.
2005: YouTube, a video sharing service, is founded. Google acquires Android, a Linux-based mobile phone operating system.
2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo’s Wii game console hits the market.
2007: The iPhone brings many computer functions to the smartphone.
2009: Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features.
2010: Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant tablet computer segment.
2011: Google releases the Chromebook, a laptop that runs the Google Chrome OS.
2012: Facebook reaches 1 billion users on October 4.
2015: Apple releases the Apple Watch. Microsoft releases Windows 10.