
Thursday, 21 January 2016

History of Robotics





Although the science of robotics only came about in the 20th century, the history of human-invented automation has a much lengthier past. In fact, the ancient Greek engineer Hero of Alexandria produced two texts, Pneumatica and Automata, that testify to the existence of hundreds of different kinds of “wonder” machines capable of automated movement. Of course, robotics in the 20th and 21st centuries has advanced radically to include machines capable of assembling other machines and even robots that can be mistaken for human beings.
The word robotics was inadvertently coined by science fiction author Isaac Asimov in his 1941 story “Liar!” Science fiction authors throughout history have been interested in humanity’s ability to produce self-moving machines and lifeforms, from the ancient Greek myth of Pygmalion to Mary Shelley’s Dr. Frankenstein and Arthur C. Clarke’s HAL 9000. Essentially, a robot is a re-programmable machine that is capable of movement in the completion of a task. Robots use special programming that differentiates them from other machines and machine tools, such as CNC equipment. Robots have found uses in a wide variety of industries due to their durability and precision.
 
Historical Robotics
Many sources attest to the popularity of automatons in ancient and Medieval times. Ancient Greeks and Romans developed simple automatons for use as tools, toys, and as part of religious ceremonies. Long before modern industrial robots, the Greek god Hephaestus was supposed to have built automatons to work for him in a workshop. Unfortunately, none of these early automatons are extant.
In the Middle Ages, in both Europe and the Middle East, automatons were popular as part of clocks and religious worship. The Arab polymath Al-Jazari (1136-1206) left texts describing and illustrating his various mechanical devices, including a large elephant clock that moved and sounded at the hour, a musical robot band and a waitress automaton that served drinks. In Europe, there is an automaton monk extant that kisses the cross in its hands. Many other automata were created that showed moving animals and humanoid figures that operated on simple cam systems, but in the 18th century, automata were understood well enough and technology advanced to the point where much more complex pieces could be made. French engineer Jacques de Vaucanson is credited with creating the first successful biomechanical automaton, a human figure that plays a flute. Automata were so popular that they traveled Europe entertaining heads of state such as Frederick the Great and Napoleon Bonaparte.
 
Victorian Robots
 
The Industrial Revolution and the increased focus on mathematics, engineering and science in England in the Victorian age added to the momentum towards actual robotics. Charles Babbage (1791-1871) worked to develop the foundations of computer science in the early-to-mid nineteenth century, his most successful projects being the difference engine and the analytical engine. Although never completed due to lack of funds, these two machines laid out the basics for mechanical calculations. Others such as Ada Lovelace recognized the future possibility of computers creating images or playing music.
Automata continued to provide entertainment during the 19th century, but coterminous with this period was the development of steam-powered machines and engines that helped to make manufacturing much more efficient and quick. Factories began to employ machines to either increase work loads or precision in the production of many products. 

The Twentieth Century to Today

 In 1920, Karel Capek published his play R.U.R. (Rossum’s Universal Robots), which introduced the word “robot.” It was taken from an old Slavic word that meant something akin to “monotonous or forced labor.” However, it was thirty years before the first industrial robot went to work. In the 1950s, George Devol designed the Unimate, a robotic arm device that transported die castings in a General Motors plant in New Jersey, which started work in 1961. Unimation, the company Devol founded with robotic entrepreneur Joseph Engelberger, was the first robot manufacturing company. The robot was originally seen as a curiosity, to the extent that it even appeared on The Tonight Show in 1966. Soon, robotics began to develop into another tool in the industrial manufacturing arsenal.
 
Robotics became a burgeoning science and more money was invested. Robots spread to Japan, South Korea and many parts of Europe over the last half century, to the extent that projections for the 2011 population of industrial robots are around 1.2 million. Additionally, robots have found a place in other spheres, as toys and entertainment, military weapons, search and rescue assistants, and many other jobs. Essentially, as programming and technology improve, robots find their way into many jobs that in the past have been too dangerous, dull or impossible for humans to achieve. Indeed, robots are being launched into space to complete the next stages of extraterrestrial and extrasolar research.

What is computer programming?







Aren't Programmers Just Nerds?:
Programming is a creative process done by programmers to instruct a computer on how to do a task. Hollywood has helped instill an image of programmers as uber-techies who can sit down at a computer and break any password in seconds or make highly tuned warp engines improve performance by 500% with just one tweak. Sadly, the reality is far less interesting!
  • Definition of a Program
  • What is a Programming Language?
  • What is Software?
So Programming Is Boring? No!:
Computers can be programmed to do interesting things. In the UK, a system has been running for several years that reads car number plates. A camera captures an image of the car, the image is processed instantly to extract the number plate details, those details are run through a national vehicle registration database, and any alerts for that vehicle (a stolen-vehicle report, for example) are flagged up within four seconds.
With the right attachments, a computer could be programmed to perform dentistry. Testing that would be interesting and might be a bit scary!
Two Different Types Of Software:
Older computers, generally those with black-and-white displays and no mouse, tend to run console applications. There are still plenty of these about; they are very popular for rapid data entry.
The other type of application requires a mouse; these are known as GUI programs, and writing them is called event-driven programming. They are seen on Windows PCs, Linux PCs and Apple Macs. Programming these applications is a bit harder than programming console applications, but newer programming languages such as those below have simplified it:
  • Visual Basic
  • Delphi
  • C#
What Do Programs Do?:
Fundamentally, programs manipulate numbers and text. These are the building blocks of all programs. Programming languages let you use them in different ways, e.g. adding numbers or storing data on disk for later retrieval.
These numbers and text are called variables and can be handled singly or in structured collections. In C++, a variable can be used to count things, or a struct variable can hold payroll details for an employee (see the sketch after this list), such as:
  • Name
  • Salary
  • Company Id Number
  • Total Tax Paid
  • SSN
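To make that concrete, here is a minimal C++ sketch of such a record; the struct and field names are illustrative, not taken from any real payroll system.

#include <string>

// Illustrative only: a plain counting variable and a struct that groups
// the payroll fields listed above into a single record.
struct EmployeeRecord {
    std::string name;          // Name
    double      salary;        // Salary
    int         companyId;     // Company Id Number
    double      totalTaxPaid;  // Total Tax Paid
    std::string ssn;           // SSN
};

int main() {
    int counter = 0;                 // a variable used simply to count
    ++counter;

    EmployeeRecord employee{"Jane Doe", 52000.0, 1042, 9400.0, "000-00-0000"};
    employee.totalTaxPaid += 350.0;  // update one field of the structured record
    return 0;
}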
A database can hold millions of these records and fetch them very rapidly.
Programs Must Be Written For An Operating System:
Programs don't exist by themselves but need an operating system, unless they are the operating system!
  • Win32
  • Linux
  • Mac
Before Java, programs needed rewriting for each operating system. A program that ran on a Linux box could not run on a Windows box or a Mac. With Java it is now far easier to write a program once and then run it everywhere, because it is compiled to a common code called bytecode, which is then interpreted. Each operating system has a Java interpreter written for it, called a Java Virtual Machine (JVM), that knows how to interpret bytecode. C# has something similar.
Programs Use Operating Systems Code:
Unless you're selling software and want to run it on every different operating system, you are more likely to need to modify it for new versions of the same operating system. Programs use features provided by the operating system and if those change then the program must change or it will break.
Many applications written for Windows 2000 or XP use the Local Machine part of the registry. Under Windows Vista this will cause problems, and Microsoft is advising people to rewrite code affected by this. Microsoft has done this to make Vista more secure.
Computers Can Talk To Other Computers:
When connected in a network, they can even run programs on each other or transfer data via ports. Programs you write can also do this. This makes programming a little harder as you have to cope with situations like
  • When a network cable is pulled out.
  • Another networked PC is switched off.
Some advanced programming languages let you write programs that run their parts on different computers. This only works if the problem can use parallelism. Some problems cannot be divided this way:
  • Nine women cannot produce one child between them in just one month!
Programming Peripherals attached to your Computer:
If you have a peripheral, say a computer controlled video camera, it will come with a cable that hooks it up to the PC and some interfacing software to control it. It may also come with
  • an API (application programming interface)
  • an SDK (software development kit)
that lets you write software to control it. You could then program the camera to switch on and record during the hours when you are out of the house. If your PC can read sound levels from the microphone, then you might write code that starts the camera recording when the sound level is above a limit you specify. Many peripherals can be programmed like this.
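As a rough illustration of that idea in C++: the camera_sdk.h header and every Camera and Microphone call below are invented for this sketch; a real peripheral would ship its own API with its own names.

// Hypothetical sketch only: "camera_sdk.h", Camera and Microphone stand in for
// whatever API or SDK ships with your particular camera and sound hardware.
#include "camera_sdk.h"
#include <chrono>
#include <thread>

int main() {
    Camera camera;                  // hypothetical handle to the attached camera
    Microphone mic;                 // hypothetical handle to the sound input
    const double threshold = 0.25;  // assumed sound level (0.0 to 1.0) that triggers recording

    while (true) {
        double level = mic.readLevel();               // poll the current sound level
        if (level > threshold && !camera.isRecording()) {
            camera.startRecording();                  // noise detected: start capturing video
        } else if (level <= threshold && camera.isRecording()) {
            camera.stopRecording();                   // quiet again: stop and save the clip
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(100));  // check ten times a second
    }
}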
Games Are Just Programs:
Games on PCs use special libraries:
  • DirectX
  • XNA
  • SDL
These let games write to the display hardware very rapidly. Game screens update 60 or more times per second, so 3D games software has to move everything in 3D space, detect collisions and so on, then render the 3D view onto a flat surface (the screen!) 60 times each second. That's a very short period of time, but video card hardware now does an increasing amount of the rendering work. GPU chips are optimized for fast rendering and can do these operations up to 10x faster than a CPU can, even with the fastest software.
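As a sketch of what that per-frame loop looks like, here is a minimal SDL2 program (assuming the SDL2 library is installed); a real game would move its objects, detect collisions and draw the scene where the comment indicates.

#include <SDL.h>

int main() {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window* window = SDL_CreateWindow("Game loop sketch",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    // Ask for hardware acceleration and vsync so the loop is paced by the display (~60 Hz).
    SDL_Renderer* renderer = SDL_CreateRenderer(window, -1,
        SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);

    bool running = true;
    while (running) {
        SDL_Event event;
        while (SDL_PollEvent(&event)) {
            if (event.type == SDL_QUIT) running = false;  // window closed
        }
        SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
        SDL_RenderClear(renderer);
        // ... move objects, detect collisions and draw the scene here ...
        SDL_RenderPresent(renderer);  // show the finished frame
    }

    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}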
Conclusion:
Many programmers write software as a creative outlet. The web is full of websites with source code developed by amateur programmers who did it for the heck of it and are happy to share their code. Linux started this way when Linus Torvalds shared code that he had written.
The intellectual effort in writing a medium sized program is probably comparable to writing a book, except you never need to debug a book! There is a joy to finding out new ways to make something happen, or solving a particularly thorny problem. If your programming skills are good enough then you could get a full-time job as a programmer.



What is TCP/IP and How Does It Make the Internet Work?

TCP/IP – A Brief Explanation
The Internet works by using a protocol called TCP/IP, or Transmission Control Protocol/Internet Protocol. TCP/IP is the underlying communication language of the Internet. In basic terms, TCP/IP allows one computer to talk to another computer via the Internet by compiling packets of data and sending them to the right location. For those who don’t know, a packet, sometimes more formally referred to as a network packet, is a unit of data transmitted from one location to another. Much as the atom is a basic building block of matter, a packet is the smallest unit of information transmitted over the Internet.
Defining TCP
As indicated in the name, there are two layers to TCP/IP. The top layer, TCP, is responsible for taking large amounts of data, compiling it into packets and sending them on their way to be received by a fellow TCP layer, which turns the packets into useful information/data.
Defining IP
The bottom layer, IP, is the locational aspect of the pair, ensuring that packets of information are sent to and received at the correct location. If you think about IP in terms of a map, the IP layer serves as the packet’s GPS for finding the correct destination. Much like a car driving on a highway, each packet passes through gateway computers (signs on the road) that forward the packets toward the right destination.
In summary, TCP is the data. IP is the Internet location GPS.
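To make the two halves concrete, here is a minimal C++ sketch using the standard POSIX socket calls available on Linux and macOS: it resolves a host name to an address, opens a TCP connection, sends a request and prints whatever comes back. example.com is just a placeholder host.

#include <cstdio>
#include <cstring>
#include <netdb.h>
#include <sys/socket.h>
#include <unistd.h>

int main() {
    // The "IP" half: resolve the host name to an address packets can be routed to.
    addrinfo hints{};
    hints.ai_family = AF_UNSPEC;      // IPv4 or IPv6
    hints.ai_socktype = SOCK_STREAM;  // a TCP stream
    addrinfo* result = nullptr;
    if (getaddrinfo("example.com", "80", &hints, &result) != 0) {
        std::fprintf(stderr, "could not resolve host\n");
        return 1;
    }

    // The "TCP" half: open a connection; the operating system splits the data
    // into packets and reassembles them, in order, at the other end.
    int fd = socket(result->ai_family, result->ai_socktype, result->ai_protocol);
    if (fd < 0 || connect(fd, result->ai_addr, result->ai_addrlen) != 0) {
        std::fprintf(stderr, "connection failed\n");
        return 1;
    }

    const char* request =
        "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n";
    send(fd, request, std::strlen(request), 0);

    char buffer[4096];
    ssize_t n;
    while ((n = recv(fd, buffer, sizeof(buffer), 0)) > 0) {
        std::fwrite(buffer, 1, static_cast<size_t>(n), stdout);  // print the reply
    }

    close(fd);
    freeaddrinfo(result);
    return 0;
}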
That is how the Internet works on the surface. Let’s take a look below the surface at the abstraction layers of the Internet.
The Four Abstraction Layers Embedded in TCP/IP
The four abstraction layers are the link layer (lowest layer), the Internet layer, the transport layer and the application layer (top layer).
They work in the following fashion:
  1. The Link Layer is the physical network equipment used to interconnect nodes and servers.
  2. The Internet Layer connects hosts to one another across networks.
  3. The Transport Layer resolves all host-to-host communication.
  4. The Application Layer is utilized to ensure communication between applications on a network.
In English, the four abstraction layers embedded in TCP/IP allow packets of data, application programs and physical network equipment to communicate with one another over the Internet to ensure packets are sent intact and to the correct location.
Why TCP/IP Matters
Now that you know the base definition of TCP/IP and how the Internet works, we need to discuss why all of this matters.
The Internet is About Communication and Access
The common joke about the Internet is that it is a series of tubes where data is sent and received at different locations. The analogy isn't bad. However, it isn't complete.
The Internet is more like a series of tubes with various connection points, various transmission points, various send/receive points, various working speeds and a governing body watching over the entire process.
To understand why TCP/IP is needed, here’s a quick example.
I live in Gainesville, Florida. However, because I once lived in Auckland, New Zealand, for an extended period of time, I enjoy checking the local New Zealand news on a weekly basis.
To do this, I read The New Zealand Herald by visiting www.nzherald.co.nz. As you might have guessed from the URL, The New Zealand Herald is digitally based in New Zealand (i.e. the other side of the world from Gainesville).
The Number of Hops Needed for Packets to Be Transmitted
For the connection to be made from my computer located in Gainesville to a server hosting The New Zealand Herald based in New Zealand, packets of data have to be sent to multiple data centers through multiple gateways and through multiple verification channels to ensure my request finds the right destination.
The common Internet parlance for this is finding out how many hops it takes for one packet of information to be sent to another location.
Running a trace route (the traceroute command on Linux and macOS, or tracert on Windows) can show you the number of hops along the way. If you are wondering, there are 17 hops between my location in Gainesville and the server hosting The New Zealand Herald website.
TCP/IP is needed to ensure that information reaches its intended destination. Without TCP/IP, packets of information would never arrive where they need to be and the Internet wouldn’t be the pool of useful information that we know it to be today.