Sunday, October 10, 2010

Introduction to ICT (History)

IT stands for Information Technology, which consists of the study, design, development, implementation, support, and administration of computer-based information systems, particularly software applications and computer hardware. Information technology uses electronic computers and computer software to convert, store, protect, process, transmit, and retrieve information. 
Information technology has expanded to cover many aspects of computing and technology, and the term is more familiar than ever before. The field of information technology is quite broad, encompassing many areas of study. IT professionals perform a wide range of duties, from installing applications to designing complex computer networks.


An IT professional's responsibilities include data management, networking, database and software design, computer hardware, and the management and administration of entire systems. IT (Information Technology) is a blend of computing and communications, sometimes shortened to "InfoTech". Information technology describes any technology that helps to produce, manipulate, store, communicate, or disseminate information.


Recently it has become popular to broaden the term to explicitly include the field of electronic communication, so people increasingly use the abbreviation ICT (Information and Communications Technology).


The term "information technology" evolved in the 1970s. Its basic concept, however, can be traced to the World War II alliance of the military and industry in the development of electronics, computers, and information theory. After the 1940s, the military remained the major source of research and development funding for the expansion of automation to replace manpower with machine power. 
Since the 1950s, four generations of computers have evolved. Each generation brought hardware of decreased size but increased capability to control computer operations. The first generation used vacuum tubes, the second used transistors, the third used integrated circuits, and the fourth used integrated circuits on a single computer chip. Advances in artificial intelligence that will minimize the need for complex programming characterize the fifth generation of computers, still in the experimental stage.


The first commercial computer was the UNIVAC I, developed by J. Presper Eckert and John W. Mauchly and delivered to the Census Bureau in 1951; a UNIVAC famously predicted the outcome of the 1952 presidential election. For the next twenty-five years, mainframe computers were used in large corporations to do calculations and manipulate large amounts of information stored in databases. Supercomputers were used in science and engineering, for designing aircraft and nuclear reactors, and for predicting worldwide weather patterns. Minicomputers came onto the scene in the mid-1960s and spread through the 1970s into small businesses, manufacturing plants, and factories.


In 1975, MITS introduced the Altair 8800, one of the first microcomputers. Tandy Corporation's first Radio Shack microcomputer, the TRS-80, followed in 1977, the same year the Apple II was introduced. The market for microcomputers increased dramatically when IBM introduced its first personal computer in the fall of 1981. Because of dramatic improvements in computer components and manufacturing, personal computers today do more than the largest computers of the mid-1960s at about a thousandth of the cost. 
Computers today are divided into four categories by size, cost, and processing ability: supercomputers, mainframes, minicomputers, and microcomputers, the last more commonly known as personal computers. Personal computer categories include desktop, network, laptop, and handheld. 
