Monday, 16 January 2012

  1. NETWORK COMPUTER

Networks are collections of computers, software, and hardware that are all connected to help their users work together. A network connects computers by means of cabling systems, specialized software, and devices that manage data traffic. A network enables users to share files and resources, such as printers, as well as send messages electronically (e-mail) to each other.
Computer networks fall into two main types: client/server networks and peer-to-peer networks. A client/server network uses one or more dedicated machines (the server) to share the files, printers, and applications. A peer-to-peer network allows any user to share files with any other user and doesn’t require a central, dedicated server.
The most common networks are Local Area Networks or LANs for short. A LAN connects computers within a single geographical location, such as one office building, office suite, or home. By contrast, Wide Area Networks (WANs) span different cities or even countries, using phone lines or satellite links.
Networks are often categorized in other ways, too. You can refer to a network by what sort of circuit boards the computers use to link to each other – Ethernet and Token-Ring are the most popular choices. You can also refer to a network by how it packages data for transmission across the cable, with terms such as TCP/IP (Transmission Control Protocol/Internet Protocol) and IPX/SPX (Internetwork Packet eXchange/Sequenced Packet eXchange).
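The client/server idea above can be sketched in a few lines of code. This is a minimal illustration using TCP/IP sockets from the Python standard library: one side listens and answers (the server), the other side connects and asks (the client). The message and the loopback address are arbitrary choices for the example.

```python
# Minimal client/server sketch over TCP/IP (Python standard library).
import socket
import threading

# The server creates a listening socket; port 0 lets the OS pick a free port.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]

def serve_once():
    """Accept one client and echo its message back."""
    conn, _addr = srv.accept()
    with conn:
        data = conn.recv(1024)          # receive the client's bytes
        conn.sendall(b"echo: " + data)  # send a reply back

# Run the server in a background thread, then connect to it as a client.
t = threading.Thread(target=serve_once)
t.start()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", port))
    cli.sendall(b"hello")
    reply = cli.recv(1024)

t.join()
srv.close()
print(reply.decode())  # echo: hello
```

In a peer-to-peer network, by contrast, every machine runs both halves of this exchange, so any computer can answer requests from any other.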


OPERATING SYSTEMS VERSUS APPLICATION SOFTWARE

An operating system (OS) is a set of programs that manage computer hardware resources and provide common services for application software. The operating system is the most important type of system software in a computer system. A user cannot run an application program on the computer without an operating system, unless the application program is self-booting.
Time-sharing operating systems schedule tasks for efficient use of the system and may also include accounting for cost allocation of processor time, mass storage, printing, and other resources.
For hardware functions such as input and output and memory allocation, the operating system acts as an intermediary between application programs and the computer hardware, although the application code is usually executed directly by the hardware and will frequently call the OS or be interrupted by it. Operating systems are found on almost any device that contains a computer—from cellular phones and video game consoles to supercomputers and web servers.
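The "intermediary" role can be seen directly from a program. In the sketch below, Python's os module functions map closely onto the operating system's own system calls (open, write, read): the program never touches the disk hardware itself, it only asks the OS to do so. The file name is a hypothetical choice for the example.

```python
# A program never writes to the disk directly; it asks the OS via system calls.
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "os_demo.txt")  # hypothetical file

# os.open / os.write / os.close correspond to OS-level system calls;
# the kernel performs the actual device I/O on the program's behalf.
fd = os.open(path, os.O_CREAT | os.O_WRONLY | os.O_TRUNC)
os.write(fd, b"hello from the OS")
os.close(fd)

# Reading goes through the OS in exactly the same way.
fd = os.open(path, os.O_RDONLY)
data = os.read(fd, 100)
os.close(fd)

os.remove(path)  # ask the OS to delete the file again
print(data.decode())  # hello from the OS
```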

APPLICATION SOFTWARE
(1) Application software is the general designation of computer programs for performing user tasks. Application software may be general purpose (word processing, web browsers, ...) or have a specific purpose (accounting, truck scheduling, ...). Application software contrasts with (2) system software, a generic term referring to the computer programs used to start and run computer systems and networks; and (3) programming tools, such as compilers and linkers, used to translate and combine computer program source code and libraries into executable programs (programs that will belong to one of the three said categories).
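The job of a programming tool can be shown in miniature. Python's built-in compile() plays the role of a very small compiler: it translates program source text into an executable code object, which can then be loaded into a namespace and run. The source string here is an invented example.

```python
# A miniature "compile and run" cycle using Python's built-in compiler.
source = "def add(a, b):\n    return a + b\n"   # hypothetical program text

# Translate the source text into an executable code object.
code = compile(source, filename="<example>", mode="exec")

# Load the compiled code into a namespace, roughly like linking and loading.
namespace = {}
exec(code, namespace)

print(namespace["add"](2, 3))  # 5
```

A real compiler such as gcc does the same thing on a larger scale, producing machine code instead of Python code objects.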

WORD PROCESSOR
FUNCTIONS = used for working with documents: composing, editing, formatting, and printing
EXAMPLES = Microsoft Word 2007, Notepad

SPREADSHEET
FUNCTIONS = frequently used for financial information because of their ability to calculate data automatically
EXAMPLES = Microsoft Excel 2007, VisiCalc

PRESENTATION
FUNCTIONS = used for presenting information to an audience
EXAMPLES = Microsoft PowerPoint 2007, Apple Keynote, Corel Presentations

WEB BROWSER
FUNCTIONS = used for retrieving, presenting, and traversing information resources on the World Wide Web
EXAMPLES = Internet Explorer 9, Mozilla Firefox, Google Chrome
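The first step a browser takes when retrieving a resource can be sketched without any network access: parse the URL into its parts, then build the HTTP request it would send to the server. The URL below is an illustrative example.

```python
# What a browser does first: parse the URL, then form an HTTP request.
from urllib.parse import urlparse

url = "http://www.example.com/index.html"  # illustrative URL
parts = urlparse(url)

# parts.scheme tells the browser which protocol to use (here, HTTP),
# parts.netloc which server to contact, and parts.path which resource to ask for.
request = f"GET {parts.path} HTTP/1.1\r\nHost: {parts.netloc}\r\n\r\n"

print(parts.scheme, parts.netloc, parts.path)
print(request)
```

The browser then opens a TCP connection to the server, sends this request, and renders whatever document comes back.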

COMPUTER GENERATIONS

First Generation (1940-1956) Vacuum Tubes

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.

Second Generation (1956-1963) Transistors

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
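The move from binary machine language to symbolic assembly can be illustrated with a toy assembler: it translates human-readable mnemonics into the numeric machine words a processor would execute. The mnemonics and opcode numbers below are invented for illustration and do not belong to any real CPU.

```python
# Toy assembler: translate symbolic mnemonics into numeric machine words.
# Opcode values are invented for this sketch, not taken from a real machine.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(program):
    """Turn a list of (mnemonic, operand) pairs into 16-bit machine words."""
    words = []
    for mnemonic, operand in program:
        # Pack the opcode into the high byte and the operand into the low byte.
        words.append((OPCODES[mnemonic] << 8) | operand)
    return words

# "Load memory cell 10, add cell 11, store the result in cell 12, stop."
prog = [("LOAD", 10), ("ADD", 11), ("STORE", 12), ("HALT", 0)]
machine_code = assemble(prog)
print([hex(w) for w in machine_code])  # ['0x10a', '0x20b', '0x30c', '0xff00']
```

A first-generation programmer had to write the numeric words on the right directly; an assembler lets the programmer write the words on the left instead.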
The first computers of this generation were developed for the atomic energy industry.

Third Generation (1964-1971) Integrated Circuits

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors, and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Fourth Generation (1971-Present) Microprocessors

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.

Fifth Generation (Present and Beyond) Artificial Intelligence

Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.