Application of Computer in Economics Course: DE-403(ii) with Dr. Sanatan Nayak


Computer applications in economics are essential for data management and analysis. The course covers definitions, features, basic computer organization, and evolution. It explores the characteristics of computers such as high speed, accuracy, diligence, versatility, and memory power. The evolution of computers from the abacus to modern digital technology is discussed, emphasizing the importance of input, storage, processing, output, and control in computer organization.





Presentation Transcript


  1. Application of Computer in Economics Course: DE-403(ii) Course teacher: Dr. Sanatan Nayak, Dept. of Economics, B.B. Ambedkar University, Rae Bareli Road, Lucknow-25

  2. Contents of Introduction: definitions; features or characteristics; basic computer organization/components; evolution.

  3. Definitions The word computer is derived from the word compute, which means to calculate at high speed. The original objective was to create a fast calculating machine. Nowadays, about 80% of computer data are non-mathematical: computers are used to handle information and data such as bio-data, railway tickets, air tickets and government databases. What the computer does: it stores the data, processes the data, and retrieves the data (hence it is a data processor).
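
As an illustration of the store-process-retrieve cycle described above, here is a minimal Python sketch; the in-memory `records` store, the ticket keys and the fare field are invented for the example:

```python
# A minimal sketch of the store -> process -> retrieve cycle.
# The `records` store and the ticket fields are illustrative only.
records = {}

def store(key, value):
    records[key] = value              # store the data

def process_total_fare():
    # process the data: total fare across all stored tickets
    return sum(r["fare"] for r in records.values())

def retrieve(key):
    return records[key]               # retrieve the data

store("ticket-1", {"passenger": "A", "fare": 450})
store("ticket-2", {"passenger": "B", "fare": 1200})
print(process_total_fare())           # 1650
print(retrieve("ticket-2"))           # {'passenger': 'B', 'fare': 1200}
```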

  4. Characteristics of Computer High speed: milliseconds (1/1,000 of a second), microseconds (1/1,000,000), nanoseconds (1/1,000,000,000), picoseconds (1/1,000,000,000,000). Accuracy: errors occur due to human rather than technological weakness. Diligence: it is free of monotony, tiredness and lack of concentration. Versatility: it can do different types of work. Power of remembering: it can store and recall any amount of information. No I.Q.: it has no intelligence of its own. No feelings: no heart, no taste, no knowledge and no experience.
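
To make the time units concrete, a small illustrative Python sketch converts each fraction of a second into operations per second:

```python
# The fractions of a second named on the slide, written out explicitly.
units = {
    "millisecond": 1e-3,    # 1/1,000 of a second
    "microsecond": 1e-6,    # 1/1,000,000
    "nanosecond":  1e-9,    # 1/1,000,000,000
    "picosecond":  1e-12,   # 1/1,000,000,000,000
}
for name, seconds in units.items():
    print(f"one operation per {name} = {1 / seconds:,.0f} operations/second")
```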

  5. Evolution of Computer Necessity is the mother of invention. The earliest device that qualifies is the abacus, or soroban, invented around 600 B.C.; it performs only addition and subtraction, at little speed. Manual calculating device: John Napier's card board, 17th century, updated in 1890 AD. The first mechanical machine was built by Blaise Pascal in 1642 AD. Baron Gottfried (Leibniz): Germany's first calculator for multiplication. The keyboard originated in 1880 AD in the USA. Herman Hollerith: punched cards, extensively used as input media in early modern digital computers.

  6. Basic Computer Organization Five important operations: 1. inputting, 2. storing, 3. processing, 4. outputting, 5. controlling. Therefore, there are five important functional units or blocks. 1. Input Unit: data and instructions must be given to the computer through an outside device, e.g. the keyboard. All data and instructions are transformed into binary codes (an acceptable form) and saved in primary memory. The input unit supplies the converted instructions and data to the computer system for further processing.
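
As an illustration of the conversion the input unit performs (real input units use hardware encoders; the 8-bit ASCII encoding here is an assumption for the example), a small Python sketch encodes keyboard characters into binary codes:

```python
# Illustrative: turn keyboard input into binary codes, roughly what
# the input unit does before data is saved in primary memory.
text = "GDP"
codes = [format(ord(ch), "08b") for ch in text]   # 8-bit ASCII codes
print(codes)    # ['01000111', '01000100', '01010000']
```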

  7. Basic Computer Organization contd. 2. Output Unit: it is the reverse of the input unit. It accepts the results produced by the computer, which are in coded form and cannot be easily understood by us, and converts them from binary form into a human-acceptable form. It delivers results to the external environment through devices such as printers; in other words, it supplies the information and results of computation to the outside world. 3. Storage Unit: it holds all the data and instructions to be processed (received from the input device); it stores the intermediate results of processing; and it holds the final results of processing before they are released to an output device.

  8. Basic Computer Organization contd. 4. Arithmetic Logic Unit: it is the place where the actual execution of instructions takes place. All calculations are performed and all decisions are made in the ALU. All data and instructions stored in primary storage prior to processing are transferred to the ALU as and when needed. Intermediate results generated in the ALU are temporarily transferred back to primary storage. All ALUs are designed to perform the four basic arithmetic operations (+, -, ×, /) and the logic operations (=, >, <).
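
A toy Python model of the operations listed above (illustrative only; this is a dispatch table, not how ALU hardware is built):

```python
# Toy ALU: dispatches the four arithmetic operations and the
# comparison (logic) operations named on the slide.
def alu(op, a, b):
    ops = {
        "+": lambda x, y: x + y,
        "-": lambda x, y: x - y,
        "*": lambda x, y: x * y,
        "/": lambda x, y: x / y,   # assumes y != 0
        "=": lambda x, y: x == y,
        ">": lambda x, y: x > y,
        "<": lambda x, y: x < y,
    }
    return ops[op](a, b)

print(alu("+", 6, 2))   # 8
print(alu(">", 6, 2))   # True
```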

  9. Basic Computer Organization contd. 5. Control Unit: it is the central nervous system of the computer. It obtains instructions from the programme stored in main memory, interprets the instructions, and issues signals that cause the other units of the system to execute them. It manages the selection, interpretation and execution of instructions. Central Processing Unit (CPU): CU + ALU = CPU.
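
To make the CU + ALU division concrete, here is a toy fetch-interpret-execute loop in Python; the instruction format is invented for this sketch and is not how real hardware works:

```python
# Toy CPU: the loop plays the control unit, fetching each instruction
# from "main memory" (the program list), interpreting its opcode, and
# signalling the ALU step that executes it. Instruction set invented.
program = [("LOAD", 10), ("ADD", 5), ("MUL", 3), ("PRINT", None)]
accumulator = 0

for opcode, operand in program:       # control unit: fetch + interpret
    if opcode == "LOAD":
        accumulator = operand
    elif opcode == "ADD":             # ALU: arithmetic execution
        accumulator += operand
    elif opcode == "MUL":
        accumulator *= operand
    elif opcode == "PRINT":
        print(accumulator)            # prints 45
```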

  10. References P.K. Sinha (latest edition), Computer Fundamentals, BPB Publications, New Delhi.

  11. Goals of the chapter This chapter deals with: the various generations of computers; the types of computers.

  12. Generations of Computers Classification into generations is based on: the development of hardware in computers; the development of software and its applications.

  13. First Generation (FG) of Computers The first large electronic computer, completed in 1946 in the USA, was called the ENIAC (Electronic Numerical Integrator and Calculator). a. It was the first all-electronic computer. b. It was designed by a team led by Eckert and Mauchly at the University of Pennsylvania, USA. c. It was operated by a wiring board and used high-speed vacuum-tube switching devices. d. It had a very small memory and was designed primarily to calculate the trajectories of missiles. e. ENIAC took about 200 microseconds for addition and 2800 microseconds for multiplication.

  14. EDSAC (Electronic Delay Storage Automatic Calculator) A major breakthrough came with the stored-program concept proposed by John von Neumann in 1946: the machine instructions are stored in the memory of the computer along with the data. The first computer using this principle was designed and commissioned at Cambridge by Maurice Wilkes. Called the EDSAC, it was completed in 1949 and used mercury delay lines for storage.

  15. UNIVAC This was the first commercially produced stored-program electronic computer. It was built by the Univac division of Remington Rand and delivered in 1951. It used vacuum tubes: each tube had a limited life and consumed half a watt of power, and the machine used ten thousand tubes. Languages during this period: computer programming was done in machine language; assembly languages appeared in the early 50s. Computer applications were mainly in science and engineering. The FG was essentially about hardware, with little software development.

  16. The Second Generation The invention of the transistor by Bardeen, Brattain and Shockley in 1947 was a big revolution. Transistors, made of germanium semiconductor material, are more reliable than tubes: there is no filament to burn. They occupy less space and consume only one tenth of the power. They also switch from one state to another in about one tenth of the time needed by tubes. Thus switching circuits for computers made with transistors were about ten times more reliable, ten times faster, occupied about one tenth of the space, and were cheaper. Computers thus changed from tubes to transistors. This generation lasted till 1965.

  17. SG contd. Another major invention was magnetic core storage. Magnetic cores are tiny rings (0.05 cm diameter) made of ferrite that can be magnetized in either the clockwise or the anticlockwise direction. Magnetic cores were used to construct large random access memories; memory capacity in the SG was about 100 KB. Magnetic disk storage was also developed during this period. Thanks to the larger memories, high-level languages (FORTRAN, COBOL, ALGOL, SNOBOL) were developed. With higher CPU speeds and disk storage, operating systems were developed; good batch operating systems, particularly for the 7000 series computers, emerged during the SG.

  18. SG contd. The rapid development of computers was driven by business and industry (80% of applications). A number of operations research applications such as linear programming, the critical path method (CPM) and simulation were run on computers. New computing professions, such as systems analysts and programmers, emerged during the second generation. Academic programmes in computer science were also initiated.

  19. The Third Generation (TG) The TG began in 1965, with germanium transistors replaced by silicon transistors. Integrated circuits emerged: circuits consisting of transistors, resistors and capacitors grown on a single chip of silicon, eliminating the wired interconnections between components. Small-scale circuits gave way to medium-scale circuits of 100 transistors per chip. The switching speed of transistors went up by a factor of 10, reliability increased by a factor of 10, power dissipation fell by a factor of 10, and size also reduced by a factor of 10. Powerful CPUs with a capacity of 1 million instructions per second appeared.

  20. (TG) contd. There were significant improvements in the design of magnetic core memories, and the size of main memories reached about 4 MB. Magnetic disk technology improved rapidly; 100 MB drives became feasible. Time-shared operating systems were developed (enabled by the combination of high-capacity memory, powerful CPUs and large disk memories). Many important online systems became feasible: dynamic production control systems, airline reservation, interactive query systems and real-time closed-loop process control systems were developed. Integrated database management systems were also developed.

  21. (TG) contd. High-level languages developed further: FORTRAN IV and optimizing FORTRAN compilers were written, and COBOL 68 was standardized by the American National Standards Institute. The TG ended by 1975, with no revolutionary new concepts marking its close.

  22. The Fourth Generation (FG) First Decade (1976-85) It is identified by the advent of the microprocessor chip. Medium-scale integrated circuits yielded to Large and Very Large Scale Integrated (VLSI) circuits packing about 50,000 transistors on a chip. Semiconductor memory sizes of 16 MB with a cycle time of 200 nsecs were in common use. The emergence of the microprocessor led to development in two directions, one being the extremely powerful PC.

  23. FG contd. A major impact on the history of computing came from the development of the IBM PC and its Operating System (OS): MS-DOS (Microsoft Disk OS) and CP/M (Control Program for Microcomputers). Many small companies made PCs conforming to IBM's architecture. Word processors, spreadsheets and database management applications spread.

  24. FG contd. Decentralisation of computer organisation: networks of computers and distributed computer systems were developed. Disk memories became very large (1000 MB). Concurrent programming languages such as ADA appeared, along with interactive graphic devices and language interfaces to graphics systems. The UNIX OS spread, and OSs became user friendly and highly reliable.

  25. Second Phase (1986-2000) of FG The speed of microprocessors and the sizes of main memory and hard disk went up by a factor of 4 roughly every 3 years. Many CPU features of the 1st decade of the FG became part of microprocessor architecture in the 2nd decade. The mainframe computers of the early 80s died out in the 90s. A microprocessor chip designed by DEC in 1994 packed 9.3 million transistors in a single chip and could carry out one billion operations per second (300 MHz clock). Apart from IBM, Apple Computer and Motorola designed a processor called the PowerPC 600 series. Intel designed powerful chips called Pentium (1993), followed by the Pentium with MMX (Multi-Media Extension), the Pentium II, and the Celeron processor with a 300 MHz clock. Intel then introduced a 64-bit processor called IA-64 or Itanium.

  26. Second Phase (1986-2000) of FG The area of disk storage also saw vast improvement: 1 GB of disk on a workstation became common in 1994. Optical disks emerged as mass storage for read-only files; a new optical disk known as the Digital Versatile Disk ROM (DVD-ROM), with a storage capacity of 17 GB, appeared in 1998, and writable CDs were developed during the same period. Local Area Networks could transmit 100 MB/sec to 1 GB/sec. There was a rapid increase in the number of computers connected to the internet, and the introduction of the WWW eased information retrieval. An object-oriented language called Java was designed for the internet. C became popular, and C++ emerged as the most popular. PROLOG was designed as a logic-oriented specification language; HASKELL and FP as functional specification-oriented languages.

  27. Comparative Chart of Generations
Generation | Years | Switching devices | Storage devices | Switching time
1st | 1949-55 | Vacuum tubes | 1 KB memory | 0.1 to 1 millisecond
2nd | 1956-65 | Transistors | 100 KB main memory | 1 to 10 microseconds
3rd | 1966-75 | Integrated circuits | Large disks (100 MB), 1 MB main memory | 0.1 to 1 microsecond
4th, 1st phase | 1975-84 | LSI (large-scale integrated circuits) | 1000 MB disks, 10 MB main memory | 10 to 100 nanoseconds
4th, 2nd phase | 1985-2000 | VLSI (very large-scale integrated circuits) | 100 GB disks, 1 GB main memory | 1 to 10 nanoseconds

  28. Comparative Chart of Generations
Generation | MTBF (mean time between processor failures) | Software | Applications
1st | 30 minutes to 1 hour | Machine language and simple monitor | Science and business
2nd | About 10 hours | FORTRAN, COBOL | Engineering, business, optimisation
3rd | About 100 hours | FORTRAN IV, COBOL 68 | DBMS, online systems
4th, 1st phase | About 1000 hours | FORTRAN 77, Pascal, ADA, COBOL 74 | PCDS, integrated CAD/CAM, real-time control
4th, 2nd phase | About 10,000 hours | C, C++, Java, PROLOG, Haskell, FORTRAN 90/95 | Simulation, visualisation, parallel computing, multimedia

  29. The 5th Generation The 5th generation is radically different from the Von Neumann architecture: specification-oriented programming that incorporates artificial intelligence features, and a changed processor architecture called Very Large Instruction Word (VLIW), in which one instruction is about 128 to 256 bits long and packs several parallel instructions. Any-time, any-place access to data and processing: wireless-enabled processor chips (Intel's Centrino), used in laptops and handheld computers. Demand for multimedia lets users work through a simple graphical user interface and enjoy good quality audio and video on desktop and mobile computers. In short, the 5th generation means wireless-enabled, multimedia, high-performance mobile computers.

  30. 5th Generation contd. Fifth-generation computing devices are based on artificial intelligence. Artificial Intelligence is the branch of computer science concerned with making computers behave like humans; the term was coined in 1956 by John McCarthy at the Massachusetts Institute of Technology. Artificial intelligence includes: Games playing: programming computers to play games such as chess and checkers. Expert systems: programming computers to make decisions in real-life situations (for example, some expert systems help doctors diagnose diseases based on symptoms). Natural language: programming computers to understand natural human languages.

  31. 5th Generation contd. Neural networks: systems that simulate intelligence by attempting to reproduce the types of physical connections that occur in animal brains. Robotics: programming computers to see, hear and react to other sensory stimuli. Voice recognition: computer systems that can recognize spoken words (comprehending human language falls under a different field of computer science called natural language processing). A number of voice recognition systems are available on the market; the most powerful can recognize thousands of words. However, they generally require an extended training session during which the computer system becomes accustomed to a particular voice and accent; such systems are said to be speaker-dependent.

  32. 5th Generation contd. Quantum computation: first proposed in the 1970s, quantum computing relies on quantum physics, taking advantage of certain quantum properties of atoms or nuclei that allow them to work together as quantum bits, or qubits, which serve as the computer's processor and memory. By interacting with each other while isolated from the external environment, qubits can perform certain calculations exponentially faster than conventional computers. Qubits do not rely on the traditional binary nature of computing.

  33. 5th Generation contd. Molecular and nanotechnology: nanotechnology is a field of science whose goal is to control individual atoms and molecules to create computer chips and other devices that are thousands of times smaller than current technologies permit. Current manufacturing processes use lithography to imprint circuits on semiconductor materials. While lithography has improved dramatically over the last two decades (to the point where some manufacturing plants can produce circuits smaller than one micron, i.e. 1,000 nanometers), it still deals with aggregates of millions of atoms. It is widely believed that lithography is quickly approaching its physical limits. To continue reducing the size of semiconductors, new technologies that juggle individual atoms will be necessary; this is the realm of nanotechnology.

  34. 5th Generation contd. Natural language: a natural language means a human language. For example, English, French and Chinese are natural languages; computer languages, such as FORTRAN and C, are not. Probably the single most challenging problem in computer science is to develop computers that can understand natural languages. So far, a complete solution to this problem has proved elusive, although a great deal of progress has been made. Fourth-generation languages are the programming languages closest to natural languages.

  35. 5th Generation contd. Parallel processing and superconductors: the use of parallel processing and superconductors is helping to make artificial intelligence a reality. Parallel processing is the simultaneous use of more than one CPU to execute a program. Ideally, parallel processing makes a program run faster because there are more engines (CPUs) running it. In practice, it is often difficult to divide a program in such a way that separate CPUs can execute different portions without interfering with each other. Most computers have just one CPU, but some models have several; there are even computers with thousands of CPUs. With single-CPU computers it is possible to perform parallel processing by connecting the computers in a network, but this type of parallel processing requires very sophisticated software called distributed processing software. Note that parallel processing differs from multitasking, in which a single CPU executes several programs at once. Parallel processing is also called parallel computing.
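
A minimal sketch of parallel processing using Python's standard multiprocessing module; splitting a sum four ways is an invented example of dividing a program so that separate CPUs can execute different portions:

```python
# Each worker process sums one chunk of the data; the results are
# combined at the end, as in the slide's description of dividing a
# program across CPUs.
from multiprocessing import Pool

def partial_sum(chunk):
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]        # four-way split
    with Pool(processes=4) as pool:                # one process per CPU
        print(sum(pool.map(partial_sum, chunks)))  # equals sum(data)
```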

  36. Moore's Law In 1965, Gordon E. Moore predicted that the density of transistors in integrated circuits would double at a regular interval of about 2 years. Since 1965 his prediction has held true: the number of transistors per integrated circuit chip has approximately doubled every 18 months. In 1974, the largest Dynamic Random Access Memory chip held 16 kbits, whereas in 1998 it held 256 Mbits, an increase of about 16,000 times in just 24 years. In 1984, disk capacity in PCs was around 20 MB, whereas it was 80 GB by 2004, a 4,000-fold increase; now it is around 150 GB. All of this has come without an increase in price. Moore's law implies that for the foreseeable future we will get more powerful computers at lower prices.
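
A quick check of the slide's DRAM numbers in Python, computing the growth factor and the doubling period it implies:

```python
import math

# 16 kbit DRAM (1974) -> 256 Mbit DRAM (1998), as on the slide.
growth = 256e6 / 16e3                 # ~16,000-fold in 24 years
doublings = math.log2(growth)         # ~14 doublings
months_per_doubling = 24 * 12 / doublings
print(f"{growth:,.0f}x growth = {doublings:.1f} doublings, "
      f"one every {months_per_doubling:.0f} months")
```

The implied period of about 21 months sits between the "every 2 years" prediction and the observed 18-month doubling quoted on the slide.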

  37. Classification of computers Traditionally: microcomputers, mainframes, supercomputers. But technology has changed, and all computers now use a microprocessor as their CPU; thus classification is possible only through their mode of use: palms, laptop PCs, desktop PCs, workstations. Based on interconnection characteristics: distributed computers and parallel computers.

  38. Palm PCs/Simputer A palm PC can be held in the palm, thanks to high-density packing of transistors on a chip, and has capabilities nearly those of a PC. It accepts handwritten input using an electronic pen on the palm screen, has small disk storage, can be connected to a wireless network, and has facilities to be used as a mobile phone and for fax and e-mail. A version of the MS OS called Windows CE is available for the palm.

  39. Simputer The Simputer is a mobile handheld computer created for the needs of India's rural population, with input through icons on a touch-sensitive overlay on the LCD display panel. A unique feature of the Simputer is the use of the free, open-source OS GNU/Linux: the cost is low because there is no cost for software. Another unique feature is a smart card reader/writer, which increases the functionality of the Simputer, including the possibility of personalising a single Simputer for several users.

  40. Laptop A laptop is a portable computer weighing around 2 kg. Laptops have a keyboard, a flat-screen liquid crystal display, and a Pentium or PowerPC processor; colour displays are also available. Normally the WINDOWS OS is used. Laptops come with a hard disk (around 20 GB), a CD-ROM drive and a floppy disk drive. They are designed to conserve energy by using power-efficient chips. There is a trend toward wireless connectivity for laptops, so that they can read files from large stationary computers. Laptops are used for word processing and spreadsheet computing.

  41. Personal Computers (PCs) Most PCs are desktop machines. Early PCs had an Intel 8088 microprocessor; the Intel Pentium IV is now the most popular processor. The machines made by IBM are called IBM PCs; they mostly use MS Windows, Windows XP or GNU/Linux as the operating system. Till 2004, PCs had 64 to 256 MB of main memory and 40 to 80 GB of disk, now 160 GB. A 650 MB CD-ROM drive is also provided in PCs for multimedia use. Apple PCs are called Apple Macintosh. IBM PCs are the most popular.

  42. Workstations Workstations are also desktop machines, with processors about 10 times more powerful than those of PCs. Most workstations have a large colour video display unit. Normally they have main memory of around 256 MB to 4 GB and disks of 80 to 320 GB. Workstations normally use a RISC (Reduced Instruction Set Computer) processor such as MIPS (SGI), RIOS (IBM), SPARC (Sun), or PA-RISC (HP). Some manufacturers of workstations are Silicon Graphics (SGI), IBM, Sun Microsystems and Hewlett-Packard (HP). The standard OS on workstations is UNIX and its derivatives such as AIX (IBM), Solaris (Sun), and HP-UX (HP). Very good graphics facilities and large video screens are provided by most workstations. A system called X Windows is provided to display the status of multiple processes during their execution. Most workstations have built-in hardware to connect to a LAN.

  43. Servers Workstations are characterized by high-performance processors with large screens for interactive programming, while servers are used for specific purposes such as high-performance numerical computing, web page hosting, database storage and printing; interactive large screens are not necessary. Compute servers have high-performance processors with large main memory, database servers have big on-line disk storage (100s of GB), and print servers support several high-speed printers.

  44. Mainframe Computers Insurance, banking and other companies need processors for a large number of on-line transactions. They require computers with very large disks to store several terabytes of data and to transfer data from disk to main memory at several hundred megabytes per second. The processing power needed from such computers is hundreds of millions of transactions per second. These computers are much bigger and faster than workstations and several hundred times more expensive; they provide extensive services such as user accounting, file security and control, and they are much more reliable. There are few manufacturers, viz. IBM and Hitachi.

  45. Supercomputers Supercomputers are the fastest computers available at any given time. They are used to solve problems which require intensive numerical computations: prediction of weather conditions, design of supersonic aircraft, design of drugs, and modelling of complex molecules. These problems require about 10^16 calculations; a computer that can carry out a trillion (10^12) calculations per second would solve them in about 10^4 seconds, roughly 3 hours. Such computers were called supercomputers by 2004. Supercomputers are built by interconnecting several high-speed computers and programming them to work cooperatively to solve problems.

  46. Supercomputers contd. Their functions have expanded to analysing large commercial databases, producing animated movies, and playing games like chess. Besides these functions, supercomputers have large main memories of 16 GB and secondary memories of 1000 GB. The speed of transfer of data from the secondary memory to the main memory should be at least a tenth of the memory-to-CPU data transfer speed. All supercomputers use parallelism to achieve their speed.

  47. Parallel Computers A set of computers connected together by a high-speed communication network and programmed in such a way that they cooperate to solve a single large problem is called a parallel computer. There are two types of parallel computers: shared memory parallel computers (SMPC) and distributed memory parallel computers (DMPC).

  48. Shared Memory Parallel Computer Working of an SMPC: a number of processing elements are connected to a common main memory by a communication network. Programs are written in such a way that the multiple processors can work independently and cooperate to solve a problem. Programming such a computer is relatively easy, provided the problem can be broken up into parts; a small sketch follows the diagram below.

  49. Shared Memory Parallel Computers [Diagram: several CPUs connected through a communication network to one common shared memory]
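
A minimal sketch of the shared-memory model using Python threads, which share one address space and so stand in for the diagram's CPUs (illustrative only; CPython's global interpreter lock serializes the threads, so this shows the programming model rather than real speedup):

```python
# Four workers write partial results into one shared list ("common
# main memory") and the main thread combines them.
from threading import Thread

shared = [0] * 4                      # common main memory

def worker(cpu_id):
    lo, hi = cpu_id * 1000, (cpu_id + 1) * 1000
    shared[cpu_id] = sum(range(lo, hi))

threads = [Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sum(shared))                    # sum of 0..3999 = 7,998,000
```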

  50. SMPC contd. Limitations/Problems An SMPC is not scalable beyond about 16 processors, because all the processors share a common memory. This memory is accessed via a single communication network, which gets saturated when many processors try to read from or write to memory at the same time.
