We normally call the set of words allowed in a language the vocabulary of that language. For example, the words we use in English form the vocabulary of the English language. Each word has a definite meaning and can be looked up in a dictionary. In a similar manner, all computer languages have a vocabulary of their own. Each word in the vocabulary has a definite, unambiguous meaning, which can be looked up in the manual for that language.

The main difference between a natural language and a computer language is that natural languages have a large vocabulary, whereas most computer languages use a very limited or restricted one. This is mainly because a programming language, by its very nature and purpose, does not need to say very much. Every problem to be solved by a computer has to be broken down into discrete (simple and separate) logical steps, which comprise four fundamental operations: input and output operations, arithmetic operations, movement of information within the CPU and memory, and logical or comparison operations.

Each natural language has a systematic method of using its symbols, defined by the grammar rules of that language. These rules tell us how to use the different words and symbols allowed in the language. Similarly, the words and symbols of a particular computer language must be used according to set rules, known as the syntax rules of the language.

In the case of a natural language, people can use poor or incorrect vocabulary and grammar and still make themselves understood. In the case of a computer language, however, we must stick to the exact syntax rules of the language if we want to be understood correctly by the computer. As yet, no computer is capable of correcting incorrect instructions or deducing meaning from them. Computer languages are smaller and simpler than natural languages, but they have to be used with great precision.
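The four fundamental operations described above can be sketched in a short program. This is an illustrative example only (the function name and the temperature-conversion task are invented for this sketch, not taken from the text), with each statement labeled by the category of operation it performs.

```python
# A small illustrative program, with each step labeled by one of the four
# fundamental operation types: input/output, arithmetic, movement of
# information, and logical or comparison operations.

def fahrenheit_report(celsius):
    # Arithmetic operation: convert Celsius to Fahrenheit.
    fahrenheit = celsius * 9 / 5 + 32

    # Movement of information: copy the result into another variable
    # (at the machine level, data moving between the CPU and memory).
    stored_value = fahrenheit

    # Logical/comparison operation: branch on a condition.
    if stored_value > 100.0:
        label = "hot"
    else:
        label = "not hot"
    return stored_value, label

# Input and output operations: supply a value, print the result.
value, label = fahrenheit_report(40)
print(value, label)  # 104.0 hot
```

Even this tiny program breaks down into nothing more than those four kinds of steps, which is why a programming language can get by with such a restricted vocabulary.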
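The point about exact syntax can be demonstrated concretely. In the hypothetical snippet below, a single missing comma leaves the intent perfectly guessable to a human reader, yet the language's syntax checker rejects the statement outright (Python's built-in `compile` function is used here to apply the syntax rules without running the code).

```python
# A human could still guess what the second line means; the computer cannot.
valid   = 'print("total is", 2 + 3)'
invalid = 'print("total is" 2 + 3)'   # missing comma: a syntax error

def accepted(source):
    # compile() applies Python's syntax rules without executing anything.
    try:
        compile(source, "<example>", "exec")
        return True
    except SyntaxError:
        return False

print(accepted(valid))    # True
print(accepted(invalid))  # False
```

The computer does not weigh how "close" the instruction is to being right; an instruction either follows the syntax rules exactly or it is rejected.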