What Is Computer Language?

Article Details
  • Written By: Daniel Liden
  • Edited By: Jenn Walker
  • Last Modified Date: 20 April 2017
  • Copyright Protected:
    Conjecture Corporation
A computer language is a "language" made up of formal syntax and semantics that allow users to communicate instructions to computer systems. Programming languages are by far the most common examples of computer languages, so the terms "computer language" and "programming language" are often used interchangeably. A computer cannot work with human language and, in most cases, a human cannot efficiently work in computer binary, so a higher-level language is necessary to allow programmers to give instructions to computer systems. There are several types of computer languages. Different programmers choose to use one or more languages because of personal preference, the particular capabilities of a given language, and the demands of the project at hand.

Syntax and semantics, or structure and meaning, are the two defining characteristics of a computer language and can be compared to the grammar and vocabulary of human languages. Programming terms must be placed in a particular order and marked by specific punctuation and spacing to be understood by the computer. In general, these constraints on computer languages are much stricter than those on human languages. A human language can often still be understood despite grammar and word-usage problems, but the interpretive powers of computers are comparatively limited; even minor punctuation or spacing issues will, in many languages, result in an error message, as the computer will not be able to follow the instructions as given.


Human languages are intended to allow individuals to communicate with each other. This can involve discussion, command, interrogation, declaration, and many other forms of communication. The purpose of computer language, on the other hand, is generally to provide explicit instructions for the computer to follow, so such languages are generally imperative in nature. Some computer languages, however, are based primarily on the use of logical expressions or mathematical formulas. Reducing computer language to mathematical or logical expressions can help to reduce the possibilities of side effects that can result from some imperative expressions, but it can also make programming more difficult for those without extensive mathematical skills.

Computer languages are modified and developed over time to better meet the needs of their users and to keep up with technological advancements. Modifications include changes in syntax and semantics as well as additions to overall functionality. A modern computer language may, for instance, be updated from an older version to make use of processors with multiple cores. Languages are also optimized for programming applications for mobile devices.


Discuss this Article

Post 4

@SkyWhisperer - I got started with FORTRAN, not BASIC. I basically taught myself this language using some of the computer language tutorials in the library and online.

I had to learn FORTRAN because I worked in the aerospace industry and that was the language of choice. FORTRAN is used for scientific and mathematical applications. As such, it doesn’t give you as much leeway to do things like program video games (although I suppose you could if you wanted to).

It provides more flexibility and power in working with data and number crunching. You could probably accomplish the same results with some other language but FORTRAN is more concise in my opinion.

Post 3

@David09 - The one problem that I have with an interpreted computer language like BASIC – or any computer language, for that matter – is that the interpreter or compiler is a stickler for correct syntax.

In C, for example, if you omit the semicolon at the end of a statement, the compiler will complain. Why should it complain? Can’t it figure out by the line break that it’s the end of the statement?

In my opinion compilers should be made to be a little smarter so that they can figure out what your intentions are, at least for obvious omissions like that.

Post 2

@Mammmood - I remember well those days myself. BASIC was great because it was a readable computer language. It made sense to the ordinary lay person.

However, you pay a price for this readability. That’s because the computer has to translate your English-like programming instructions into things that it can understand, which is basically machine language.

Machine language is a low-level computer language. It’s not pretty to read or easy to understand, but it requires less translating by the computer. The end result is that programs developed in machine language tend to run a lot faster than those developed in BASIC or some computer scripting language, for example.

Of course, nowadays you can just use C++ if you want blazing speed. Speed is needed for things like high-end graphics programs.

Post 1

I think the first computer language used by many beginning programmers is BASIC. That’s the way it was for me. I learned BASIC on a TRS-80 Model III computer, back in the mid 1980s.

It was a lot of fun, and I used it to develop a lot of computer games. The games weren’t fancy by any means but the BASIC programming language allowed me to get the job done quickly and easily.

The graphics were primitive; it was just black and white, but you don’t need fancy graphics to develop good games. You just need a good concept and know how to use your computer language. You find that the routines used to make your game can be adapted from one computer language to another.
