The computer can recognize only two digits, zero and one. For this reason its computational capability is said to be based on the binary system of notation. The familiar decimal system, of course, uses ten digits, zero through nine. The binary system can be compared to the on and off settings of a common light bulb. In fact, the vacuum tube, a device much like a light bulb, formed the basis for the earliest computers; a tube had a value of one when it was on and a value of zero when it was off. These tubes were bulky and gave off large amounts of heat that had to be removed by cooling the equipment. The early mainframe computers weighed tons; they were also bulky, slow, and unreliable. These problems were largely solved by the development of the transistor, a small device that replaced the vacuum tube. New computers based on the transistor were quickly designed and marketed, and these new and different machines were labeled second-generation computers. The third-generation computer soon followed from the research that produced the chip, or miniature circuit. This chip forms the basis of the computer that we now call the micro.
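As a brief illustration of binary notation as described above, the short Python sketch below (the language and the example value 13 are illustrative choices, not from the passage) shows a decimal number written with only the digits zero and one, first with the built-in bin() function and then by repeated division by two.

```python
# Illustrative sketch: expressing the decimal value 13 in binary notation.
value = 13

# Python's built-in bin() returns the binary form as a string, e.g. '0b1101'.
print(bin(value))  # 0b1101

# The same result obtained by repeated division by 2; each remainder is
# one binary digit, produced from least significant to most significant.
digits = []
n = value
while n > 0:
    digits.append(str(n % 2))  # remainder (0 or 1) is the next binary digit
    n //= 2
print("".join(reversed(digits)))  # 1101
```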