Swiping your credit card through a terminal may not be the only way for you to make purchases. Credit card chip technology is a method credit card companies hope to promote in the U.S. to make credit card use easier, quicker and more frequent.
Credit cards equipped with a chip need only be waved in front of a scanner for the consumer's payment information to be processed. Because the chips are small, smaller credit cards could be issued, possibly even ones compact enough to attach to a key chain.
Other countries, such as England and Canada, have already embraced credit card chip technology.
Credit cards that store information in a microchip rather than on a magnetic stripe offer consumers greater security. The information on a magnetic stripe can be copied, or "skimmed," while microchips are resistant to skimming.
Credit card chip technology was first introduced in the 1990s but did not catch on in the U.S. because most terminals were not equipped to read chip cards, leaving consumers unable to make purchases with them.
Eight percent of the 325 million credit cards issued in the U.S. are already enabled with chip technology.