Cultural Differences in Handling Credit

Barbara O’Neill, Ph.D., CFP®, Rutgers Cooperative Extension, oneill@aesop.rutgers.edu

When it comes to decisions about borrowing money and handling credit, the culture of a family, community, and country can have a major influence upon an individual’s behavior. What people are taught to believe about borrowing money and the use of credit can affect their financial decisions and debt payment practices for the rest of their lives. This is especially true when strong family role models or religious beliefs exert pressure to manage money in a certain way. Another strong influence is changing credit card industry practices and available products and services.

Cultural viewpoints about credit use within a family, or even a country, are not necessarily set in stone, however. They often change over time. Witness the evolution of credit card use in the United States. Credit cards first came into use in the 1920s. They were issued primarily by department stores, hotel chains, and gasoline companies that sold fuel to increasing numbers of automobile owners. At first, credit cards were accepted only at the businesses that issued them, and their use was generally limited geographically. Not until the late 1930s did some firms begin to accept cards issued by other companies.

In the early years of credit card use in America, credit cards were marketed for their convenience (i.e., not having to carry around a lot of cash) rather than as a source of borrowed money. Many early cardholders were traveling salesmen who used them for business trips. The earliest credit cards (e.g., Diners Club in 1950) were actually “charge cards” that required users to pay the balance in full when billed. It was not until 1959 that the option to revolve a credit card balance from month to month became available. Consumers who carried a balance forward then began to pay finance charges on the amount owed.
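To make the revolving-balance idea concrete, here is a minimal sketch of how a month's finance charge might be figured. It assumes a simple monthly periodic rate; the 18% APR and $1,000 balance are hypothetical figures, and actual card agreements calculate interest in various ways (e.g., average daily balance).

```python
# Hypothetical illustration of a finance charge on a revolved balance.
# Real card issuers use methods such as average daily balance; this
# sketch assumes a simple monthly periodic rate for clarity.

def monthly_finance_charge(balance: float, apr: float) -> float:
    """Finance charge for one month at the given annual percentage rate."""
    monthly_rate = apr / 12          # periodic (monthly) rate
    return round(balance * monthly_rate, 2)

# A cardholder revolves a $1,000 balance on a card with an assumed 18% APR.
balance = 1000.00
charge = monthly_finance_charge(balance, apr=0.18)
print(f"Finance charge for the month: ${charge:.2f}")   # $15.00
# A "convenience user" who pays in full when billed would owe no interest.
```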

After BankAmericard (now Visa) was established in 1958 and MasterCharge (now MasterCard) in 1966, credit card use increased significantly in the U.S. By the 1970s, the U.S. government stepped in to more closely regulate the industry. The practice of mailing credit cards to people who had not applied for them was banned, as were certain discriminatory practices against women (e.g., asking about child-bearing plans as a factor to determine creditworthiness). During the 1980s and 1990s, credit card companies implemented a number of profitable fees and billing methods that added to the debt carried by many U.S. households. Some of these practices (e.g., over-the-limit fees, universal default, and two-cycle billing) have since been curtailed or prohibited by the CARD Act (2009 credit card legislation).

Fast forward 50 years from 1959, when the option to revolve credit card balances first became available, to 2009. By 2009, more than half (about 55%) of U.S. families carried a credit card balance and 78% had one or more credit cards. Average credit card debt per household was $8,329. Among households with a credit card, average credit card debt was even higher at $10,679. Typical consumers had access to about $19,000 on all credit cards combined; three of four (76%) college undergraduates had credit cards with an average balance of $2,200. By the end of 2008, Americans’ credit card debt collectively reached $972.73 billion.

Together, all of these statistics indicate a dramatic shift in American culture with respect to credit use. The predominant shopping pattern clearly shifted from “save up and buy it later” to “buy now and pay later.” Instead of viewing credit cards solely as a convenience, many cardholders started using them to extend (and, in some cases, overextend) their income. Advertising encouraged consumers to buy things immediately, and easily available credit made it easy to do so. As credit use increased, most stores discontinued “layaway” plans because consumers stopped using them. Rather than waiting and saving up money to buy items that had been set aside, people simply paid for them immediately with a credit card. Ironically, layaway plans made a comeback in 2008-2009, when the financial crisis and recession fostered a new-found sense of thriftiness; consumers scaled back their use of credit cards and started to save more money.

In contrast to the evolution of credit use in the United States, some cultures have held fast to their beliefs and practices regarding credit card use despite the increased use of credit worldwide. A prime example is Islam. Muslims are required to follow strict standards of conduct called Shariah, which includes a set of beliefs about borrowing money. Muslims are forbidden to do business with any entity that charges interest, known as riba. All forms of financing that involve interest (riba), directly or indirectly, are prohibited under Islamic law. Gambling (e.g., casinos and lotteries) and alcohol and tobacco use are also prohibited.

To comply with Islamic law, many Muslims in the past used cash, checks, or debit cards to make all of their financial transactions. Some still do. In recent years, however, a number of “Shariah-compliant” credit cards and other financing methods (e.g., mortgages and small business loans) have become more common. The market for these alternative financing arrangements is large and increasing. Muslims comprise almost one-quarter of the world’s population and more than 10 million Muslims live in the United States alone.

Most Shariah-compliant credit cards are issued by financial institutions located in the Middle East, Persian Gulf, and Southeast Asia. However, since they are processed through the MasterCard and Visa networks, they can be used worldwide wherever credit cards are accepted. Issuing creditors often have Muslim advisory panels to assure that their products and services are compliant with Islamic law.

Three different types of Shariah-compliant credit cards are described below:

• Credit cards that are lease-based, rather than loan-based like typical credit cards in Western cultures. These cards involve a lease-purchase arrangement in which the credit card issuer (bank) holds title to a purchased item until the cardholder pays for it in full.

• Credit cards that involve a bank repurchase. Immediately after a cardholder makes a purchase with a credit card, the bank purchases the item and sells it to the cardholder at a higher price.

• Credit cards that are fee-based. The lender (bank) is compensated with a fixed monthly usage fee, typically 5% to 10% of the outstanding balance, for issuing a line of credit.
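For readers who want to see how the fee-based arrangement translates into dollars, here is a minimal sketch. The $2,000 outstanding balance is a hypothetical figure, and the 5% and 10% rates simply mark the ends of the range noted above; actual terms vary by issuer.

```python
# Hypothetical illustration of the fee-based arrangement described above:
# the issuing bank charges a fixed monthly usage fee of 5% to 10% of the
# outstanding balance instead of interest. Figures are illustrative only.

def monthly_usage_fee(outstanding_balance: float, fee_rate: float) -> float:
    """Fixed monthly usage fee owed on the outstanding balance."""
    return round(outstanding_balance * fee_rate, 2)

balance = 2000.00
for rate in (0.05, 0.10):             # the 5%-10% range noted above
    fee = monthly_usage_fee(balance, rate)
    print(f"At a {rate:.0%} usage fee, the month's charge is ${fee:.2f}")
# At a 5% usage fee, the month's charge is $100.00
# At a 10% usage fee, the month's charge is $200.00
```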

Another option for Muslims to comply with Islamic law is to use a regular credit card as a “convenience user.” In other words, pay monthly credit card bills promptly in full so as not to incur any interest charges. Still another option is to use pre-paid cards. In recent years, a number of financial services companies have begun offering prepaid cards for Muslims who desire the convenience of a credit card without the risk of inadvertently paying interest charges on traditional credit cards. These cards can be “loaded” with various dollar amounts according to issuers’ policies and users’ financial goals.

Culture clearly plays a role in how people use credit. In societies where spending is encouraged and credit is widely available, people often borrow money freely, especially during strong economic expansions. When full payment is not expected after a purchase, many people elect to revolve a balance and pay interest on the outstanding amount. In some cultures, however, religious beliefs prohibit the payment of interest. As a result, alternative purchasing techniques and forms of financing have evolved to help people obtain needed cash for homes, businesses, and other purchases in a manner that is consistent with their beliefs.