- Industry: Printing & publishing
- Number of terms: 1330
- Number of blossaries: 0
- Company Profile:
Routledge is a global publisher of academic books, journals and online resources in the humanities and social sciences.
While painting and writing on buildings has a millennial history, graffiti in late twentieth-century America faced two contradictory evaluations. An explosion of extensive, stylized words and pictures in vivid colors, associated with particular artists, on New York subways in the 1970s focused debate on graffiti as popular art, associated with hip-hop culture and urban vitality. Generally, these paintings are not seen in terms of political issues, although race and ethnicity underpin graffiti wars. Yet, while museums, art galleries and universities have championed this viewpoint, urban officials have identified graffiti with vandalism, danger, gangs and quality-of-life issues. Hence, cities have adopted zero-tolerance policies to remove graffiti immediately (especially in downtown areas), or to prevent it through design or alternatives like active public mural programs (Philadelphia, PA, for example, has covered nearly 2,000 walls in this fashion).
Industry:Culture
Underlying generations of ethnic and racial separation in America is a complex mixture of competition, envy and suspicion of the “other.” Ethnic slurs have been and continue to be used to define and enforce cultural boundaries in the United States.
The number and variety of ethnic slurs and racial epithets in American culture indicate the extraordinary importance placed on group distinctions. The legacy of American slavery includes the use of labels to categorize and dehumanize African Americans, and the effect of other ethnic slurs is similar, though not as virulent. The message to the individual at whom a slur or epithet is directed is that, whatever else you may think you are or have worked to become, you are nothing more than a “wop” (Italian American) or a “nigger” (African American), and you will be treated as such.
Treatment runs the gamut from whispered slurs and subtle snubs to beatings and murder. These words, even when they are used in jest, change a person into an object and a target.
The use of ethnic slurs in American comedy became a complex and controversial topic in the late twentieth century. Comedians instinctively play on social animosities and fears, and few topics have the explosive charge of ethnic and racial identity. Godfrey Cambridge, Dick Gregory and Richard Pryor used racial epithets and stereotypes in their stand-up comedy routines, first to African American audiences in the 1960s and then, in modified form, to national audiences. Comedy that includes ethnic slurs, when performed by a member of the group usually targeted by the slur, can have the effect of bolstering group identity and defusing the power of the slur. On the other hand, comedic or any other use of slurs that gets a wide audience has the side-effect of keeping slurs in circulation, and, it may be argued, giving them the acceptability of use by those who might be most offended. Nonetheless, the most heavily weighted ethnic slur in American culture was used as the title of Dick Gregory’s 1964 autobiography and by the 1990s rap group “NWA—Niggaz With Attitude.” At the same time, it became part of street language in cities among young African American men as a label and a greeting.
Other ethnically oriented comedians also have played with slurs in their comedy routines—for example, Alan King and Jackie Mason among Jewish comedians and the Italian American comedian Pat Cooper.
As much as slurs have been burlesqued and defused, they retain their power. Among other kinds of slurs and insults, they are recognized in American constitutional law as “fighting words” excluded from the free-speech protections of the First Amendment. In spite of their currency, slurs have been rejected by the educated middle class. After the civil-rights movement and the heightening of ethnic identification, it was considered impolite or “incorrect” to depersonalize a group or individual with a slur. A reaction against the norm of incorrectness began in the late 1980s, when conservative commentators included the ban against ethnic slurs among other proscribed behaviors in their critique of political correctness.
Industry:Culture
The Congress of the United States is the legislative branch of the federal government, established by Article I of the Constitution. Congress comprises the 435-member House of Representatives and the 100-member Senate. Each House member represents a portion of a state, and all House districts include approximately the same number of people (pursuant to a 1962 Supreme Court decision). Each senator represents an entire state, and each state has two senators. The entire House is up for re-election every two years; senators serve six-year terms and one-third of the Senate seats are up for election in each election cycle. As a result of these structural distinctions, the House and Senate have significantly different rules and cultures and, frequently, different politics.
For most of the postwar period, Congress was controlled by the Democratic Party. The Democrats controlled the House, without interruption, from 1955 through 1994, often by wide margins, and they controlled the Senate during those years as well, except from 1981 through 1986. In the 1994 elections, in a stunning reversal, Republicans gained control of both bodies, and they held onto that majority, albeit by thinner and thinner margins, through 2000.
Ideological control of Congress followed a somewhat different pattern. Congress gradually became more liberal through the 1950s, but a conservative coalition of Southern Democrats (sometimes called Dixiecrats) and rural Republicans was often able to exercise a stranglehold over Congress into the early 1960s. Liberals gained control by the mid-1960s, swept in by Lyndon Johnson’s landslide victory in 1964 and replenished by the post-Watergate 1974 congressional elections. Conservatives gradually made a comeback through the 1980s before consolidating their power in the 1994 elections. In 1995 the keystone of the conservative majority was once again the South, now mostly represented by Republicans and exercising additional political muscle, thanks to the shift of population to the Sunbelt.
But, no matter who has been at the helm, the public attitude towards Congress throughout the postwar period has generally been one of scorn. From President Harry Truman running against the Republican “do-nothing” Congress in 1948 to member of Congress Newt Gingrich excoriating the Democratic Congress in 1994 to President Bill Clinton attacking the Gingrich-led Congress in 1996, Congress has been a reliable political whipping-boy, an object of public derision and dismay.
While its popularity has varied from year to year, Congress’ approval ratings in polls since 1966 have been below 50 percent (after an unusual high point of more than 60 percent in 1965). Moreover, polling since 1960 has consistently found that the public has less confidence in Congress than in the other branches of the federal government, and often less confidence than in “big business” or the media. At a low point in 1991, fewer than 20 percent of those polled expressed a “great deal” or “quite a lot” of confidence in Congress.
Yet this contempt is not necessarily bred by familiarity. Polls have consistently found that while Americans disapprove of Congress as a whole, they like their individual representative. Asked how they would rate their individual Congress representative, more than 50 percent of Americans—in many of the postwar years, considerably more—expressed approval. That is one reason re-election rates for House and Senate incumbents have generally been higher than 80 percent—94 percent for House incumbents between 1982 and 1992.
In addition, much of the public is unfamiliar with the basic workings of Congress. For example, a 1996 Harvard study found that 39 percent of those questioned could not say which party was in control of the House of Representatives—this at a time of repeated and very widely reported partisan clashes in the House.
The low opinion of Congress has endured, perhaps paradoxically, even though the institution has in many ways become more open, responsive and professional throughout the second half of the twentieth century. To start with a fundamental point, the demographics of congressional membership have become more varied. The number of African Americans in Congress increased from two in 1947 to thirty-nine in 1999, the number of women increased from eight to sixty-seven in the same period, and the religious make-up broadened as well. House members, in particular, increasingly came from different walks of life; the percentage of seats held by lawyers dropped from about 60 percent in 1953 to about 40 percent in 1994, and people were more likely to be elected to Congress without having had previous political experience. In addition, while agitation for “term limits” on members of Congress increased through much of the 1990s, Congress had fewer longtime members, mostly because of a surge of retirements.
Each House and Senate member was also increasingly likely during the postwar period to vote his or her own district’s or state’s interest rather than to be swayed by party leadership. (While party unity increased in the 1990s, this was generally due to the increased ideological consistency of party membership rather than to the increased power of party leadership.) This independence reflected, among other things, an increased use of polling, which gave members a sense that they knew better how their constituents stood on issues, and changes in congressional rules, especially those initiated in the 1970s, which gave more junior members of Congress greater say over the drafting of legislation.
Throughout the postwar period, it also became easier for the public to follow congressional proceedings. Reforms in the 1970s made it easier for the public to get a complete view of committee proceedings. C-SPAN, a non-profit arm of the cable-television industry, was given permission to offer “gavel-to-gavel” coverage of the House in 1979 and the Senate in 1986. Furthermore, by the mid-1990s, many congressional documents were available over the Internet.
Groups outside of Congress also began to provide more information. Beginning with the liberal Americans for Democratic Action in 1948, interest groups issued annual “report cards” evaluating key congressional votes. By the 1980s more than seventy groups, across the political spectrum, were attempting to hold congressional feet to the fire in that manner. Public interaction with Congress, through mail, phone calls, visits and eventually e-mail, also increased throughout the postwar period, although an increasing amount of the mail consisted of form letters drafted by liberal and conservative interest groups.
All these changes led Congress to increase its institutional resources and to regulate its behavior differently. The size of congressional staffs exploded in the early 1970s and then stabilized. About 2,600 people worked for Congress in 1947; in 1991 the number was close to 19,000 (in both Washington, DC and local offices). Beginning in the 1970s, Congress began to do more to oversee the ethics of its members—although that hardly prevented recurrent scandals—and to crack down on the most egregious junkets and other perquisites. Lobbying was subjected to more restrictions, and, perhaps most significantly, in 1974 campaign spending was made subject to enforceable restrictions and disclosure requirements for the first time.
None of this, however, stanched the growth of “interest-group” lobbying or the increasing flow of campaign funds into party coffers. With the federal government playing a growing role in American life and Americans’ penchant for forming organizations (noted first by de Tocqueville in the early nineteenth century), more and more groups—business and labor, religious and secular, liberal and conservative—moved their headquarters to Washington, DC or hired burgeoning lobbying firms to ply the halls of Congress.
By the late 1990s, members of Congress were more likely than ever to accuse their foes of being in the pocket of some “special-interest” group—business, labor, environmentalists, trial lawyers, etc. The public’s suspicion that Congress was controlled by “interests” that did not represent the “public interest,” along with the inherently chaotic and combative nature of the congressional process, seemed likely to perpetuate the low esteem with which Americans of all stripes regarded Congress.
Industry:Culture
While not an American invention, this popular dessert has stimulated American ingenuity as well as consumption. The US claims invention of ice-cream sodas (1879, Detroit, MI), sundaes (Evanston, IL, late nineteenth century), cones (1904, St. Louis, MO), packaged treats like Eskimo Pies and Popsicles (1920s) and soft ice cream (1939).
Offsetting summer overproduction in the dairy industry, ice cream has also been the victim of postwar corporate conglomeration. This has created room for boutique brands like Häagen-Dazs and Ben & Jerry’s, whose inventive flavors evoke the Grateful Dead and other 1960s themes. Home ice-cream making, from hand-turned churns to electric machines, also has become a family summer ritual.
Industry:Culture
Sites of mass entertainment, incorporating rides (rollercoasters, carousels, etc.), games, shows, curiosities, animals and junk food, which became staples of American urban leisure in the late nineteenth century, exemplified by New York City, NY’s Coney Island or trolley parks like Philadelphia, PA’s Willow Grove. Suburbanization and urban conflicts (including desegregation) pushed many of these local amusement parks into hard times as they were eclipsed by new regional/national destinations like Disneyland.
The late twentieth century mega-park represents corporate investment in a multi-day family vacation destination whose tab may run to hundreds of dollars. Contemporary options include chains (Six Flags, King’s Island), cross-corporate developments (Hershey Park, with links to chocolate, or the Busch Gardens chain linked to brewers) and media synergies like Disney, Universal Studios and Sesame Place. Their success has also changed entertainment development in zoos and aquaria, urban centers and tourism outside the US. “Theme parking” is also an accusation leveled against many recent urban development schemes as well as the creation of new private public spaces (such as malls).
Industry:Culture
The Miss America pageant is an annual beauty contest, taking place each September in Atlantic City, NJ, and broadcast on television. Beauty-contest champions from each state throughout the United States travel to Atlantic City to compete against each other for the “Miss America” title and accompanying prizes. The first official Miss America pageant took place in Atlantic City on September 7, 1921.
Industry:Culture
While all societies consume, mass consumption has taken on intense and multiple meanings within American society since the nineteenth century, when advances in mass production and a continental market demanded a new mass consumer. This new person was fostered by newspaper advertising and department stores that channeled new affluence. Later, radio, film, television and the Internet have all created commercial media in which sales, sponsorship, product placement and information become intertwined. Consumption, despite repeated anticonsumerist movements, is also deeply linked to identity and status—class is read as consumption rather than production.
Contemporary consumption is framed by its economic history of the Great Depression in the 1930s followed by postwar affluence. A Depression “mentality” and the experience of limited rationing in the Second World War directly influenced parents of baby boomers, as well as new generations themselves. Yet for many, products embodied progress as new automobiles, appliances and materials created suburbs and recreated urban lifestyles. The postwar period, in particular, identified children and teenagers as consumers, shaping the intensive niche marketing that in later decades has driven fashion, media, music and other products. The Reagan era became a second spring for consumerism, from the borrowed designer dresses of the First Lady to the yuppies of Wall Street. Expanding credit cards (and debts) replaced savings as baby boomers and their offspring came into employment maturity at a time of apparently constant growth.
The postwar boom did not eliminate divisions in consumption even as it enshrined ideals of the marketplace. Among the struggles of the civil-rights movement were African American demands for equal consumption—access to previously segregated department stores or public accommodations. Women, as consumers for the home dependent on a husband’s salary, also learned to establish economic independence through consumption and credit histories. The poor were doubly exploited—unable to buy as readily, yet forced to consume cheaper or second-hand goods, or through plans like rental purchase or other financing agreements that doubled prices for inferior products.
One might not buy a new automobile every year, for example, but one is forced either to find some means of coping with increasingly diffuse metropolitan life or to become more marginal to an automotive culture.
Intellectual movements have spoken against this intensive consumption in various ways: the beatniks of the 1950s and hippies of the 1960s both represent anti-materialist movements—although their stress on handicrafts or imported goods betrays an alternative consumption as well. Religious groups have promoted spirituality rather than materialism, yet wealthy churches and consumer-based religions, exchanging miracles for donations, underscore a synthesis of God and mammon long criticized in American life. Environmentalists have also pointed out that another result of runaway consumerism is runaway waste, evident in overflowing landfills and polluted ecosystems nationwide, even while “green” products also sell. Political and economic analysts also warn of the dangers of dependent consumerism—whether in the oil crisis of the 1970s or the continuous trade imbalances of the 1990s. Yet, at the same time, American consumption is seen as a vital component of world economic revitalization, where a sneaker plant in Indonesia represents both exploitation and opportunity. Indeed, after the 1990s’ extended growth and spending, consumerism is deeply ingrained in American society as an emblem of success, a source of individual satisfaction and a motor for American global power. At the same time, consumption is a discourse of division in a polarized society—where children may kill for expensive sneakers, while schools promote uniforms to “restrain” competition in the classroom.
Indeed, extensions of consumerism into areas of public good challenge American dreams of equality and democracy. Should one have the right to buy media domination or political influence? Is the Internet a new agora or a new mall? Are education, healthcare and housing public rights or phenomena of the marketplace? Is freedom to consume, in fact, the pervasive yet hushed underpinning of the American dream, as well as the engine of American nightmares at home and abroad?
Industry:Culture
Summer child-education programs, generally under the aegis of Protestant and fundamentalist churches. They occupy the space of secular camps and daycare, but focus on religious training, including memorization of the Bible, plays, prayer and games. They are often associated with the Southern evangelical tradition, although they have been adapted to wider circumstances, including the pressing need for summer care in working families.
Industry:Culture
While disposing of the dead concerns all societies, American cemeteries reflect special historical and experiential concerns. An avoidance of direct government responsibility (except in the general regulation of the marketplace) has promoted multiple, competing options for burial which are complicated by a sense of extensive land that has allowed the preservation of older cemeteries alongside new innovations. Racial, ethnic, religious and class differences also have multiplied the number and meanings of cemeteries nationwide.
American cemeteries fall into a few broad categories: family/community, church/religious, government and commercial. The last dominates late twentieth/early twenty-first century practice. In rural areas, family and church/congregation graveyards may still be central, and constitute places of pilgrimage and identity for widespread descendants. By contrast, metropolitan areas present a conflicting mapping of social change in their cemeteries and individual monuments to illustrious citizens.
The oldest cities include burial grounds no longer in use but maintained as historical memorials, such as Savannah’s colonial cemetery or the African Burial Ground in New York City, NY, whose discovery became a point of debate over place and presence. Nineteenth-century park cemeteries like Mt Auburn in Boston or Laurel Grove in Philadelphia, PA are also historic sites in their landscapes and “inhabitants,” although they are also maintained and used by long-resident families (especially elites).
In addition to congregational burial grounds, Jews and Catholics have also established their own consecrated sites in many cities. In name and use these may also distinguish among different ethnic groupings as well—Italians, Irish and Poles, for example, have built different Catholic cemeteries in Northeast industrial cities.
In the South, divisions of race are common either within cemeteries or in segregated clienteles. These differences are accentuated by the economics of caste, which have made white cemeteries richer and better kept. African American cemeteries have faced inadequate endowments but have incorporated distinctive cultural traditions of burial and remembrance.
Government cemeteries include military burials, which are now straining available resources to accommodate Second World War, Korean War and Vietnam War veterans, for whom such burial is a less expensive option. Arlington National Cemetery, across the Potomac from Washington, DC, is reserved for special memorialization, for example, the Tomb of the Unknown Soldier and the burial of President John F. Kennedy. Battlefield cemeteries are also maintained abroad.
Municipalities face responsibilities for burial of the unclaimed or impoverished deceased. These may be contracted to commercial cemeteries or buried in a common potter’s field.
Commercial cemeteries take on many of the solemnities of earlier community-based burials, but adapt these to a profit margin—using uniform in-ground markers to facilitate mowing, promoting special-interest sections or touting advance planning and purchase plans. These are by far the most expensive options, where funerals, burial plots, markers and care cost tens of thousands of dollars. As commercial enterprises, concerns about bankruptcy or mismanagement have frequently surfaced, leading to government restrictions on operation and even takeovers. Critics also target their vulgar excess: Evelyn Waugh’s The Loved One (1948; movie, 1965) pilloried the famous Forest Lawn cemetery of Los Angeles, CA, decrying its commercialization of death and art.
In the late twentieth century cremation became a more popular, cheaper option, overcoming religious and ethnic taboos. Ashes are stored in the home, religious shrines or commercial mausoleums or distributed in personally meaningful sites.
Burial has not been limited to humans alone. Pet cemeteries (and crematory facilities) complete the humanization of domestic animals amid extraordinary affluence that characterizes modern American relations with animals.
Industry:Culture
Two very different stories are encompassed in the term “wrestling”—the college and professional games. College wrestling has been strong since the nineteenth century and featured in the revived Olympic Games held in Greece in 1896. There are now believed to be around 750,000 participants in the sport nationwide, with a large number concentrated in Pennsylvania and Ohio, the heartland of the sport.
The amateur sport has come under threat from a number of directions, however. First, the impact of Title IX has been felt most heavily in minor sports like wrestling, which offer no corresponding opportunities for women. Unwilling to cut into funding for football programs to make money available for women’s sports, colleges tended to cut back on wrestling. Of the 788 schools with programs in 1982, only 247 still had programs in 1997.
This has had an impact on the number of scholarships available to wrestlers and has served to increase the intensity of the competition in wrestling. There is a long history of wrestlers trying to lose weight to remain in a lower weight class, but since 1997 there have been at least three deaths at colleges resulting from starvation and dehydration. One wrestler, for example, who normally weighed over 235 pounds, attempted to wrestle at 190 pounds. Having managed to bring his weight down only to 195 pounds, he died of heart failure.
Deaths have also occurred due to muscle-building dietary supplements like creatine.
These events have caused the NCAA to place new restrictions on the way wrestlers shed pounds, ending the use of rubber exercise suits and diuretics. Amateur wrestling was also tainted by its association with John Du Pont, a multimillionaire who had never fulfilled his own wrestling ambitions. After funding the American Olympic team for many years and providing housing on his grounds, Du Pont shot and killed Dave Schultz, the leading American freestyle wrestler, in 1996.
Professional wrestling was transformed after the Second World War with the new medium of television and the establishment of the National Wrestling Alliance (NWA) in 1948 by Midwestern promoters. Instead of merely presenting athletic bouts, as previously, these promoters staged athletic soap operas with themes of good against evil and of American wrestlers fighting off foreign enemies—Japanese, Middle-Eastern, or Soviet.
Although these events required great physical prowess and considerable training, the choreography involved fundamentally altered the nature of wrestling as a sport.
The 1980s saw a further explosion of wrestling on cable television. Ted Turner bought out the NWA in 1988 and established World Championship Wrestling (WCW), which became a mainstay of the TNT, TBS and USA networks. The World Wrestling Federation (WWF) emerged as the other dominant wrestling federation, picked up by a variety of television channels. In addition, many local professional organizations appeared in the late 1990s that promoted the same style of wrestling but in which the wrestlers, with great fan enthusiasm, suffered actual injuries.
Wrestling, with its combination of acting and athleticism, has become central to American popular culture. Commercials and movies frequently feature wrestlers like “Hollywood” (so-called by his detractors) Hulk Hogan, “Stone Cold” Steve Austin and Dwayne “The Rock” Johnson. Sony PlayStations and other video-game consoles feature games from which many children can learn moves and holds that they try out with their friends, occasionally with catastrophic results. The death of a young boy, clothes-lined by his elder brother, has fueled the controversy about the impact of television violence on the young.
The election of former wrestler Jesse Ventura as governor of Minnesota on a reform ticket has illustrated the cultural significance of professional wrestling, helping to determine the outcome of a contest in the sport of name-recognition—politics.
Industry:Culture