The Eighties in America
Volume I
Aboriginal rights in Canada—Gehry, Frank
Editor
Milton Berman, Ph.D.
University of Rochester
Managing Editor
Tracy Irons-Georges
Salem Press, Inc.
Pasadena, California / Hackensack, New Jersey
Editorial Director: Christina J. Moose
Managing Editor: Tracy Irons-Georges
Production Editor: Joyce I. Buchea
Copy Editors: Andy Perry, Timothy M. Tiernan, and Rebecca Kuzins
Acquisitions Editor: Mark Rehn
Research Supervisor: Jeffry Jensen
Editorial Assistant: Dana Garey
Research Assistant: Keli Trousdale
Photo Editor: Cynthia Breslin Beres
Graphics and Design: James Hutson

Title page photo: Presidential candidate Ronald Reagan throws a kiss to his supporters at the Iowa Republican Convention, June 7, 1980. (AP/Wide World Photos)

Cover images (pictured clockwise, from top left): Michael Jackson, 1988. (Hulton Archive/Getty Images); Ronald and Nancy Reagan, inaugural parade in Washington, D.C., Jan. 20, 1981. (AP/Wide World Photos); Pac-Man video game, 1980. (Ullstein Bild); Mount St. Helens eruption, May 18, 1980. (AP/Wide World Photos)
Copyright © 2008, by Salem Press, Inc. All rights in this book are reserved. No part of this work may be used or reproduced in any manner whatsoever or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without written permission from the copyright owner except in the case of brief quotations embodied in critical articles and reviews or in the copying of images deemed to be freely licensed or in the public domain. For information address the publisher, Salem Press, Inc., P.O. Box 50062, Pasadena, California 91115. ∞ The paper used in these volumes conforms to the American National Standard for Permanence of Paper for Printed Library Materials, Z39.48-1992 (R1997).
Library of Congress Cataloging-in-Publication Data

The eighties in America / editor, Milton Berman.
p. cm.
Includes bibliographical references and indexes.
ISBN 978-1-58765-419-0 (set : alk. paper)
ISBN 978-1-58765-420-6 (v. 1 : alk. paper)
ISBN 978-1-58765-421-3 (v. 2 : alk. paper)
ISBN 978-1-58765-422-0 (v. 3 : alk. paper)
1. United States—History—1969- —Encyclopedias. 2. United States—Social conditions—1980- —Encyclopedias. 3. United States—Politics and government—1981-1989—Encyclopedias. 4. United States—Intellectual life—20th century—Encyclopedias. 5. Popular culture—United States—History—20th century—Encyclopedias. 6. Nineteen eighties—Encyclopedias. I. Berman, Milton.
E876.E347 2008
973.927003—dc22
2008005068
First Printing
Printed in the United States of America
■ Table of Contents

Publisher’s Note . . . ix
Contributors . . . xi
Complete List of Contents . . . xvii

Aboriginal rights in Canada . . . 1
Abortion . . . 2
Abscam . . . 5
Academy Awards . . . 7
ACT UP . . . 9
Action films . . . 12
Adams, Bryan . . . 14
Advertising . . . 15
Aerobics . . . 21
Affirmative action . . . 23
Africa and the United States . . . 25
African Americans . . . 27
Age discrimination . . . 31
Agriculture in Canada . . . 33
Agriculture in the United States . . . 34
AIDS epidemic . . . 38
AIDS Memorial Quilt . . . 42
Air India Flight 182 bombing . . . 43
Air pollution . . . 45
Air traffic controllers’ strike . . . 47
Airplane! . . . 50
Aliens . . . 51
Alternative medicine . . . 52
America’s Most Wanted . . . 53
Anderson, Terry . . . 54
Androgyny . . . 56
Apple Computer . . . 58
Archaeology . . . 60
Architecture . . . 62
Arena Football League . . . 67
Art movements . . . 68
Artificial heart . . . 71
Asian Americans . . . 73
Aspartame . . . 76
Astronomy . . . 76
AT&T breakup . . . 78
Atlanta child murders . . . 80
Atwater, Lee . . . 81
Auel, Jean M. . . . 83
Baby Fae heart transplantation . . . 84
Baby Jessica rescue . . . 85
Back to the Future . . . 86
Bakker, Jim and Tammy Faye . . . 87
Ballet . . . 88
Baseball . . . 90
Baseball strike of 1981 . . . 93
Basketball . . . 94
Basquiat, Jean-Michel . . . 98
Beattie, Ann . . . 100
Beirut bombings . . . 101
Beloved . . . 102
Bennett, William . . . 104
Bentsen, Lloyd . . . 105
Berg, Alan . . . 106
Berlin Wall . . . 106
Big Chill, The . . . 108
Bioengineering . . . 110
Biological clock . . . 111
Biopesticides . . . 112
Bird, Larry . . . 114
Black Monday stock market crash . . . 115
Blade Runner . . . 117
Blondie . . . 118
Bloom County . . . 119
Blue Velvet . . . 120
Boat people . . . 121
Boitano, Brian . . . 123
Bon Jovi . . . 124
Bonfire of the Vanities, The . . . 125
Bonin, William . . . 126
Book publishing . . . 127
Bork, Robert H. . . . 129
Bourassa, Robert . . . 130
Bowers v. Hardwick . . . 131
Boxing . . . 132
Boy George and Culture Club . . . 135
Boyle, T. Coraghessan . . . 136
Brat Pack in acting . . . 137
Brat Pack in literature . . . 139
Brawley, Tawana . . . 140
Break dancing . . . 142
Breakfast Club, The . . . 144
Brett, George . . . 145
Bridges, Jeff . . . 146
Broadway musicals . . . 147
Broderick, Matthew . . . 149
Brokaw, Tom . . . 151
Bush, George H. W. . . . 152
Business and the economy in Canada . . . 156
Business and the economy in the United States . . . 157
Cabbage Patch Kids . . . 164
Cable television . . . 165
CAD/CAM technology . . . 168
Caffeine . . . 170
Cagney and Lacey . . . 171
Camcorders . . . 172
Canada Act of 1982 . . . 173
Canada and the British Commonwealth . . . 176
Canada and the United States . . . 178
Canada Health Act of 1984 . . . 181
Canada-United States Free Trade Agreement . . . 182
Canadian Caper . . . 183
Canadian Charter of Rights and Freedoms . . . 184
Cancer research . . . 185
Car alarms . . . 187
Cats . . . 188
Cell phones . . . 189
Central Park jogger case . . . 190
Cerritos plane crash . . . 192
Challenger disaster . . . 194
Cheers . . . 197
Cher . . . 198
Children’s literature . . . 199
Children’s television . . . 205
China and the United States . . . 210
Chrétien, Jean . . . 211
Chrysler Corporation federal rescue . . . 213
Claiborne, Harry E. . . . 214
Clancy, Tom . . . 215
Classical music . . . 216
Close, Glenn . . . 218
Closing of the American Mind, The . . . 219
CNN . . . 220
Cold Sunday . . . 222
Cold War . . . 223
Color Purple, The . . . 227
Colorization of black-and-white films . . . 228
Comedians . . . 230
Comic Relief . . . 233
Comic strips . . . 233
Compact discs (CDs) . . . 235
Computers . . . 238
Conch Republic . . . 241
Confederacy of Dunces, A . . . 242
Congress, U.S. . . . 242
Congressional page sex scandal . . . 246
Conservatism in U.S. politics . . . 247
Consumerism . . . 249
Cosby Show, The . . . 251
Cosmos . . . 253
Costner, Kevin . . . 254
Country music . . . 255
Crack epidemic . . . 259
Craft, Christine . . . 260
Crime . . . 262
Cruise, Tom . . . 265
Cyberpunk literature . . . 266
Dallas . . . 269
Dance, popular . . . 271
Davies, Robertson . . . 272
Day After, The . . . 274
Decker, Mary . . . 275
Deconstructivist architecture . . . 276
De Lorean, John . . . 278
Demographics of Canada . . . 279
Demographics of the United States . . . 282
Designing Women . . . 286
Devo . . . 287
Diets . . . 288
Disability rights movement . . . 290
Disposable cameras . . . 291
DNA fingerprinting . . . 292
Do the Right Thing . . . 293
Domestic violence . . . 295
Doppler radar . . . 297
Douglas, Michael . . . 297
Drug Abuse Resistance Education (D.A.R.E.) . . . 299
Dukakis, Michael . . . 299
Dupont Plaza Hotel fire . . . 301
Duran Duran . . . 302
Dworkin, Andrea . . . 304
Dynasty . . . 305
École Polytechnique massacre . . . 307
Economic Recovery Tax Act of 1981 . . . 309
Education in Canada . . . 310
Education in the United States . . . 312
El Niño . . . 315
Elections in Canada . . . 317
Elections in the United States, midterm . . . 320
Elections in the United States, 1980 . . . 322
Elections in the United States, 1984 . . . 326
Elections in the United States, 1988 . . . 330
Elway, John . . . 333
Empire Strikes Back, The . . . 334
Environmental movement . . . 335
Epic films . . . 340
Erdrich, Louise . . . 341
E.T.: The Extra-Terrestrial . . . 342
Europe and North America . . . 344
Evangelical Lutheran Church in America . . . 346
Exxon Valdez oil spill . . . 347
Facts of Life, The . . . 350
Fads . . . 350
Falwell, Jerry . . . 352
Family Ties . . . 353
Farm Aid . . . 355
Farm crisis . . . 356
Fashions and clothing . . . 357
Fast Times at Ridgemont High . . . 360
Fatal Attraction . . . 361
Fax machines . . . 362
Feminism . . . 363
Ferraro, Geraldine . . . 365
Fetal medicine . . . 366
Film in Canada . . . 367
Film in the United States . . . 369
Flag burning . . . 374
Flashdance . . . 375
Flynt, Larry . . . 376
Food Security Act of 1985 . . . 377
Food trends . . . 379
Football . . . 381
Ford, Harrison . . . 384
Foreign policy of Canada . . . 385
Foreign policy of the United States . . . 387
401(k) plans . . . 391
Fox, Michael J. . . . 392
FOX network . . . 393
Full Metal Jacket . . . 395
Gallagher . . . 397
Gallaudet University protests . . . 398
Gangs . . . 399
Garneau, Marc . . . 400
Gehry, Frank . . . 401
■ Publisher’s Note

Reagan, AIDS, the Challenger disaster, MTV, Yuppies, “Who Shot J. R.?”—the 1980’s was a pivotal time. The Eighties in America examines such iconic personalities, issues, and moments of the decade. America had a renewed sense of confidence after the chaos of the 1960’s and 1970’s, but many people found themselves shut out of the new prosperity. Looming threats and difficult questions, both old and new, remained within American society and the world. The encyclopedia serves as a valuable source of reliable information and keen insights for today’s students, most of whom were born after the decade ended.

Contents of the Encyclopedia

This illustrated three-volume encyclopedia is a companion set to The Sixties in America (1999), The Fifties in America (2005), and The Seventies in America (2006). It covers events, movements, people, and trends in popular culture, literature, art, sports, science, technology, economics, and politics in both the United States and Canada. The Eighties in America features long overviews and short entries discussing people, books, films, television series, musical groups, and other important topics representative of that era. Every entry focuses on the topic or person during the 1980’s—for this work, defined as January 1, 1980, through December 31, 1989—in order to explore what made the decade unique. Topics that span several decades often provide some background and information on subsequent events to help place the 1980’s in perspective.

The Eighties in America contains 663 essays, in alphabetical order, ranging from 1 to 6 pages in length. Written with the needs of students and general readers in mind, the essays present clear discussions of their topics, explaining terms and references that may be unfamiliar. Entries fall into the following general categories:

• African Americans
• art and architecture
• Asian Americans
• business
• Canada
• court cases and the law
• crime and punishment
• disasters
• economics
• education
• environmental issues
• film
• health and medicine
• international relations
• journalism
• Latinos
• legislation
• literature
• military and war
• music
• Native Americans
• people
• politics and government
• popular culture
• religion and spirituality
• science and technology
• sexuality
• social issues
• sports
• television
• terrorism
• theater and dance
• transportation
• women’s issues

The encyclopedic format allows readers to take either a broad view or a narrow one. For example, in addition to the overview of the Iran-Contra scandal, The Eighties in America offers related entries on important figures (Colonel Oliver North and Admiral John Poindexter), legislative reactions (the Tower Commission), and foreign policy issues (the Reagan Doctrine). The Eighties in America contains more than three hundred evocative photographs of people and events. In addition, more than sixty sidebars—lists, time lines, tables, graphs, excerpts from speeches—highlight interesting facts and trends from the decade.

Essay Organization

Every essay begins with a clear, concise title followed by a brief description called Identification (for people, organizations, and works, such as books or films); Definition (for objects, concepts, and overviews); or The Event. Next, a heading for Author, Publisher, Director, or Producer is used when appropriate and includes vital dates. A Date line appears for events, legislation, films, books, television series, plays, and any topic linked to a discrete time. Biographical entries feature the headings Born and Died, listing the date and place of birth and death for the subject. A Place line appears if appropriate. Every essay includes a brief assessment of what made the topic important during the 1980’s. Within the text, boldfaced subheads show readers the overall organization of the essay at a glance and make finding information quick and easy. Every essay features an Impact section, which examines the subject’s broader importance during the 1980’s. Longer overviews sometimes include a section called Subsequent Events that sums up later developments. Cross-references at the end of each essay direct readers to additional entries in the encyclopedia on related subjects. Every entry, regardless of length, offers bibliographical notes under the heading Further Reading in order to guide readers to additional information about the topic; annotations are provided in essays of 1,000 words or more. Every essay includes an author byline.

Appendixes

Volume 3 of The Eighties in America contains sixteen appendixes that provide additional information about selected aspects of the decade in easily accessible formats. The five entertainment appendixes list major films, Academy Award winners, major Broadway plays and theatrical awards, most-watched U.S. television shows, and Emmy Award winners. The two literature appendixes list the bestselling U.S. books and the winners of major literary awards, and two music appendixes provide notable facts about some of the decade’s most popular musicians and list Grammy Award winners. A sports appendix provides a quick glance at the winners of major sporting events of the 1980’s. The two legislative appendixes look at major decisions of the U.S. Supreme Court and important legislation passed by Congress during the decade. The other appendixes are a glossary of new words and slang from the 1980’s, a detailed time line of the decade, an annotated general bibliography, and an annotated list of Web sources on 1980’s subjects. The encyclopedia also contains a number of useful tools to help readers find entries of interest. A complete list of all essays in The Eighties in America appears at the beginning of each volume. Volume 3 contains a list of entries sorted by category, personage and photo indexes, and a comprehensive subject index.

Acknowledgments

The editors of Salem Press would like to thank the scholars who contributed essays and appendixes to The Eighties in America; their names and affiliations are listed in the front matter to volume 1. The editors would also like to thank Professor Milton Berman of the University of Rochester for serving as the project’s Editor and for bringing to the project his expertise on North American history.
■ Contributors

Michael Adams (CUNY Graduate Center)
Richard Adler (University of Michigan-Dearborn)
Jennifer L. Amel (Minnesota State University, Mankato)
Corinne Andersen (Peace College)
Carolyn Anderson (University of Massachusetts)
Mary Welek Atwell (Radford University)
Charles Lewis Avinger, Jr. (Washtenaw Community College)
Mario J. Azevedo (Jackson State University)
Sylvia P. Baeza (Applied Ballet Theater)
Amanda Bahr-Evola (Southern Illinois University, Edwardsville)
Jim Baird (University of North Texas)
Jane L. Ball (Wilberforce University)
Carl L. Bankston III (Tulane University)
David Barratt (Asheville, North Carolina)
Maryanne Barsotti (Warren, Michigan)
Garlena A. Bauer (Otterbein College)
Alvin K. Benson (Utah Valley State College)
Milton Berman (University of Rochester)
R. Matthew Beverlin (University of Kansas)
Margaret Boe Birns (New York University)
Nicholas Birns (The New School)
Devon Boan (Belmont University)
Bernadette Lynn Bosky (Olympiad Academia)
Gordon L. Bowen (Mary Baldwin College)
John Boyd (Appalachian State University)
Kevin L. Brennan (Ouachita Baptist University)
Matt Brillinger (Carleton University)
Jocelyn M. Brineman (University of North Carolina, Charlotte)
William S. Brockington, Jr. (University of South Carolina, Aiken)
Susan Love Brown (Florida Atlantic University)
Michael H. Burchett (Limestone College)
William E. Burns (George Washington University)
Joseph P. Byrne (Belmont University)
Richard K. Caputo (Yeshiva University)
Frederick B. Chary (Indiana University Northwest)
Douglas Clouatre (Mid-Plains Community College)
Thad Cockrill (Southwest Tennessee Community College)
Lily Neilan Corwin (Catholic University of America)
Eddith A. Dashiell (Ohio University)
Mary Virginia Davis (University of California, Davis)
Danielle A. DeFoe (California State University, Sacramento)
Antonio Rafael de la Cova (University of North Carolina, Greensboro)
Paul Dellinger (Wytheville, Virginia)
Joseph Dewey (University of Pittsburgh at Johnstown)
Thomas E. DeWolfe (Hampden-Sydney College)
M. Casey Diana (University of Illinois at Urbana-Champaign)
Marcia B. Dinneen (Bridgewater State College)
L. Mara Dodge (Westfield State College)
J. R. Donath (California State University, Sacramento)
Cecilia Donohue (Madonna University)
Georgie L. Donovan (Appalachian State University)
Desiree Dreeuws (Claremont Graduate University)
Thomas Du Bose (Louisiana State University at Shreveport)
Julie Elliott (Indiana University South Bend)
Thomas L. Erskine (Salisbury University)
Kevin Eyster (Madonna University)
Elisabeth Faase (Athens Regional Medical Center)
Susan A. Farrell (Kingsborough Community College, CUNY)
Thomas R. Feller (Nashville, Tennessee)
David G. Fisher (Lycoming College)
Patrick Fisher (Seton Hall University)
Dale L. Flesher (University of Mississippi)
George J. Flynn (SUNY—Plattsburgh)
Joseph Francavilla (Columbus State University)
Michael S. Frawley (Louisiana State University)
Timothy Frazer (Magadalen College)
Ben Furnish (University of Missouri-Kansas City)
Hayes K. Galitski (Claremont, California)
Ann D. Garbett (Averett University)
Janet E. Gardner (University of Massachusetts at Dartmouth)
Ryan Gibb (University of Kansas)
Richard A. Glenn (Millersville University)
Nancy M. Gordon (Amherst, Massachusetts)
Sidney Gottlieb (Sacred Heart University)
Elizabeth B. Graham (Clarion County Adult Probation Office)
Charles Gramlich (Xavier University of Louisiana)
Michael E. Graydon (Carleton University)
Scot M. Guenter (San José State University)
Needham Yancey Gulley (University of Georgia)
Larry Haapanen (Lewis-Clark State College)
Michael Haas (College of the Canyons)
Irwin Halfond (McKendree College)
Jan Hall (Columbus, Ohio)
Timothy L. Hall (University of Mississippi)
Randall Hannum (New York City College of Technology, CUNY)
Alan C. Haslam (California State University, Sacramento)
John C. Hathaway (Midlands Technical College)
Bernadette Zbicki Heiney (Lock Haven University of Pennsylvania)
James J. Heiney (Lock Haven University of Pennsylvania)
Jennifer Heller (University of Kansas)
Peter B. Heller (Manhattan College)
Timothy C. Hemmis (Edinboro University of Pennsylvania)
Diane Andrews Henningfeld (Adrian College)
Mark C. Herman (Edison College)
Steve Hewitt (University of Birmingham)
Randy Hines (Susquehanna University)
Samuel B. Hoff (Delaware State University)
Kimberley M. Holloway (King College)
Mary Hurd (East Tennessee State University)
Raymond Pierre Hylton (Virginia Union University)
Margot Irvine (University of Guelph)
Ron Jacobs (Asheville, North Carolina)
Jeffry Jensen (Glendale Community College)
Bruce E. Johansen (University of Nebraska at Omaha)
Barbara E. Johnson (University of South Carolina, Aiken)
Sheila Golburgh Johnson (Santa Barbara, California)
Mark S. Joy (Jamestown College)
Laurence R. Jurdem (Jurdem Associates Public Relations)
David Kasserman (Rowan University)
Steven G. Kellman (University of Texas at San Antonio)
Leigh Husband Kimmel (Indianapolis, Indiana)
Bill Knight (Western Illinois University)
John P. Koch (Blake, Cassels, and Graydon)
Gayla Koerting (University of South Dakota)
Grove Koger (Boise State University)
Margaret A. Koger (Boise, Idaho)
Rebecca Kuzins (Pasadena, California)
Andrew J. LaFollette (Silver Spring, Maryland)
Wendy Alison Lamb (South Pasadena, California)
William T. Lawlor (University of Wisconsin-Stevens Point)
Joseph Edward Lee (Winthrop University)
Ann M. Legreid (University of Central Missouri)
Denyse Lemaire (Rowan University)
Sharon M. LeMaster (Gwinnett Technical College)
Thomas Tandy Lewis (St. Cloud State University)
Victor Lindsey (East Central University)
Alar Lipping (Northern Kentucky University)
Renée Love (Lander University)
Bernadette Flynn Low (Community College of Baltimore County-Dundalk)
Denise Low (Haskell Indian Nations University)
M. Philip Lucas (Cornell College)
Eric v. d. Luft (North Syracuse, New York)
R. C. Lutz (Madison Advisors)
Laurie Lykken (Century Community and Technical College)
Richard D. McAnulty (University of North Carolina, Charlotte)
Joanne McCarthy (Tacoma, Washington)
Andrew Macdonald (Loyola University, New Orleans)
Mary McElroy (Kansas State University)
Robert R. McKay (Clarion University of Pennsylvania)
Shelly McKenzie (George Washington University)
David W. Madden (California State University, Sacramento)
Scott Magnuson-Martinson (Normandale Community College)
Michael E. Manaton (Beaverton, Oregon)
Nancy Farm Mannikko (National Park Service)
Martin J. Manning (U.S. Department of State)
Laurence W. Mazzeno (Alvernia College)
Scott A. Merriman (University of Kentucky)
Nancy Meyer (Academy of Television Arts and Sciences)
Dodie Marie Miller (Indiana Business College)
Esmorie J. Miller (Ottawa, Ontario)
P. Andrew Miller (Northern Kentucky University)
Randall L. Milstein (Oregon State University)
William V. Moore (College of Charleston)
Anthony Moretti (Point Park University)
Bernard E. Morris (Modesto, California)
Alice Myers (Bard College at Simon’s Rock)
John Myers (Bard College at Simon’s Rock)
Daniel-Raymond Nadon (Kent State University-Trumbull Campus)
Leslie Neilan (Virginia Polytechnic Institute and State University)
Caryn E. Neumann (Miami University of Ohio at Middletown)
John Nizalowski (Mesa State College)
Holly L. Norton (University of Northwestern Ohio)
Austin Ogunsuyi (Fairleigh Dickinson University)
James F. O’Neil (Florida Gulf Coast University)
Brooke Speer Orr (Westfield State College)
Arsenio Orteza (St. Thomas More High School)
Robert J. Paradowski (Rochester Institute of Technology)
James Pauff (Tarleton State University)
Roger Pauly (University of Central Arkansas)
Cheryl Pawlowski (University of Northern Colorado)
Rick Pearce (Illinois Board of Higher Education)
Michael Pelusi (Philadelphia, Pennsylvania)
Ray Pence (University of Kansas)
Jan Pendergrass (University of Georgia)
Alan Prescott Peterson (Gordon College)
R. Craig Philips (Michigan State University)
Douglas A. Phillips (Sierra Vista, Arizona)
John R. Phillips (Purdue University Calumet)
Erika E. Pilver (Westfield State College)
Troy Place (Western Michigan University)
Marguerite R. Plummer (Louisiana State University at Shreveport)
Michael Polley (Columbia College)
Kimberly K. Porter (University of North Dakota)
Jessie Bishop Powell (Lexington, Kentucky)
Luke Powers (Tennessee State University)
Jean Prokott (Minnesota State University, Mankato)
Maureen Puffer-Rothenberg (Valdosta State University)
Aaron D. Purcell (University of Tennessee, Knoxville)
Edna B. Quinn (Salisbury University)
Christopher Rager (Pasadena, California)
Cat Rambo (Redmond, Washington)
Steven J. Ramold (Eastern Michigan University)
Kilby Raptopoulos (University of Arkansas at Little Rock)
John David Rausch, Jr. (West Texas A&M University)
P. Brent Register (Clarion University of Pennsylvania)
H. William Rice (Kennesaw State University)
Betty Richardson (Southern Illinois University, Edwardsville)
Robert B. Ridinger (Northern Illinois University)
Sandra Rothenberg (Framingham State College)
Richard Rothrock (Dundee, Michigan)
Thomas E. Rotnem (Southern Polytechnic State University)
Joseph R. Rudolph, Jr. (Towson University)
Irene Struthers Rush (Boise, Idaho)
Malana S. Salyer (University of Louisville)
Joseph C. Santora (Thomas Edison State College)
Sean J. Savage (Saint Mary’s College)
Jean Owens Schaefer (University of Wyoming)
Elizabeth D. Schafer (Loachapoka, Alabama)
Lindsay Schmitz (University of Missouri, St. Louis)
Matthew Schmitz (Southern Illinois University, Edwardsville)
Lacy Schutz (The Sterling and Francine Clark Art Institute)
Taylor Shaw (ADVANCE Education and Development Center)
Martha A. Sherwood (University of Oregon)
R. Baird Shuman (University of Illinois at Urbana-Champaign)
Charles L. P. Silet (Iowa State University)
Michael W. Simpson (Gallup, New Mexico)
Paul P. Sipiera (William Rainey Harper College)
Amy Sisson (University of Houston-Clear Lake)
Douglas D. Skinner (Texas State University-San Marcos)
Caroline Small (Burtonsville, Maryland)
Rhonda L. Smith (Alice Lloyd College)
Roger Smith (Portland, Oregon)
Tom Smith (New Mexico State University)
Alan L. Sorkin (University of Maryland-Baltimore County)
Sonia Sorrell (Pepperdine University)
Leigh Southward (Tennessee Technological University)
Brian Stableford (Reading, England)
Alison Stankrauff (Indiana University South Bend)
August W. Staub (University of Georgia)
James W. Stoutenborough (University of Kansas)
Fred Strickert (Wartburg College)
Aswin Subanthore (University of Wisconsin-Milwaukee)
Cynthia J. W. Svoboda (Bridgewater State College)
Peter Swirski (University of Hong Kong)
James Tackach (Roger Williams University)
Cassandra Lee Tellier (Capital University)
Nicholas D. ten Bensel (University of Arkansas at Little Rock)
John M. Theilmann (Converse College)
Susan E. Thomas (Indiana University South Bend)
Traci S. Thompson (Hardin-Simmons University)
Jennifer L. Titanski (Lock Haven University of Pennsylvania)
Anh Tran (Wichita State University)
Paul B. Trescott (Southern Illinois University)
Marcella Bush Trevino (Barry University)
Jack Trotter (Trident College)
Sheryl L. Van Horne (Pennsylvania State University)
Sara Vidar (Los Angeles, California)
Charles L. Vigue (University of New Haven)
Daniel R. Vogel (Edinboro University of Pennsylvania)
William T. Walker (Chestnut Hill College)
Spencer Weber Waller (Loyola University Chicago School of Law)
Mary C. Ware (SUNY, College at Cortland)
Donald A. Watt (Dakota Wesleyan University)
Marcia J. Weiss (Point Park University)
Twyla R. Wells (University of Northwestern Ohio)
George M. Whitson III (University of Texas at Tyler)
Thomas A. Wikle (Oklahoma State University)
Tyrone Williams (Xavier University)
Richard L. Wilson (University of Tennessee at Chattanooga)
Mary A. Wischusen (Wayne State University)
Scott Wright (University of St. Thomas)
Susan J. Wurtzburg (University of Utah)
Kristen L. Zacharias (Albright College)
Tusty Zohra (University of Arkansas at Little Rock)
■ Complete List of Contents

Volume I

Publisher’s Note . . . ix
Contributors . . . xi
Complete List of Contents . . . xvii

Aboriginal rights in Canada . . . 1
Abortion . . . 2
Abscam . . . 5
Academy Awards . . . 7
Accountability in education. See Standards and accountability in education
Acquired immunodeficiency syndrome. See AIDS epidemic
ACT UP . . . 9
Action films . . . 12
Adams, Bryan . . . 14
Advertising . . . 15
Aerobics . . . 21
Affirmative action . . . 23
Africa and the United States . . . 25
African Americans . . . 27
Age discrimination . . . 31
Agriculture in Canada . . . 33
Agriculture in the United States . . . 34
AIDS epidemic . . . 38
AIDS Memorial Quilt . . . 42
Air India Flight 182 bombing . . . 43
Air pollution . . . 45
Air traffic controllers’ strike . . . 47
Airplane! . . . 50
Aliens . . . 51
Alternative medicine . . . 52
America’s Most Wanted . . . 53
Anderson, Terry . . . 54
Androgyny . . . 56
Apple Computer . . . 58
Archaeology . . . 60
Architecture . . . 62
Arena Football League . . . 67
Art movements . . . 68
Artificial heart . . . 71
Asian Americans . . . 73
Aspartame . . . 76
Assassination attempt on Ronald Reagan. See Reagan assassination attempt
Astronomy . . . 76
AT&T breakup . . . 78
Atlanta child murders . . . 80
Atwater, Lee . . . 81
Auel, Jean M. . . . 83
Baby Fae heart transplantation . . . 84
Baby Jessica rescue . . . 85
Back to the Future . . . 86
Bakker, Jim and Tammy Faye . . . 87
Ballet . . . 88
Baseball . . . 90
Baseball strike of 1981 . . . 93
Basketball . . . 94
Basquiat, Jean-Michel . . . 98
Beattie, Ann . . . 100
Beirut bombings . . . 101
Beloved . . . 102
Bennett, William . . . 104
Bentsen, Lloyd . . . 105
Berg, Alan . . . 106
Berlin Wall . . . 106
Big Chill, The . . . 108
Bioengineering . . . 110
Biological clock . . . 111
Biopesticides . . . 112
Bird, Larry . . . 114
Black Monday stock market crash . . . 115
Blacks. See African Americans
Blade Runner . . . 117
Blondie . . . 118
Bloom County . . . 119
Blue Velvet . . . 120
Boat people . . . 121
Boitano, Brian . . . 123
Bon Jovi . . . 124
Bonfire of the Vanities, The . . . 125
Bonin, William . . . 126
Book publishing . . . 127
Bork, Robert H. . . . 129
Bourassa, Robert . . . 130
Bowers v. Hardwick . . . 131
Boxing . . . 132
Boy George and Culture Club . . . 135
Boyle, T. Coraghessan . . . 136
Brat Pack in acting . . . 137
Brat Pack in literature . . . 139
Brawley, Tawana . . . 140
Break dancing . . . 142
Breakfast Club, The . . . 144
Brett, George . . . 145
Bridges, Jeff . . . 146
Broadway musicals . . . 147
Broderick, Matthew . . . 149
Brokaw, Tom . . . 151
Bush, George H. W. . . . 152
Business and the economy in Canada . . . 156
Business and the economy in the United States . . . 157
Cabbage Patch Kids . . . 164
Cable News Network. See CNN
Cable television . . . 165
CAD/CAM technology . . . 168
Caffeine . . . 170
Cagney and Lacey . . . 171
Camcorders . . . 172
Canada Act of 1982 . . . 173
Canada and the British Commonwealth . . . 176
Canada and the United States . . . 178
Canada Health Act of 1984 . . . 181
Canada-United States Free Trade Agreement . . . 182
Canadian Caper . . . 183
Canadian Charter of Rights and Freedoms . . . 184
Cancer research . . . 185
Car alarms . . . 187
Cats . . . 188
CDs. See Compact discs (CDs)
Cell phones . . . 189
Central Park jogger case . . . 190
Cerritos plane crash . . . 192
Challenger disaster . . . 194
Cheers . . . 197
Cher . . . 198
Children’s literature . . . 199
Children’s television . . . 205
China and the United States . . . 210
Chrétien, Jean . . . 211
Chrysler Corporation federal rescue . . . 213
Claiborne, Harry E. . . . 214
Clancy, Tom . . . 215
Classical music . . . 216
Close, Glenn . . . 218
Closing of the American Mind, The . . . 219
Clothing. See Fashions and clothing
CNN . . . 220
Cold Sunday . . . 222
Cold War . . . 223
Color Purple, The . . . 227
Colorization of black-and-white films . . . 228
Comedians . . . 230
Comic Relief . . . 233
Comic strips . . . 233
Compact discs (CDs) . . . 235
Computers . . . 238
Conch Republic . . . 241
Confederacy of Dunces, A . . . 242
Congress, U.S. . . . 242
Congressional page sex scandal . . . 246
Conservatism in U.S. politics . . . 247
Consumerism . . . 249
Contragate. See Iran-Contra affair
Cosby Show, The . . . 251
Cosmos . . . 253
Costner, Kevin . . . 254
Country music . . . 255
Crack epidemic . . . 259
Craft, Christine . . . 260
Crime . . . 262
Cruise, Tom . . . 265
Culture Club. See Boy George and Culture Club
Cyberpunk literature . . . 266
Dallas . . . 269
Dance, popular . . . 271
D.A.R.E. See Drug Abuse Resistance Education (D.A.R.E.)
Davies, Robertson . . . 272
Day After, The . . . 274
Decker, Mary . . . 275
Deconstructivist architecture . . . 276
De Lorean, John . . . 278
Demographics of Canada . . . 279
Demographics of the United States . . . 282
Designing Women . . . 286
Devo . . . 287
Diets . . . 288
Disability rights movement . . . 290
Disposable cameras . . . 291
Divorce. See Marriage and divorce
DNA fingerprinting . . . 292
Do the Right Thing . . . 293
Domestic violence . . . 295
Doppler radar . . . 297
Douglas, Michael . . . 297
Drug Abuse Resistance Education (D.A.R.E.) . . . 299
Dukakis, Michael . . . 299
Dupont Plaza Hotel fire . . . 301
Duran Duran . . . 302
Dworkin, Andrea . . . 304
Dynasty . . . 305
École Polytechnique massacre . . . 307
Economic Recovery Tax Act of 1981 . . . 309
Economy. See Business and the economy in Canada; Business and the economy in the United States
Education in Canada . . . 310
Education in the United States . . . 312
El Niño . . . 315
Elections in Canada . . . 317
Elections in the United States, midterm . . . 320
Elections in the United States, 1980 . . . 322
Elections in the United States, 1984 . . . 326
Elections in the United States, 1988 . . . 330
Elway, John . . . 333
Empire Strikes Back, The . . . 334
Environmental movement . . . 335
Epic films . . . 340
Erdrich, Louise . . . 341
E.T.: The Extra-Terrestrial . . . 342
Europe and North America . . . 344
Evangelical Lutheran Church in America . . . 346
“Evil Empire” speech. See Reagan’s “Evil Empire” speech
Exxon Valdez oil spill . . . 347
F-117 Nighthawk. See Stealth fighter
Facts of Life, The . . . 350
Fads . . . 350
Falwell, Jerry . . . 352
Family Ties . . . 353
Farm Aid . . . 355
Farm crisis . . . 356
Fashions and clothing . . . 357
Fast Times at Ridgemont High . . . 360
Fatal Attraction . . . 361
Fax machines . . . 362
Feminism . . . 363
Ferraro, Geraldine . . . 365
Fetal medicine . . . 366
Film in Canada . . . 367
Film in the United States . . . 369
First Nations. See Aboriginal rights in Canada
Flag burning . . . 374
Flashdance . . . 375
Flynt, Larry . . . 376
Food Security Act of 1985 . . . 377
Food trends . . . 379
Football . . . 381
Ford, Harrison . . . 384
Foreign policy of Canada . . . 385
Foreign policy of the United States . . . 387
401(k) plans . . . 391
Fox, Michael J. . . . 392
FOX network . . . 393
Freeway Killer. See Bonin, William
Full Metal Jacket . . . 395
Gallagher . . . 397
Gallaudet University protests . . . 398
Games. See Toys and games
Gangs . . . 399
Garneau, Marc . . . 400
Gay rights. See Homosexuality and gay rights
Gehry, Frank . . . 401

Volume II

Complete List of Contents . . . xxxiii

Gender gap in voting . . . 403
General Hospital . . . 405
Generation X . . . 406
Genetics research . . . 407
Gentrification . . . 408
Gere, Richard . . . 410
Ghostbusters . . . 412
Gibson, Kirk . . . 413
Gibson, Mel . . . 414
Gibson, William . . . 415
Gimli Glider . . . 416
Glass, Philip . . . 417
Glass ceiling . . . 418
Globalization . . . 419
Go-Go’s, The . . . 421
Goetz, Bernhard . . . 423
Golden Girls, The . . . 424
Goldmark murders . . . 424
Goldwater-Nichols Act of 1986 . . . 425
Golf . . . 426
Goodwill Games of 1986 . . . 428
Grant, Amy . . . 429
Grenada invasion . . . 430
Gretzky, Wayne . . . 432
Griffith-Joyner, Florence . . . 433
Guns n’ Roses . . . 434
Haig, Alexander . . . 436
Hairstyles . . . 437
Halley’s comet . . . 438
Handmaid’s Tale, The . . . 439
Hands Across America . . . 441
Hannah, Daryl . . . 442
Harp seal hunting . . . 443
Hart, Gary . . . 443
Hawkins, Yusef . . . 445
Health care in Canada . . . 446
Health care in the United States . . . 448
Health maintenance organizations (HMOs) . . . 451
Heat wave of 1980 . . . 453
Heaven’s Gate . . . 454
Heavy metal . . . 455
Heidi Chronicles, The . . . 458
Henley, Beth . . . 459
Heritage USA . . . 460
Herman, Pee-Wee . . . 461
Hershiser, Orel . . . 463
Hill Street Blues . . . 463
Hinckley, John, Jr. See Reagan assassination attempt
Hip-hop and rap . . . 465
Hispanics. See Latinos
HMOs. See Health maintenance organizations (HMOs)
Hobbies and recreation . . . 469
Hockey . . . 471
Hoffman, Dustin . . . 474
Holmes, Larry . . . 475
Home shopping channels . . . 475
Home video rentals . . . 476
Homelessness . . . 478
Homosexuality and gay rights . . . 481
Homosexuals, military ban on. See Military ban on homosexuals
Horror films . . . 486
Horton, William . . . 488
Houston, Whitney . . . 490
Howard Beach incident . . . 491
Hubbard, L. Ron . . . 492
Hudson, Rock . . . 493
Hughes, John . . . 494
Hurricane Hugo . . . 495
Hurt, William . . . 497
Hustler Magazine v. Falwell . . . 498
Hwang, David Henry . . . 499
Iacocca, Lee . . . 501
Ice hockey. See Hockey
Immigration Reform and Control Act of 1986 . . . 502
Immigration to Canada . . . 503
Immigration to the United States . . . 505
Income and wages in Canada . . . 508
Income and wages in the United States . . . 509
Indian Gaming Regulatory Act of 1988 . . . 512
Indians, American. See Native Americans
INF Treaty. See Intermediate-Range Nuclear Forces (INF) Treaty
Inflation in Canada . . . 513
Inflation in the United States . . . 514
Infomercials . . . 516
Information age . . . 517
Intermediate-Range Nuclear Forces (INF) Treaty . . . 519
Inventions . . . 522
Iran-Contra affair . . . 528
Iranian hostage crisis . . . 531
Irving, John . . . 534
Israel and the United States . . . 535
Jackson, Bo . . . 538
Jackson, Jesse . . . 539
Jackson, Michael . . . 541
Japan and North America . . . 543
Jazz . . . 545
Jennings, Peter . . . 548
Jewish Americans . . . 549
Johnson, Magic . . . 551
Journalism . . . 552
Journey . . . 554
Joy Luck Club, The . . . 555
Junk bonds . . . 557
Just Say No campaign . . . 558
Keillor, Garrison . . . 561
Kincaid, Jamaica . . . 562
King, Stephen . . . 563
Kirkpatrick, Jeane . . . 564
Kiss of the Spider Woman . . . 565
Klinghoffer, Leon . . . 566
Knoxville World’s Fair . . . 567
Koop, C. Everett . . . 567
L.A. Law . . . 570
LaRouche, Lyndon . . . 571
Last Temptation of Christ, The . . . 572
Latin America . . . 573
Latinos . . . 575
Lauper, Cyndi . . . 578
Leg warmers . . . 579
Lemieux, Mario . . . 580
LeMond, Greg . . . 581
Lennon, John . . . 582
Leonard, Sugar Ray . . . 583
Letterman, David . . . 584
Lévesque, René . . . 585
Lewis, Carl . . . 586
Liberalism in U.S. politics . . . 587
Libya bombing . . . 589
Literature in Canada . . . 590
Literature in the United States . . . 591
Little Mermaid, The . . . 596
Live Aid . . . 596
Lockerbie bombing. See Pan Am Flight 103 bombing
Loma Prieta earthquake . . . 599
Louganis, Greg . . . 601
Louisiana World Exposition . . . 602
Lucas, Henry Lee . . . 603
Ludlum, Robert . . . 604
McDonald’s massacre. See San Ysidro McDonald’s massacre
McEnroe, John . . . 605
McKinney Homeless Assistance Act of 1987 . . . 606
McMartin Preschool trials . . . 607
Madonna . . . 609
Mafia. See Organized crime
Magnet schools . . . 610
Magnum, P.I. . . . 612
Mainstreaming in education . . . 613
Malathion spraying . . . 615
Mamet, David . . . 616
Marathon of Hope . . . 617
Mariel boatlift . . . 618
Marriage and divorce . . . 620
Married . . . with Children . . . 622
Martial arts . . . 624
Martin, Steve . . . 624
Martin Luther King Day . . . 625
M*A*S*H series finale . . . 626
Max Headroom . . . 628
Medicine . . . 629
Meech Lake Accord . . . 632
Meese, Edwin, III . . . 633
Mellencamp, John Cougar . . . 634
Meritor Savings Bank v. Vinson . . . 635
Mexico and the United States . . . 636
MGM Grand Hotel fire . . . 637
Miami Riot of 1980 . . . 639
Miami Vice . . . 640
Michael, George . . . 641
Microsoft . . . 642
Middle East and North America . . . 644
Military ban on homosexuals . . . 648
Military spending . . . 649
Miller, Sue . . . 650
Minimalist literature . . . 651
Miniseries . . . 653
Minivans . . . 655
Minorities in Canada . . . 656
Miracle on Ice . . . 657
Missing and runaway children . . . 659
Mommy track . . . 660
Mondale, Walter . . . 662
Montana, Joe . . . 664
Montreal massacre. See École Polytechnique massacre
Moonlighting . . . 665
Moral Majority . . . 666
Mothers Against Drunk Driving (MADD) . . . 668
Mötley Crüe . . . 668
Mount St. Helens eruption . . . 669
MOVE . . . 671
Movies. See Film in Canada; Film in the United States
Mr. T . . . 673
MTV . . . 674
Mullet . . . 677
Mulroney, Brian . . . 677
Multiculturalism in education . . . 679
Multiplex theaters . . . 680
Murphy, Eddie . . . 681
Murray, Bill . . . 682
Music . . . 683
Music videos . . . 686
Musicals. See Broadway musicals
Nation at Risk, A . . . 689
Nation of Yahweh . . . 690
National Anthem Act of 1980 . . . 690
National Education Summit of 1989 . . . 692
National Energy Program . . . 693
National Minimum Drinking Age Act of 1984 . . . 694
Native Americans . . . 695
Natural disasters . . . 697
Navratilova, Martina . . . 699
Naylor, Gloria . . . 700
Neoexpressionism in painting . . . 701
Neo-Nazis. See Skinheads and neo-Nazis
Network anchors . . . 702
New Coke . . . 704
New Mexico State Penitentiary Riot . . . 705
New Wave music . . . 706
Nicholson, Jack . . . 708
Night Stalker case . . . 710
Nighthawk. See Stealth fighter
Nobel Prizes . . . 711
North, Oliver . . . 714
Nuclear Waste Policy Act of 1982 . . . 715
Nuclear winter scenario . . . 716
Oates, Joyce Carol . . . 719
Ocean Ranger oil rig disaster . . . 720
O’Connor, Sandra Day . . . 721
Olson, Clifford . . . 722
Olympic boycotts . . . 722
Olympic Games of 1980 . . . 724
Olympic Games of 1984 . . . 728
Olympic Games of 1988 . . . 731
On Golden Pond . . . 735
O’Neill, Tip . . . 736
Ordinary People . . . 737
Organized crime . . . 738
Osbourne, Ozzy . . . 740
Oscars. See Academy Awards
Ozone hole . . . 741
Pac-Man . . . 743
Pan Am Flight 103 bombing . . . 744
Panama invasion . . . 745
Parental advisory stickers . . . 748
Pauley, Jane . . . 749
PC. See Political correctness
Pei, I. M. . . . 750
Peller, Clara . . . 752
People’s Court, The . . . 753
Performance art . . . 754
PG-13 rating . . . 755
Phantom of the Opera, The . . . 756
Photography . . . 757
Plastic surgery . . . 759
Platoon . . . 761
Play, the . . . 762
Poetry . . . 764
Poindexter, John . . . 766
Political correctness . . . 767
Pollution. See Air pollution; Water pollution
Pop music . . . 768
Pornography . . . 772
Post office shootings . . . 774
Power dressing . . . 776
Preppies . . . 777
Presidential elections. See Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988
Prince . . . 778
Prozac . . . 780
Psychology . . . 781
Public Enemy . . . 783
Quayle, Dan . . . 785
Quebec English sign ban . . . 787
Quebec referendum of 1980 . . . 788
Racial discrimination . . . 790
Radon . . . 791
Raging Bull . . . 793
Raiders of the Lost Ark . . . 795
Rambo . . . 796
Ramirez, Richard. See Night Stalker case
Rap. See Hip-hop and rap
Rape . . . 797
Rather, Dan . . . 798
Reagan, Nancy . . . 799
Reagan, Ronald . . . 801
Reagan assassination attempt . . . 805
Reagan Democrats . . . 807
Reagan Doctrine . . . 808
Reagan Revolution . . . 809
Reaganomics . . . 809
Reagan’s “Evil Empire” speech . . . 813

Volume III

Complete List of Contents . . . xlix

Recessions . . . 815
Recreation. See Hobbies and recreation
Regan, Donald . . . 816
Rehnquist, William H. . . . 817
Religion and spirituality in Canada . . . 818
Religion and spirituality in the United States . . . 819
R.E.M. . . . 822
Retton, Mary Lou . . . 823
Reykjavik Summit . . . 824
Rice, Jerry . . . 826
Richie, Lionel . . . 827
Richler, Mordecai . . . 828
Ride, Sally . . . 829
Rivera, Geraldo . . . 830
Roberts v. United States Jaycees . . . 831
Robertson, Pat . . . 832
RoboCop . . . 833
Robots . . . 834
Rock and Roll Hall of Fame . . . 834
Rock music, women in. See Women in rock music
Rose, Pete . . . 837
Run-D.M.C. . . . 838
Runaway children. See Missing and runaway children
Ryan, Nolan . . . 839
S&L crisis. See Savings and loan (S&L) crisis
St. Elsewhere . . . 841
San Ysidro McDonald’s massacre . . . 842
Sauvé, Jeanne . . . 843
Savings and loan (S&L) crisis . . . 844
Scandals . . . 846
Schnabel, Julian . . . 848
School vouchers debate . . . 849
Schreyer, Edward . . . 852
Schroeder, Pat . . . 852
Schwarzenegger, Arnold . . . 854
Science and technology . . . 855
Science-fiction films . . . 859
Scorsese, Martin . . . 863
SDI. See Strategic Defense Initiative (SDI)
Senate bombing. See U.S. Senate bombing
Sequels . . . 864
SETI Institute . . . 866
sex, lies, and videotape . . . 867
Sexual harassment . . . 868
Shamrock Summit . . . 870
Shepard, Sam . . . 871
Shields, Brooke . . . 872
Shultz, George P. . . . 873
Simmons, Richard . . . 874
Sioux City plane crash . . . 874
Sitcoms . . . 876
Skinheads and neo-Nazis . . . 878
SkyDome . . . 879
Slang and slogans . . . 880
Slogans. See Slang and slogans
Smith, Samantha . . . 882
Smoking and tobacco . . . 884
Soap operas . . . 885
Soccer . . . 887
Social Security reform . . . 889
Soviet Union and North America . . . 891
Space exploration . . . 894
Space shuttle program . . . 896
Special effects . . . 901
Spielberg, Steven . . . 903
Spirituality. See Religion and spirituality in Canada; Religion and spirituality in the United States
Sports . . . 904
Spotted owl controversy . . . 908
Springsteen, Bruce . . . 909
Standards and accountability in education . . . 911
Star Search . . . 912
Star Trek: The Next Generation . . . 913
“Star Wars” defense system. See Strategic Defense Initiative (SDI)
Starbucks . . . 914
Stark, USS. See USS Stark incident
Statue of Liberty restoration and centennial . . . 915
Stealth fighter . . . 918
Steel, Danielle . . . 919
Sting . . . 920
Stock market crash. See Black Monday stock market crash
Stockton massacre . . . 921
Stone, Oliver . . . 922
Strategic Defense Initiative (SDI) . . . 923
Streep, Meryl . . . 925
Subway Vigilante. See Goetz, Bernhard
Sununu, John H.
Superconductors
Superfund program
Supreme Court decisions
Swaggart, Jimmy
Synthesizers
Tabloid television
Talk shows
Talking Heads
Tamper-proof packaging
Tanner ’88
Tax Reform Act of 1986
Taylor, Lawrence
Technology. See Science and technology
Teen films
Teen singers
Televangelism
Television
Tennis
Terminator, The
Terms of Endearment
Terrorism
Theater
Third Wave, The
thirtysomething
This Is Spinal Tap
Thomas, Isiah
Thompson v. Oklahoma
Times Beach dioxin scare
Titanic wreck discovery
Tobacco. See Smoking and tobacco
Torch Song Trilogy
Toronto bathhouse raids of 1981
Tort reform movement
Tower Commission
Toys and games
Transplantation
Trivial Pursuit
Tron
Trudeau, Pierre
Turner, John
Turner, Kathleen
Turner, Ted
Turner, Tina
Twilight Zone accident
Tylenol murders
Tyler, Anne
Tyson, Mike
. . . . . .
. . . . . .
926 927 929 931 936 937
. . . . . . .
. . . . . . .
938 939 941 942 943 944 945
. . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . .
946 948 950 951 955 957 958 959 962 965 966 967 968 969 970 971
. . 973 . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . .
974 974 976 979 982 983 985 986 988 989 990 991 992 993 994 995
Ueberroth, Peter . . . . . . . . . 997 Unemployment in Canada . . . 998 Unemployment in the United States . . . . . . . . . 999 Unions. . . . . . . . . . . . . . 1001 United Nations . . . . . . . . . 1005 US Festivals . . . . . . . . . . . 1007 U.S. Senate bombing . . . . . . 1009 USA for Africa . . . . . . . . . 1009 USA Today . . . . . . . . . . . . 1011 USS Stark incident . . . . . . . 1011 USS Vincennes incident . . . . . 1013 U2 . . . . . . . . . . . . . . . . 1014
Women in rock music . . . . Women in the workforce . . . Women’s rights . . . . . . . . Wonder Years, The . . . . . . . Workforce, women in the. See Women in the workforce World music . . . . . . . . . . World Wrestling Federation . . . . . . . . . Wright, Jim . . . . . . . . . . WWF. See World Wrestling Federation
. . . .
1050 1053 1057 1063
. 1064 . 1065 . 1067
Xanadu Houses . . . . . . . . . 1069 Valenzuela, Fernando . . Valley girls . . . . . . . . . Van Halen . . . . . . . . . Vancouver Expo ’86 . . . Vangelis . . . . . . . . . . Video games and arcades. Vietnam Veterans Memorial . . . . . . . Vincennes, USS. See USS Vincennes incident Virtual reality . . . . . . . Voicemail . . . . . . . . . Voyager global flight . . . .
. . . . . .
. . . . . .
. . . . . .
1015 1016 1017 1018 1020 1020
. . . 1022
. . . 1025 . . . 1025 . . . 1026
Wages. See Income and wages in Canada; Income and wages in the United States Wall Street . . . . . . . . . . . . 1029 Washington, Harold . . . . . . 1029 Water pollution . . . . . . . . . 1030 Watson, Tom . . . . . . . . . . 1032 Watt, James G.. . . . . . . . . . 1033 Wave, the . . . . . . . . . . . . 1034 Weaver, Sigourney . . . . . . . 1035 Webster v. Reproductive Health Services . . . . . . . . . . . . 1036 Weinberger, Caspar . . . . . . . 1037 Welfare . . . . . . . . . . . . . 1038 West Berlin discotheque bombing . . . . . . . . . . . 1041 When Harry Met Sally . . . . . . . 1042 White, Ryan . . . . . . . . . . . 1043 White Noise . . . . . . . . . . . . 1044 Who Framed Roger Rabbit . . . . . 1045 Williams, Robin . . . . . . . . . 1046 Williams, Vanessa . . . . . . . . 1047 Williams, Wayne Bertram. See Atlanta child murders Wilson, August . . . . . . . . . 1048 Winfrey, Oprah . . . . . . . . . 1049
xxii
Yankovic, Weird Al . . . . . . . 1070 Yellowstone National Park fires. . . . . . . . . . . . . . 1071 Yuppies . . . . . . . . . . . . . 1072 Entertainment: Major Films of the 1980’s . . . . . . . . . Entertainment: Academy Awards . . . . . . . . . . . . Entertainment: Major Broadway Plays and Awards . . . . . . . . . . . . Entertainment: Most-Watched U.S. Television Shows . . . . Entertainment: Emmy Awards . . . . . . . . . . . . Legislation: Major U.S. Legislation . . . . . . . . . . Legislation: U.S. Supreme Court Decisions . . . . . . . Literature: Best-Selling U.S. Books. . . . . . . . . . . . . Literature: Major Literary Awards . . . . . . . . . . . . Music: Popular Musicians . . . Music: Grammy Awards. . . . . Sports: Winners of Major Events . . . . . . . . . . . . Time Line . . . . . . . . . . . . Bibliography . . . . . . . . . . Web Sites . . . . . . . . . . . . Glossary . . . . . . . . . . . . . List of Entries by Category . . .
1075 1083
1085 1092 1094 1098 1105 1111 1115 1119 1130 1137 1144 1156 1161 1165 1169
Photo Index . . . . . . . . . . . . III Personages Index . . . . . . . . . IX Subject Index . . . . . . . . . . XIX
A

■ Aboriginal rights in Canada

Definition: Rights of First Nations peoples to maintain distinct political and cultural identities within Canadian legal and social structures
In late April, 1980, at a constitutional conference sponsored by the National Indian Brotherhood in Ottawa, Ontario, the phrase "First Nations" was used in public for the first time, setting a tone of dialogue for the discussion and philosophical definition of rights for the native peoples of Canada that was to continue well beyond that decade. Two years later, the nation's new constitution enumerated specific rights of First Nations peoples and established a requirement that those peoples be consulted by the government before any laws were passed that directly affected them as a separate constituency. During the 1980's, the debate over the relationship between the structures of governance of aboriginal peoples and the Canadian government continued. The term "nation" was employed during this debate, sometimes to refer to Canada and sometimes to refer to individual aboriginal tribes. The tribal meaning became the more focused and relevant when, in December, 1980, the Declaration of the First Nations was adopted at a gathering of aboriginal peoples in Ottawa. The declaration stated that the Canadian First Nations

continue to exercise the rights and fulfill the responsibilities and obligations given to us by the Creator for the land upon which we were placed. . . . [T]he right to govern ourselves and . . . to self-determination . . . cannot be altered or taken away by any other Nation.
The patriation of a new constitution of Canada in 1982 also took account of the changed atmosphere of aboriginal politics. While many of the First Nations’ desired objectives were not achieved, three sections of the Constitution Act, 1982, did address major First Nations issues, many of which had been variously addressed by treaties and other agreements made between 1763 and 1930. Section 25 of
the act prevented the Canadian Charter of Rights and Freedoms from inadvertently overriding aboriginal and treaty rights. Section 35 recognized the particular land rights of the Indian, Inuit, and Metis populations with regard to ownership, trapping, hunting, and fishing, and section 37 called for a constitutional conference to be held to address a range of aboriginal issues. The 1983 report from the Special Parliamentary Committee on Indian Self-Government (popularly known as the Penner Report) set the tone for formal Canadian government dialogue in this area by its extensive use of the term “nation” to refer to a group of people united by language, culture, and self-identification as members of a common body politic. This usage entailed a significant expansion of the idea of rights from the prior concept of a historic claim to a particular area of land and resources, transforming it into a political platform for self-assertion and definition. This expanded concept of rights would affect many spheres of Canadian political and cultural life over the next two decades. The National Indian Brotherhood changed its name to the Assembly of First Nations in 1982 to mirror the changing political atmosphere, becoming a body of native government leaders rather than a gathering of regional representatives. Between 1983 and 1987, four First Ministers Conferences were held. These constitutionally mandated meetings of the prime minister of Canada, provincial premiers, and delegates from the four major aboriginal groups and organizations marked the first time that native peoples were actively represented in constitutional discussions affecting their issues and status. The beginning focus of the conferences was upon concerns at the federal and provincial levels rooted in section 35(1) of the Constitution Act, 1982, which recognized and affirmed existing treaty rights of the aboriginal peoples. The 1983 conference failed to win recognition of the concept of inherent aboriginal rights by all provinces and indeed was marked by a lack of agreement
among the First Nations representatives themselves on a broad spectrum of issues, ranging from claims to land and natural resources to quality of education to questions of self-determination. The conference did, however, result in the addition of section 35(3) to the Constitution Act, which provided that First Nations' rights in land-claim agreements were to be given the same constitutional protections as were treaty rights. It was also agreed to hold three more meetings to discuss further constitutional questions affecting the native peoples. At these next three conferences, the major issue addressed was the question of the rights of the First Nations to self-government, with delegates pressing for constitutional recognition of an inherent aboriginal right to autonomy that would be guaranteed at both the federal and the provincial levels. This demand was ultimately rejected by the final conference in 1987. Impact
In the 1980’s, the legal relationship between the aboriginal peoples of Canada and the Canadian government was reconceived. The First Nations were transformed from client cultures whose relationships with the central government were limited and defined by treaties into more overtly political entities possessing sovereign rights and with a constitutional right to be consulted on the issues directly affecting them. The adoption of the term “First Nations” was itself an acknowledgment of this change, denoting the extent to which aboriginal peoples were understood to constitute their own nations within the nation of Canada. The new status of First Nations peoples had many significant consequences, affecting such issues as hunting and fishing rights, forest and environmental management, aboriginal language rights, the legal status of aboriginal artifacts recovered at archaeological excavations, and public education. Indeed, although the question of control of curriculum content and the values promoted through Indian education in Canada was not new, it became a key sphere of activism and change during and after the 1980’s. Activists focused on the need to use education as a means of First Nations cultural renewal, rather than a tool for assimilating aboriginal peoples to an alien society with alien mores. Further Reading
Brant Castellano, Marlene, Lynne Davis, and Louise Lahache, eds. Aboriginal Education: Fulfilling the Promise. Vancouver: University of British Columbia Press, 2000.
Cook, Curtis, and Juan D. Lindau, eds. Aboriginal Rights and Self-Government: The Canadian and Mexican Experience in North American Perspective. Montreal: McGill-Queen’s University Press, 2000. Flanagan, Tom. First Nations? Second Thoughts. Montreal: McGill-Queen’s University Press, 2000. Sanders, Douglas. “The Rights of the Aboriginal Peoples of Canada.” Canadian Bar Review 61 (1983): 314-338. Robert B. Ridinger See also
Canada Act of 1982; Native Americans.
■ Abortion

Definition: Intentional termination of a pregnancy
In the wake of a 1973 U.S. Supreme Court decision establishing a woman’s right to terminate her pregnancy, abortion became an increasingly polarizing issue for Americans in the 1980’s. Meanwhile, abortion remained unlawful in Canada during most of the decade, becoming legal only in 1989. Most U.S. laws outlawing abortion were declared unconstitutional as a result of the Supreme Court’s decision in Roe v. Wade (1973). Thus, the 1980’s began with women of all fifty states possessing the right to terminate a pregnancy. The controversy over the Supreme Court’s decision never dissipated, however. Polls in the 1980’s found that about 40 percent of Americans believed that abortion should be legal and unregulated under all circumstances. Others supported the right, but only under more limited circumstances, such as for pregnancies resulting from rape or incest. Still others thought that abortion should never be legal. Protests began to mount when “pro-life” groups staged rallies opposing abortion in Washington, D.C., each January 22, the anniversary of the Roe v. Wade decision. Counterdemonstrations by feminists and other “pro-choice” groups followed. Pro-life groups engaged in a variety of strategies to overturn Roe v. Wade, including making and distributing a film called The Silent Scream (1985). This film was countered by Abortion: For Survival (1989), made by the Fund for the Feminist Majority. Religion and Abortion in the United States
Opposition to abortion came primarily from conservative
religious groups. Groups arose that were focused specifically on abortion, such as Operation Rescue, founded in 1986 by Randall Terry. These groups had their roots in more general evangelical and Protestant Fundamentalist organizations led by well-known pro-life advocates such as Jerry Falwell, the founder of the Moral Majority. Many of these groups picketed women's health care clinics across the United States, as well as conducting mass protests against abortion.

Dr. Lynn Negus, left, holds up pro-abortion-rights signs, as Debbie Thyfault protests abortion outside a clinic in Torrance, California, in 1985. (AP/Wide World Photos)

Roman Catholic bishops, following the official teaching of the Church, condemned abortion and lobbied in Washington, D.C., for laws to prohibit abortion or at least to make abortions more difficult to obtain. However, in 1984, Catholics for a Free Choice took out a full-page ad in The New York Times declaring that there was a diversity of beliefs about abortion within the Roman Catholic Church. Signed by hundreds of lay Catholics, as well as by clergy and religious, the ad set off a divisive argument in the Church that affected not just Catholicism but American politics as well. The Religious Right started to target pro-choice political candidates for defeat. Pro-life candidates, similarly, were targeted by the Left. For many voters, politicians' stands on abortion became a test of whether they were qualified to hold public office. When Geraldine Ferraro, a Roman Catholic from New York, was nominated as the first woman vice-presidential candidate in 1984, she was criticized by her own church for being pro-choice. Supreme Court nominees were vetted based on their stand on abortion. Robert H. Bork was nominated for the Court by President Ronald Reagan in 1987 but was not confirmed by the U.S. Senate, based in part on his belief that there was no constitutionally protected right to abortion. The connection between abortion and religion in American politics set off a national debate on the relationship between church and state that would last
into the next century. Both the Democratic and the Republican Parties took positions on abortion as part of their respective party platforms, despite the fact that the membership of each political party was no more in agreement over the issue than was the membership of the Church. State and Local Politics The conflict over abortion existed at the state as well as at the federal level. Groups opposed to legal abortion supported local and state efforts to limit abortion rights, even if they could not ban the practice outright. State legislatures passed laws requiring parental notification for minors seeking abortions, as well as attempting to legislate men’s involvement in the decision. Opponents and advocates of legal abortion continued to battle each other in state legislatures, in state and federal courts, and even in the streets as each side marched and demonstrated. Despite polls showing that most Americans favored a middle ground on abortion, Supreme Court Justice Sandra Day O’Connor noted that the 1973 decision was “on a collision course with itself.” The close of the decade saw the Supreme Court hearing arguments in Webster v. Reproductive Health Services (1989). The case started with the passage of a 1986 Missouri law asserting that life begins at conception and forbidding doctors or hospitals to perform abortions if they received state funds. The law constituted a major test of Roe v. Wade before a more conservative Court largely shaped by President Reagan’s appointees and therefore opposed to abortion rights. Although the Court did not reverse Roe v. Wade, it allowed states to place restrictions on abortion. In an evenly divided Court, Justice O’Connor provided the swing vote to maintain Roe v. Wade. Canada and Abortion
Canada passed a law in 1969 permitting abortion only when the health of the mother was in danger. In the 1970's, opposition to this law developed, and in 1988's Her Majesty the Queen v. Morgentaler decision, the Supreme Court of Canada declared the law to be unconstitutional. The court determined that any attempt to force a woman to carry an unwanted pregnancy to term violated the right to security of the person guaranteed by Canada's Charter of Rights and Freedoms. The court left it to the legislature to craft a new law regulating abortion, and in 1989 the government submitted such a bill. It passed the House of Commons but was defeated in the Senate. Thus, as the 1980's ended, Canada became one of the few nations in which abortion was completely unregulated by law. As in the United States, advocacy groups supporting and opposing legal abortion continued to operate, and liberal and conservative political candidates took positions on the issue. Abortion, however, tended to be a less central issue in Canadian culture and politics than it was in American culture and politics. Impact The U.S. struggle over abortion rights in the 1980's was largely defined by the Court's decision in Roe v. Wade. The pro-choice movement sought to protect that decision, in part by focusing on the attitude toward abortion rights of any judicial nominee. Pro-life activists took a different approach: While agreeing that judges were important to their cause, they also sought to circumvent constitutional issues by attacking abortion rights on a financial, public policy level. In this endeavor, the pro-life movement found a crucial ally in President Reagan. In 1984, Reagan refused foreign aid to any organization that performed or facilitated abortions. In 1987, he extended the policy to domestic organizations, announcing that no federal funds would be given to any organization that helped pregnant women obtain abortions—or even to organizations that mentioned abortion as an option. Thus, by the end of the decade, a disparity had developed between America's rich and poor women. Both nominally had the right to terminate their pregnancies, but in practice it was much more difficult for poor women to exercise that right. The Canadian experience was quite different. It took almost the entire decade for Canada's growing pro-choice movement to achieve victory in the courts. While the victory came fifteen years later than it had in the United States, however, the Canadian right to abortion remained undiluted for far longer than did the American right. Indeed, by the beginning of the twenty-first century, Canada remained one of the few nations in the world without any restrictions on abortion in its criminal code. Further Reading
Bennett, Belinda, ed. Abortion. Burlington, Vt.: Ashgate/Dartmouth, 2004. Compilation of articles published between 1971 and 2002 in medical and political journals; covers the full range of scientific, legal, moral, and religious issues related to abortion. Fried, Marlene Garber, ed. From Abortion to Reproductive Freedom: Transforming a Movement. Boston: South End Press, 1990. Anthology of essays that provide an historical and critical account of the abortion rights movement. Gorney, Cynthia. Articles of Faith: A Frontline History of the Abortion Wars. New York: Simon & Schuster, 2000. Study of abortion focusing primarily on relevant public policy issues and court decisions. Solinger, Rickie, ed. Abortion Wars: A Half Century of Struggle, 1950-2000. Berkeley: University of California Press, 1998. Chronicle of the evolution of the debate on abortion. Tatalovich, Raymond. The Politics of Abortion in the United States and Canada: A Comparative Study. Armonk, N.Y.: M. E. Sharpe, 1997. Study of how the two nations developed policies on abortion. Torr, James D., ed. Abortion: Opposing Viewpoints. Farmington Hills, Mich.: Greenhaven Press/Thompson-Gale, 2006. Collects arguments on both sides of the abortion debate. Includes moral issues, practical issues, and public policy debates. Susan A. Farrell See also Bork, Robert H.; Elections in the United States, 1984; Falwell, Jerry; Feminism; Ferraro, Geraldine; Moral Majority; O'Connor, Sandra Day; Reagan, Ronald; Supreme Court decisions; Women's rights.
■ Abscam

The Event: A corruption scandal involving members of the U.S. Congress and other politicians who took bribes from undercover FBI agents in return for special favors
The Abscam scandal reinforced the American public's distrust of politicians in the post-Watergate era. Abscam is the name given to a twenty-three-month Federal Bureau of Investigation (FBI) sting operation. The operation itself took place in the late 1970's, but its public revelation and subsequent trials and convictions occurred in the 1980's. The Abscam operation was first set up in 1978 to lure public officials into accepting bribes for personal favors. The name "Abscam" was derived from the front organization that the FBI set up on Long Island, New York—Abdul Enterprises, Ltd. The scam started in July, 1978, when the FBI created this fictitious company in order to catch underworld figures dealing in stolen art. The FBI portrayed the owner of the company, Kambir Abdul Rahman, as a wealthy Arab sheikh who wanted to invest his oil money in artwork. The agency then recruited as an informant a convicted swindler with connections to criminals who wanted to sell stolen artwork. The scam was successful, and within two months the FBI recovered two paintings worth an estimated $1 million. Furthermore, the initial artwork-focused operation led the FBI to other criminals who were selling fake stocks and bonds. The bureau was able to prevent the sale of almost $600 million worth of fraudulent securities. The Focus Shifts to Politicians As part of the Abscam sting, the FBI rented a house in Washington, D.C., through another fictitious company, Olympic Construction Corporation, and furnished it with antiques borrowed from the Smithsonian Institution. In addition, the bureau set up operational sites on a sixty-two-foot yacht called the Corsair in Florida and at a set of Kennedy International Airport hotel suites in New York, the Barclay Hotel in Philadelphia, and a condominium in Ventnor, New Jersey. Finally, agents created another fictitious sheikh, Yassar Habib, who claimed that he might have to flee his home country and seek asylum in the United States. This asylum could be achieved if a member of Congress introduced a private bill granting the sheikh special status for the purpose of bypassing the normal immigration process. Once everything was in place, the FBI began to investigate public officials. First, FBI agents set up a meeting with Camden mayor and New Jersey state senator Angelo Errichetti. Errichetti was told that the sheikh was interested in investing money in the Camden seaport and in Atlantic City casinos. The FBI taped Errichetti offering to help the sheikh for a fee of $400,000 and accepting $25,000 as a down payment for his services. Errichetti also said that in order to get a casino license the sheikh would have to give $100,000 to Kenneth MacDonald, the vice chairman of the Casino Control Commission. This money was later given to the two men at the office of Abdul Enterprises in New York. In March, 1979, Errichetti, along with U.S. senator Harrison A. Williams, Jr., of New Jersey, met with undercover FBI officials, including sheikh Kambir Abdul Rahman, on the Corsair in Delray Beach,
Florida. The sheikh expressed an interest in investing in land and casinos in Atlantic City and in a titanium mine in Virginia. After several more meetings, the sheikh's aides agreed to invest $100 million in the titanium mine and to give Senator Williams shares of the mine's stock for free. In return, Williams said he would seek military contracts for the mine.

Mayor Errichetti also introduced the undercover FBI agents to Howard Criden, a Philadelphia lawyer. Criden in turn introduced members of Congress to Yassar Habib. Pennsylvania Democratic representatives Raymond F. Lederer and Michael Myers and New York Democratic representative John M. Murphy were filmed accepting $50,000. In addition, Florida Republican representative Richard Kelly was filmed accepting money, and South Carolina Democratic representative John W. Jenrette, Jr., accepted $50,000 through an intermediary. Criden also accepted $50,000 on behalf of New Jersey Democratic representative Frank Thompson, Jr. Pennsylvania Democratic representative John P. Murtha met with the undercover agents but never accepted any money and was not charged with any crimes.

The FBI shut down the sting operation on February 2, 1980, after the agency heard rumors that news organizations were about to break the story. The scam had lasted for twenty-three months and had involved approximately one hundred agents. The scam resulted in the convictions of Senator Williams and U.S. representatives Jenrette, Murphy, Kelly, Lederer, Myers, and Thompson on various federal charges including bribery and conspiracy. All were sentenced to prison terms of one to three years. Kelly's sentence was initially overturned on appeal on the grounds of entrapment; however, it was reinstated by a higher court. Most of those convicted resigned from office voluntarily, although Myers was expelled by the House and Williams did not resign until the Senate was about to vote on his expulsion. Williams was the first U.S. senator to be imprisoned in eighty years, and, had the expulsion vote been taken, he would have been the first senator expelled from the Senate since the Civil War. Mayor Errichetti was convicted of bribery and conspiracy and was sentenced to six years in prison; Louis Johanson, a member of the Philadelphia city council, was sentenced to three years in prison, and Criden was sentenced to six years in prison. Altogether, nineteen people were convicted in the Abscam sting.

This videotape shows Congressman Michael Myers of Pennsylvania (second from left) holding an envelope containing $50,000, which he has just accepted from undercover FBI agent Anthony Amoroso (far left) as part of the Abscam sting operation. (AP/Wide World Photos)

Impact The Abscam sting operation revealed to the American public that some of its highest elected officials were corrupt. It further tarnished the image of a federal government that was still suffering from the Watergate controversy of the early 1970's. Abscam also raised questions about the methods used by the FBI. Some people felt that the FBI had entrapped the politicians by enticing them into committing crimes they would not normally have considered. Thus, while the scam was successful, it was not without controversy.
Further Reading
Greene, Robert W. The Sting Man: Inside Abscam. New York: E. P. Dutton, 1981. Looks at Melvin Weinberg, a convicted con artist whom the FBI used to set up the scam. Tolchin, Susan J., and Martin Tolchin. Glass Houses: Congressional Ethics and the Politics of Venom. Boulder, Colo.: Westview Press, 2004. Examines the politicization of the ethics process in Congress, including discussion of the Abscam scandal and its aftermath. Williams, Robert. Political Scandals in the USA. Edinburgh: Keele University Press, 1998. Study of major American political scandals, comparing Abscam to the other such events that rocked the federal government. William V. Moore See also
Congress, U.S.; Congressional page sex scandal of 1983; Iran-Contra affair; Organized crime; Scandals.
■ Academy Awards

The Event: Annual presentation of awards by the Academy of Motion Picture Arts and Sciences
The Academy Awards of the 1980's often went to relatively safe, uncontroversial films made in—or in the style of—Hollywood's remaining studios. Several African Americans received nominations, however, and the success of Oliver Stone's Platoon proved that the Academy was not completely averse to controversy. Beginning in the early 1960's, the Academy Awards (or Oscars) gradually shed their Hollywood provincialism, as the Academy of Motion Picture Arts and Sciences increasingly acknowledged the international scope of cinema in its nominations and awards. As had occurred in the past, however, the nominations and awards of the 1980's often went to a relatively small number of films and film personalities. For example, in 1983 Richard Attenborough's epic Gandhi (1982) swept the major awards, as did Sydney Pollack's Out of Africa (1985) at the awards ceremony of 1986. In other years, the awards were often split between two favorite films; on rare occasions, though, a more democratic spirit seemed to prevail.
Early Years The awards ceremony for films released in 1980 (which took place in 1981) was dominated by two films. Ordinary People won Best Picture, as well as Best Director (first-time director Robert Redford), Best Supporting Actor (Timothy Hutton), and Best Adapted Screenplay (Alvin Sargent). Raging Bull was nominated in several categories, and it won the awards for Best Actor (Robert De Niro) and Best Film Editing (Thelma Schoonmaker). Henry Fonda was awarded an honorary Oscar. The awards for 1981's best films were distributed more evenly. The Best Picture award went to Chariots of Fire, but Best Director went to Warren Beatty for Reds. Henry Fonda and Katharine Hepburn won Best Actor and Best Actress for On Golden Pond. Steven Spielberg received his second nomination for Best Director (for Raiders of the Lost Ark), Meryl Streep received her third nomination for an acting award and her first as a leading actress (for The French Lieutenant's Woman), and Barbara Stanwyck received an honorary Oscar. Attenborough's Gandhi absolutely dominated the 1982 awards, garnering Best Picture, Best Director, Best Actor, Best Original Screenplay, Best Cinematography, Best Film Editing, Best Art Direction-Set Decoration, and Best Costume Design and proving once again that the Academy loved big pictures. Spielberg received two nominations for E.T.: The Extra-Terrestrial, including his first nomination as a producer for Best Picture. Meryl Streep won Best Actress (for Sophie's Choice), and Glenn Close received the first of her five nominations of the decade. With his performance in An Officer and a Gentleman, Louis Gossett, Jr., became the first African American actor to win a supporting acting award and the first African American to win any acting Oscar since Sidney Poitier in 1963. The highlight of 1983, aside from James L. Brooks's tear-jerker Terms of Endearment, which won Best Picture, Best Director, and a couple of acting awards, was the recognition accorded Ingmar Bergman's last film, Fanny och Alexander (1982; Fanny and Alexander). Bergman's film won Best Foreign Film, Best Cinematography, Best Art Direction-Set Decoration, and Best Costume Design, but not Best Director. Robert Duvall won Best Actor for his role in Bruce Beresford's fine film Tender Mercies, and Hal Roach won an honorary Oscar. Middle Years Amadeus, a film version of Peter Shaffer's play about Wolfgang Amadeus Mozart, collected
most of the awards for 1984 films, beating out David Lean's A Passage to India in most categories. Lean's film won Best Supporting Actress (Peggy Ashcroft) and Best Score (Maurice Jarre, who expressed gratitude that Mozart was not eligible). When 1985's The Color Purple lost to Sydney Pollack's Out of Africa and its director, Spielberg, was passed over for a directing nomination, it suggested not only that the Academy still preferred big pictures but also that it was uneasy with controversial ones. After the more controversial films of the 1970's, perhaps the Academy's members sought a respite from politics—or maybe they were influenced by the conservatism that Ronald Reagan's presidency brought to the nation. The same year, John Huston received his last nomination for Best Director and his daughter Anjelica got her first for Best Supporting Actress for
Prizzi’s Honor (1985), and Paul Newman won the honorary Oscar. The following year, the Academy made less cautious choices, as it awarded Oliver Stone’s Platoon (1985) Best Picture and Best Director. Stone’s devastating portrayal of combat in Vietnam opened the way for many more anti-Vietnam films to come, and the list of Best Director nominees for 1985 films showcased some excellent films: Woody Allen’s Hannah and Her Sisters, Roland Joffe’s The Mission, James Ivory’s A Room with a View, and David Lynch’s Blue Velvet. Spielberg was awarded the Irving G. Thalberg Memorial Award for his work as a producer. Dexter Gordon, an African American jazz musician, was nominated for Best Actor for his roleof-a-lifetime performance in ’Round Midnight, but the Oscar went to Newman for The Color of Money.
From left: Richard Attenborough, Meryl Streep, and Ben Kingsley display their Oscars at the 1983 Academy Awards ceremony in Los Angeles. Attenborough won both the Best Director and the Best Picture Awards. (AP/Wide World Photos)
Final Years The sixtieth anniversary of the Academy Awards fell in 1988, so it was celebrated at the ceremony that honored the films made in 1987. The nominations for those films reflected the changing nature of the business. Four of the five Best Director nominations went to international directors, two British (Adrian Lyne and John Boorman), one Italian (Bernardo Bertolucci), and one Swedish (Lasse Hallström). Some of the films honored were harder edged than those of recent years had been, with Fatal Attraction, Wall Street, Ironweed, Cry Freedom, Broadcast News, and Good Morning, Vietnam all receiving nominations of one sort or another. Bernardo Bertolucci's The Last Emperor received many of the awards. Whatever politically progressive impulse the members of the Academy might have felt in 1988 did not survive to the end of the decade, and the Best Picture trophies awarded in 1989 and 1990 both went to fairly benign films, Rain Man (1988) and Driving Miss Daisy (1989), respectively. African Americans Denzel Washington and Morgan Freeman were both nominated for acting roles, however, and Stone received another Best Director award for Born on the Fourth of July, while the famed Japanese director Akira Kurosawa received the decade's last honorary Oscar. The Academy Awards in the 1980's continued trends that had been set during the years following the studio system's demise around 1960. Epics and films with high production values and a recognizably classical Hollywood style continued to fare better than did alternative films. African American actors began to gain some recognition, however, and the honorary Oscars continued to make up for the Academy's past neglect of deserving talent. Although Stone was honored, Spielberg was not. Many of the films honored were made in the United States, some even in what was left of Hollywood. The tide would change in the coming decades, as the Academy recognized more films that were independently produced, of foreign origin, or made on smaller budgets elsewhere than in California. Impact The relatively conservative choices of the Academy in the 1980's confirmed that Hollywood, even after the demise of the studio system, remained in many ways a consolidated industry. Free agents made films, and several, such as Stone and Spielberg, achieved fame and success without the long-term studio contracts of earlier decades.
The relatively conservative choices of the Academy in the 1980’s confirmed that Hollywood, even after the demise of the studio system, remained in many ways a consolidated industry. Free agents made films, and several, such as Stone and Spielberg, achieved fame and success without the longterm studio contracts of earlier decades. The spec-
ACT UP
■
9
trum of aesthetic choices open to such filmmakers remained quite narrow, however, and if films with politically controversial content could win Academy Awards, experimental form was rarely represented among the nominees. Further Reading
Hayes, R. M. Trick Cinematography: The Oscar SpecialEffects Movies. Jefferson, N.C.: McFarland, 1986. A specialized study of the technological wizardry that shapes so many contemporary films. Levy, Emanuel. All About Oscar: The History and Politics of the Academy Awards. New York: Continuum, 2003. Examination of what went on behind the glitz and public face of the awards. Matthews, Charles. Oscar A to Z: A Complete Guide to More than Twenty-Four Hundred Movies Nominated for Academy Awards. New York: Doubleday, 1995. Listing of the films, studios, and individuals nominated and winners by category. Michael, Paul. The Academy Awards: A Pictorial History. New York: Crown, 1975. Lavish picture book of the stars and personalities associated with the Oscars. Osborn, Robert. Seventy Years of the Oscars: The Official History of the Academy Awards. New York: Abbeville Press, 1999. Authoritative history of the Oscars written by one of Hollywood’s insiders. Pickard, Roy. The Oscar Movies. New York: Facts on File, 1994. A comprehensive look at the films that have won or been considered for Academy Awards. Charles L. P. Silet See also Action films; African Americans; Blue Velvet; Color Purple, The; Film in the United States; Ordinary People; Platoon; Raging Bull; Scorsese, Martin; Spielberg, Steven; Stone, Oliver; Streep, Meryl; Terms of Endearment; Wall Street.
■ ACT UP Identification
Grassroots activist group committed to direct-action protests to demand increased resources for fighting AIDS Date Founded in 1987 ACT UP’s primary goal was to protest the difficulty of gaining access to experimental drugs, the high cost of the few AIDS treatments then available, and the lack of a coherent
10
■
ACT UP
national policy initiative to fight the disease. The group became most famous for its tactics, however, which emphasized confrontation and a refusal to be ignored. Probably best known for slowing down Wall Street in the year of its formation, the AIDS Coalition to Unleash Power (ACT UP) is a political activist group that began in New York City. In 1987, the year the group was formed, public awareness of acquired immunodeficiency syndrome (AIDS) in the country largely took the form of paranoia. People living with AIDS had few advocates. New treatment drugs were costly to develop, and drug manufacturers charged outrageous fees to sell their products, making it impossible for most AIDS sufferers to hope for an available cure. Doctors treating these patients had little access to the drugs, and only a few individual voices challenged the manufacturers’ authority. Larry Kramer and the Call to Action
One of those voices, Larry Kramer’s, had long been active in the fight for AIDS awareness. Born in 1935, Kramer began his career as a screenwriter. However, with the gay liberation movement in the 1970’s, Kramer’s explorations of his own homosexuality came to the forefront of his writing. His novel Faggots, published in 1978, examined gay male lifestyles from the perspective of an insider, but with scathing humor that elicited ire from the gay community. Kramer was among the first to recognize the devastating effect of AIDS on gay men and to call for increased government funding and better media coverage of the disease, as well as improved treatment of patients. However, his position was not given high esteem because of the general attitude toward his novel, which in time would, like Kramer himself, gain deserved respect. In 1982, he and several friends formed Gay Men’s Health Crisis (GMHC) to help AIDS sufferers. However, Kramer’s outspoken political positions soon put him on the outs with the rest of the group’s board, and he resigned in 1983. Kramer continued speaking and writing about the impact of AIDS on the gay community, and by 1987, he was as well known for his anger as for his activism. On March 10, 1987, he was invited to speak at the Lesbian and Gay Community Services Center in New York City. He took advantage of the speech to urge others to very specific political action. He asked the audience if they were as frustrated as he was by the lack of progress toward a cure for AIDS and by the trouble doctors were having obtaining new AIDS
drugs. He asked for interested parties to join him to form a political activist group. He received a resoundingly positive response, and the resulting organization became known as the AIDS Coalition to Unleash Power, or ACT UP/New York for short. Branches formed throughout the country, indeed throughout the world, in the following months and years. Activities in the 1980’s ACT UP did not waste any time in establishing itself as a strong voice in AIDS activism. Within two weeks of its formation, the group already had a specific political goal and a method of broadcasting its message. On March 24, 1987, 250 members gathered on Wall Street’s trading floor with the intent of delaying the opening bell of the stock exchange. Their message was simple. They felt Burroughs Wellcome, manufacturer of the new AIDS treatment drug azidothymidine (AZT), was charging too much for its medication,
Activist and ACT UP founder Larry Kramer in 1989. (AP/ Wide World Photos)
Looking Back at ACT UP On May 13, 2007, Larry Kramer delivered a speech in New York City in which he reflected on ACT UP’s accomplishments since its founding in 1987. Kramer described some of the tactics ACT UP employed in its early years to increase public awareness of AIDS: We invaded the offices of drug companies and scientific laboratories and chained ourselves to the desks of those in charge. We chained ourselves to the trucks trying to deliver a drug company’s products. We liberally poured buckets of fake blood in public places. We closed the tunnels and bridges of New York and San Francisco. Our Catholic kids stormed St. Patrick’s at Sunday Mass and spit out Cardinal O’Connor’s host. We tossed the ashes from dead bodies from their urns onto the White House lawn. We draped a gigantic condom over [former Senator] Jesse Helms’s house. We infiltrated the floor of the New York Stock Exchange for the first time in its history so we could confetti the place with flyers urging the brokers to SELL WELLCOME. We boarded ourselves up inside Burroughs Wellcome (now named GlaxoSmithKline), which owns AZT, in Research Triangle so they had to blast us out. We had regular demonstrations, Die-Ins we called them, at the Food and Drug Administration and the National Institutes of Health, at city halls, at the White House, in the halls of Congress, at government buildings everywhere, starting with our first demonstration on Wall Street, where crowds of us lay flat on the ground with our arms crossed over our chests or holding cardboard tombstones until the cops had to cart us away by the vansfull. . . . There was no important meeting anywhere that we did not invade, interrupt, and infiltrate.
with prices around ten thousand dollars per patient annually. They were particularly incensed because, though AZT was new to the market, some of the research behind it came out of federally funded studies dating back to the 1960’s. ACT UP’s campaign was successful. Seventeen ACT UP members were arrested, and Wall Street had to push back the opening of the day’s trading, garnering huge publicity. The next month, ACT UP took advantage of the standard media coverage of last-minute tax filers on April 15 by staging a protest at the New York City general post office. The news crews came to film downto-the-wire filers, guaranteeing attention for ACT UP’s cause as well. It was at this protest that the motto “Silence = Death,” still associated with ACT UP, first appeared. In 1988, Cosmopolitan magazine published an arti-
cle about AIDS that implied the disease was virtually impossible to transmit via heterosexual sex, and ACT UP had another significant goal. The ensuing protest was organized by women involved with ACT UP, and the group staged protests outside the offices of the Hearst Corporation, Cosmopolitan’s parent company, leading to mainstream media coverage of the article’s inaccuracies. Cosmopolitan eventually issued a partial retraction.
Impact Besides having a very specific impact on the prices of AIDS drugs, which, though still quite high, have been lowered since the 1980’s, ACT UP represented a new kind of militant activism. It went beyond the civil disobedience tactics of its 1970’s forerunners in the gay liberation movement to incorporate a more sophisticated understanding of mainstream media practices, which were more than capable of blunting the efficacy of 1960’s and 1970’s style protests. The group’s militant approach, which embraced almost any action that would generate publicity for its cause, alienated some potential supporters, who did not always believe that the ends justified the means. However, ACT UP consistently received the news coverage it sought in the late 1980’s. As a result, the group not only spurred a new age of AIDS awareness but also spawned numerous splinter groups—both within the gay liberation movement and in other grassroots movements—that used similar tactics to achieve success. Further Reading
Cohen, Peter F. Love and Anger: Essays on AIDS, Activism, and Politics. New York: Haworth, 1998. Examines literary works surrounding AIDS activism and includes several scholarly fictional works alongside interviews with activists to broaden readers’ understanding of AIDS activism. Hubbard, Jim, and Sarah Schulman. ACT UP Oral
History Project. http://www.actuporalhistory.org/index1.html. 2003-present. Interviews with the surviving members of the original ACT UP/New York designed to encourage other activists by demonstrating effective tactics to bring about change. Kramer, Larry. Reports from the Holocaust: The Making of an AIDS Activist. New York: St. Martin's Press, 1989. Autobiographical reflections from the founder of ACT UP. Kramer's political activities prior to founding ACT UP were central in creating in him an activist who was willing to challenge popular notions. Shepard, Benjamin Heim, and Ronald Hayduk, eds. From ACT UP to the WTO: Urban Protest and Community Building in the Era of Globalization. New York: Verso, 2002. A study of militant activism including the tactics used by ACT UP's founders and discussions of some of the associated breakaway groups. Jessie Bishop Powell See also AIDS epidemic; AIDS Memorial Quilt; Health care in the United States; Homosexuality and gay rights; Reagan, Ronald; White, Ryan.
■ Action films

Definition: Large-budget movies featuring stunts, special effects, and action sequences
The soaring popularity of action films led Hollywood to invest considerable resources in developing new effects technologies and to produce films with more explicit and sensationalist representations of violence. This popularity contributed significantly to the studios’ growing emphasis on producing a few costly, potentially lucrative, spectacledriven blockbusters, rather than a greater variety of more modest, plot-driven films. The action-film genre has its roots in early Hollywood productions starring the likes of Douglas Fairbanks and Errol Flynn. However, it was the James Bond franchise in the 1960’s, known for extended action sequences, daring stunts, and big explosions, that finally defined the action movie. The American cinema of the 1970’s saw the birth of its own action movies in such films as the Dirty Harry series. It would be George Lucas’s vision of the action adventure in his 1977 film Star Wars that would help the ac-
tion genre grow from mass entertainment to mega-blockbuster status in the 1980's. Trendsetters, Stars, and Blockbusters Raiders of the Lost Ark (1981), the first film of the Indiana Jones series, helped take the action movie beyond the police genre. It opened up possibilities for action movies to look to more than just civil servants as heroes. First Blood (1982), which solidified Sylvester Stallone's career, personalized the war movies of the 1970's, representing the action hero as a unit of one. Mel Gibson made his name in the postapocalyptic Mad Max series (1979-1985). Gibson went on to play another pain-filled part in the Lethal Weapon franchise (1987-1998), which was structured safely within the confines of the police action genre. Bruce Willis also played a cop in 1988's Die Hard. This film evolved out of a decade of explosions and stunts and combined the police action drama with international terrorism to provide viewers with some of the most extreme stunts of the 1980's. Other action stars, such as Arnold Schwarzenegger, became staples of the genre. Perhaps the most crucial actor to help popularize action films, especially in the 1990's, Schwarzenegger starred in several small but successful action movies of the 1980's as well, including The Terminator (1984) and Predator (1987). The stunts of 1980's action movies did not exclusively involve explosions or guns; actors like Chuck Norris provided martial arts techniques that kept audience members on the edge of their seats. Franchises of the past also evolved during the 1980's. Lucas would complete his Star Wars trilogy with The Empire Strikes Back (1980) and Return of the Jedi (1983). He and Steven Spielberg would also bring back Indiana Jones for two more movies. All four movies minted Harrison Ford as the definitive action hero of the 1980's, as he starred in fully one-half of the ten most successful films of the decade. The James Bond franchise, another strong player in the action-film industry, produced five official movies and one unofficial movie; all were box-office successes. Action Films as a Cultural Mirror The popularity of action and violence in the United States during the 1980's can be traced to a few elements. First was the desensitization of the American public by media coverage of the Vietnam War in the 1970's. With a television in most living rooms, the American family could sit down and watch the horrors of war on their television.
Action hero Arnold Schwarzenegger in the 1987 Stephen King adaptation, The Running Man. (AP/Wide World Photos)
When the children of this generation reached their movie-going years in the 1980's, producers had to find impressive action and violence that this market had not seen in childhood. This created a one-upmanship attitude in Hollywood and led rival studios to produce bigger, bolder, and bloodier films to attract audiences. Unfortunately, in many cases the script writing suffered, whether from the need to make room for more action or from a lack of money for, or interest in, a decent script. Action movies of the 1980's became notorious for generic plots that were not bound by the rules of coherency. A second reason for the popularity of action in the 1980's was the rapid technological improvement in the film industry. Computers were becoming cheaper and more powerful, and with this development, computer-generated imaging (CGI) became a viable tool for the film industry. Futurist Syd Mead would later claim that when Tron was made in 1982, it required one-third of the total
computing power available in the United States. By the end of the decade, that power had grown exponentially year after year, until directors were no longer shackled to exploding models and the limits of the human body; through digital technology, larger explosions, better backdrops, and inhuman stunts could be performed. The members of a 1980's society already moving rapidly through technological barriers demanded more extreme and fantastic feats of action, and they were willing to subordinate realism to this goal. This trend in the action genre led budgets to soar and post-production time to escalate, resulting in the bigger and better pictures that audiences craved. These films also spoke to a generation that wanted an escape from reality. Relations with the Soviet Union were always a hot topic, even in the movies, and many action films' writers chose some part of the communist bloc to supply their films' antagonists. However, the decade was a time
of individualism, and the action genre mimicked this ideology with the production of heroes whose actions were not in service of their country or who took matters into their own hands. Notably, it was in the 1980’s that Americans finally rediscovered the female action star: When Sigourney Weaver returned to her role as Ripley in Aliens (1986), she was every bit as lethal as her male science-fiction counterparts. This role was another reflection of individualism in a decade of change for the movie industry. The extreme violence and unrealistic action that characterized 1980’s action films made them the top draw at the box office for the decade. Impact The action-film genre became extremely profitable in the 1980’s. Its increased popularity was due to better stunts, explosions, and special effects. Hollywood’s filmmakers and mass audiences alike seemed to abandon the plot-driven films of the midtwentieth century and to embrace instead movies whose primary pleasures were to be found in the spectacles displayed on the screen. Part of the appeal of spectacle-driven films was their ability to showcase exciting new technologies, and part lay in a rejection of classical realism, as audiences discovered the new types of hyperrealism made available by computer animation and special effects. Film’s ability to create realities on screen that were alien to the reality outside the theater defined the appeal of many of the decade’s most successful motion pictures. Further Reading
Gallagher, Mark. Action Figures: Men, Action Films, and Contemporary Adventure Narratives. New York: Palgrave Macmillan, 2006. Study of the representation of masculinity and heroism in action films. Bibliographic references and index. Julius, Marshall. Action! A-Z of Action Movies. London: Chrysalis Books, 1996. A listing of action movies, including all of those made in the 1980’s. Includes short descriptions and production information about each film. King, Neal. Heroes in Hard Times: Cop Movies in the U.S. Philadelphia, Pa.: Temple University Press, 1999. Examines trends of police-related action movies from 1980 to 1997. Analyzes common themes and stories and looks at the genre as a whole. Ross, Steven J. Movies and American Society. Oxford, England: Blackwell, 2002. Details trends of movies and their impact on American society.
Tasker, Yvonne, ed. Action and Adventure Cinema. New York: Routledge, 2004. Compilation of essays discussing the history of action films, from before 1910 through the early twenty-first century. Places the distinctive features of 1980’s action films in historical perspective. Daniel R. Vogel See also
Aliens; Blade Runner; Empire Strikes Back, The; Film in the United States; Ford, Harrison; Gibson, Mel; Martial arts; Raiders of the Lost Ark; RoboCop; Schwarzenegger, Arnold; Sequels; Special effects; Spielberg, Steven; Terminator, The; Weaver, Sigourney.
■ Adams, Bryan Identification Canadian pop rock singer Born November 5, 1959; Kingston, Ontario
Starting with his third album in 1983, Adams gathered a large fan base for his fast-paced melodies, as well as his ballads. Bryan Adams dropped out of school at the age of fifteen to pursue a career in rock music. He spent the latter half of the 1970's developing his style, and he started to send out demo tapes to labels like A&M Records in 1978. At that time, the Canadian music industry was on the verge of a change in direction that would benefit Adams's career enormously. Canada had already produced some compelling rock stars, and their success made it easier for others to succeed, as the nation more fully supported its own musicians. Thus, Adams's debut was well timed, as he released his first, eponymous, album in 1980. Though the album was not particularly successful, Adams gained a valuable professional connection with collaborator Jim Vallance, who headed the band Prism. Indeed, as the 1980's progressed, Adams would find himself working with Vallance frequently, both on his solo material and on songs for Prism. (One Vallance-Adams collaboration, 1982's "Don't Let Him Know," became Prism's only Billboard Top 40 hit and Adams's first taste of Top 40 success as a songwriter.) Adams's second album, You Want It You Got It, appeared in 1981, and his third, Cuts Like a Knife, came in 1983. Cuts Like a Knife represented Adams's real breakthrough, reaching number sixty-seven on the Billboard 200 album chart. The following year, he built
on that success and released Reckless, which reached number one on the chart and was certified quintuple platinum. The complete list of Adams’s 1980’s albums also includes Into the Fire (1987) and Live! Live! Live! (recorded in 1988 but released in the United States in 1994). He became an activist during this period as well, partly to combat the stereotypical rock-star image being assigned to him by critics. He donated his music to a number of charities, including Greenpeace. Impact While many rock critics considered Adams’s work unoriginal and generic, his music still appealed to a huge fan base in the United States and Canada. The success of Reckless and some of Adams’s other albums gave him the distinction of having the widest distribution of any Canadian rock act. Adams took advantage of the youth-oriented 1980’s with his rock anthems and ballads. Most of his songs focused on young love in its various stages and incarnations. Further Reading
Betts, Raymond. A History of Popular Culture: More of Everything, Faster and Brighter. New York: Routledge, 2004. Blythe, Daniel. The Encyclopedia of Classic 80’s Pop. London: Allison & Busby, 2003. Saidman, Sorelle. Bryan Adams: Everything He Does. Toronto: Random House, 1993. Thompson, Graham. American Culture in the 1980’s. Edinburgh: Edinburgh University Press, 2007. Jessie Bishop Powell See also
Music; Music videos; Pop music.
■ Advertising Definition
Public promotion of goods and
services
The 1980's witnessed the creation of potentially lucrative new consumer demographics and the simultaneous fragmentation of both the media landscape and consumers' attention spans. As a result, advertisers had to reconceive and reinvent both their target audiences and their methods of reaching those audiences.
The 1980's was a decade obsessed with self-advancement, exploration, and improvement. It was a decade rife with technological advancements in the workplace and in the home and one that saw deregulation and corporate mergers change the industrial and media landscapes. As a result of President Ronald Reagan's "trickle-down" economic theory, as well as the deregulation of the banking industry and the stock market, many Americans—especially those between twenty-five and thirty-five—enjoyed high-paying jobs that provided them with new levels of disposable income. The decade's higher disposable incomes combined with the redistribution of wealth to younger professionals to change not only the daily lives of wealthier Americans but also the strategies of advertisers seeking to reach potentially lucrative audiences.
Shifts in Demographics
In advertising, the fundamental questions every advertiser or agency must ask are these: Who are the intended recipients, or demographic, of the ad? And what vehicle will be employed to enable the ad to reach that demographic? In general, the key demographics are groups most likely to need or desire the product and groups who can be convinced to desire the product. The most desirable demographics are those people who not only make purchasing decisions for themselves but also influence the decisions of other consumers, such as members of their household or peers who wish to emulate them. The 1980's saw a great shift in wealth, redefining traditional advertising demographics. Traditionally, Americans gained wealth with age, as they rose through the ranks to occupy more senior positions in their chosen fields. By 1984, however, 23 percent of America's disposable income belonged to ambitious, young, upwardly mobile professionals, the so-called yuppies. This segment of society found itself with high-paying jobs relatively early in life as a result of deregulation and mergers, and it was obsessed both with workplace advancement and with personal improvement. Unlike the self-improvement trends of the 1970's, however, the self-improvement goals of 1980's yuppies were often superficial, consisting largely of purchasing products that functioned as status symbols to their peers. Advertisers were all too glad to help this demographic achieve such goals. It did not take long for Madison Avenue (the traditional home of the major U.S. advertising firms) to recognize, embrace, and start to define the new class of wealthy, young, status-conscious consumers as a highly prized demographic.
Memorable 1980's Advertising Slogans and Jingles (Product: Slogan or Jingle)
Apple Macintosh computer: On January 24th, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like Nineteen Eighty-Four.
Ball Park Franks: They plump when you cook 'em.
Band-Aids: I am stuck on Band-Aids, 'cause Band-Aid's stuck on me.
Barbie dolls: We girls can do anything.
Bartles & Jaymes wine coolers: And thank you for your support.
Bounty paper towels: The thicker, quicker picker-upper.
Budweiser Light beer: Bring out your best.
Buick automobiles: The great American road belongs to Buick.
Calvin Klein jeans: You know what comes between me and my Calvins? Nothing! (spoken by fifteen-year-old actor Brooke Shields)
Chrysler automobiles: If you can find a better car, buy it! (spoken by Lee Iacocca, chief executive officer of Chrysler)
Coca-Cola: Coke is it. Coke is the real thing. Catch the wave. Coke. You can't beat the feeling.
Colt 45: It works every time.
Coors Light beer: Coors Light. It's the right beer now.
De Lorean automobiles: Live the dream.
Delta Airlines: We love to fly, and it shows.
Excedrin pain reliever: I've got an Excedrin headache this big!
Ford Motor Company: Have you driven a Ford lately?
Jolt Cola: All the sugar and twice the caffeine.
Kentucky Fried Chicken: We do chicken right.
Kit Kat candy bars: Gimme a break, gimme a break. Break me off a piece of that Kit Kat bar!
McDonald's restaurants: It's a good time for the great taste of McDonald's.
Miller beer: Made the American way.
National Dairy Council: Milk. It does a body good.
Nike sports apparel and equipment: Just do it!
Partnership for a Drug-Free America: (The speaker holds up an egg.) This is your brain. (He picks up a frying pan.) This is drugs. (He cracks open the egg and fries it.) This is your brain on drugs. Any questions?
Pepsi-Cola: Pepsi's got your taste for life. The choice of a new generation.
Plymouth automobiles: The pride is back. Born in America.
Pontiac automobiles: We build excitement.
Remington Microscreen shaver: I liked the shaver so much, I bought the company. (spoken by Victor Kiam, who purchased Remington in 1979)
Seagram's Golden Wine Coolers: It's wet and it's dry. (sung by actor Bruce Willis)
Smith Barney financial services: They make money the old-fashioned way. They earn it. (spoken by actor John Houseman)
Toyota Corolla automobile: Oh what a feeling!
Trident sugarfree gum: Four out of five dentists surveyed recommend sugarless gum for their patients who chew gum.
Wendy's restaurants: Where's the beef? (spoken by actor Clara Peller)
Advertisements, which had always featured attractive people, began to portray their models as successful professionals as well. Marketing departments realized that if they could turn a previously mundane item, like a pair of jeans, into a status symbol, they could sell more units at a significantly higher price. The clear message of many advertisements therefore became: Use this product and become strong, powerful, accepted, and popular. Yuppies responded by investing their disposable cash in high-priced designer jeans, colognes, alcohol, and cars. Advertisers of the 1980's relied on consumers' subjective feelings to ensure sales. Ads encouraged the belief that people could feel powerful, accepted, admired, and confident about themselves and their positions in society if only they made the right consumer choices. To help achieve such subjective emotional responses, television commercials adopted the same frenetic editing style that had been pioneered by music videos shown on MTV and by heavily edited news programs that were cut into many brief segments. This editing style helped define the new generation's aesthetic. Commercials thus came to resemble music videos, as they became shorter but more plentiful. This resemblance was especially noticeable in advertisements for products that appealed to young demographics, such as those for Shasta Cola and other sodas, Doritos and other snack foods, and Sergio Valente jeans and other designer clothing. These ads addressed and captured the nation's decreased attention span by using catchy pop songs, vibrant visuals, and quick edits. The length of ads also changed to accommodate the
nation’s decreasing attention span, as America witnessed the birth of the fifteen-second commercial in 1986. Technological Advances The 1980’s witnessed a surge of technological advances that posed a unique set of challenges for advertisers to overcome. With a decline in the popularity of newspapers, advertisers began to look to other vehicles to distribute their messages, redirecting the large sums that had traditionally been spent on local newspaper advertisements. Meanwhile, equipment changes and new media choices in radio threatened that medium’s historic ability to provide advertisers with a captive audience of drivers in their cars during morning and evening commutes. Finally, changes in programming, delivery methods, and hardware threatened the future of television advertising. Advertisers could no longer rely on a limited number of viewing choices and consumer loyalty to network lineups to guarantee them viewers. Technology and industry changes offered consumers new tools through which to experience media and necessitated equally drastic changes to advertisers’ methods if ads were to continue to reach consumers. In 1950, about 125 percent of households purchased newspapers. That is, for every household in the United States, 1.25 copies of a daily newspaper were purchased. Newspapers thus offered advertisers, especially local advertisers, a welcome seat at the breakfast table every morning. By 1970, the ratio had fallen to about 100 percent, and over the next two decades circulation rates remained relatively
constant at about 60 million as the nation's population exploded, growing from more than 60 million households to more than 90 million. As a result, by 1990, the percentage of households that purchased a daily newspaper had fallen to a mere 68 percent. Americans were no longer receiving their news through newspapers, but rather through morning talk and news shows. This development forced advertisers to invest less in print ads and to increase their television budgets. Commercials mimicked the format of news shows, similarly presenting information in brief segments or catchphrases, rather than scripted thirty- or sixty-second ads. Local advertisers that could not afford television airtime or whose messages lent themselves particularly to print media invested in direct-mail circulars and inserts and advertised in local or suburban editions of newspapers. Technological advances in radio did not bode well for advertising either. With the increasing popularity of cassettes and, later, compact discs, commuters gained more listening options, and they often preferred their own selection of commercial-free music over radio's repetitive play lists and commercial breaks. Even those drivers who continued to listen to radio gained an increased ability to switch channels at the touch of a button, as car stereos were produced with "seek" or "scan" buttons. These buttons allowed drivers to seek the next available station automatically, rather than be relegated to the few stations assigned to their stereos' preset buttons. These factors diminished commercials' listenership and the opportunities for advertisers to reach their demographic locally. As a result, advertisers began to purchase airtime through national radio networks, allowing their messages to be heard multiple times across all of those networks' affiliates (even if some affiliates were in regions that did not carry the product). The quality of local radio commercials began to suffer. Finally, technological changes also affected the way in which people viewed television, the most lucrative advertising medium. The proliferating viewing options available to 1980's Americans presented incredible hurdles for advertisers. With the traditional Big Three television networks (ABC, CBS, and NBC), advertisers basically had captive audiences. Viewers had relatively little choice of what to watch, and once they decided on a channel, they often watched that network's entire prime-time lineup
(commercials and all), because to change the channel required physically getting up and rotating a dial on the television set. The 1980's, however, introduced two new devices that threatened the future of television advertising: the remote control and the videocassette recorder, or VCR. With the remote control, people were able to channel surf, changing channels every time a commercial aired (or a program failed to retain their interest) with the click of a button. The increase in VCR ownership in America, moreover, presented three new predicaments for advertisers. First, and most threatening, people could record their favorite shows and fast-forward through the commercials. Second, VCRs undermined carefully planned commercial placement and time-sensitive commercials, since a taped program might be watched days after it aired. Finally, VCRs offered viewers expanded choices in entertainment: Americans no longer had to settle for what was on television; they could rent or purchase movies from the corner video store and completely avoid commercials. In addition to technological changes, television's very landscape was expanding with the advent of cable television and the previously unthinkable emergence of a fourth, and eventually fifth and sixth, broadcast network. Cable, whose subscription rate increased from 28 percent in 1980 to 60 percent by 1990, suddenly offered consumers not only multiple channels but also specialized channels. The concept of "narrowcasting" (that is, broadcasting to a relatively circumscribed, specific audience) took on new importance in the 1980's, when advertisers realized that reaching 100,000 affluent, likely consumers of a product was more valuable than reaching millions of viewers in the wrong demographic. Cable channels with specialized programming therefore developed in tandem with more specialized, targeted commercials. Between the multitude of cable channels and the ease of the remote control, however, a person could watch television for twenty-four hours and never view a commercial. When FOX emerged as a fourth network, and with the later advent of the WB and UPN, even non-cable subscribers gained more viewing options, depleting each network's shares and ratings. Equally important, the new networks took advantage of the narrowcasting model: They siphoned specific, desirable demographics away from the traditional networks with shows that appealed to young viewers, such as 21 Jump Street, Married . . . with
Children, and The Simpsons. To accommodate each network’s decreased viewership, advertisers were allowed to purchase more minutes per hour during prime time as a result of a relaxation in Federal Communications Commission (FCC) rules. The FCC had recommended that no more than twelve minutes of commercials be broadcast per hour during weekdays and no more than ten on weekends. Deregulation and Corporate Mergers
Under the Reagan administration, America experienced an increased distrust of so-called big government, and the country underwent a massive surge of deregulation, especially in the banking and stock industries. Along with deregulation, the country witnessed an increase in corporate mergers. Corporations purchased other corporations, broke them into smaller companies, and sold them off for huge profits. This practice created a feeling of great prosperity among those fortunate enough to have a share in the profits, as there seemed to be no end to the rising stock market, foreign investments, and defense spending. These mergers created new, high-paying jobs in banking, corporate law, and stocks, even as they eliminated jobs within the merged companies. The new jobs attracted recent college graduates and helped create a class of highly paid, ambitious young people. Advertising agencies were not immune to these mergers, which proved beneficial to advertisers. As smaller boutique agencies began to be bought up by larger agencies with more resources, advertisers found they could rely on one agency for a comprehensive advertising campaign, including broadcast time, print media, and outdoor (billboard) space. These mega-agencies provided a "one-stop" shopping experience for the advertiser and created a more uniform, professional advertising campaign. Agencies exploited the distrust of pervasive government by creating ads that claimed to empower the individual and that spoke to the yuppie demographic. The most famous ad of this genre was a 1984 Apple Computer commercial that ran only once, during the Super Bowl. The ad featured a young woman running down a bleak, gray hallway with a sledgehammer. At the end of the hallway was a room filled with people working on PCs and mindlessly watching a big screen, as if hypnotized. The woman hurled the sledgehammer at the screen and freed the mindless worker drones and, in doing so,
freed Americans from their slavery to PCs and Big Brother (a reference to the totalitarian government portrayed in George Orwell’s 1949 novel Nineteen Eighty-Four). Other ads—such as Wendy’s “Where’s the Beef?” campaign, in which an elderly woman is dissatisfied with her puny hamburger, and Wendy’s Iron Curtain campaign, which featured a Soviet fashion show in which every “model” was unattractive, grim, and identically dressed—portrayed Wendy’s consumers as empowered by a fast-food chain that provided them with the choices and quality they deserved and introduced catchphrases that overtook the country. FTC and FCC Policy Changes
Under the Jimmy Carter administration of the late 1970's, the Federal Trade Commission (FTC) created strict guidelines that regulated what claims advertisers could make and how they could present their products, especially to children. In the 1980's, President Reagan appointed James C. Miller III to head the commission, and under Miller the FTC's staff was cut in half and attention to advertising decreased. The FCC, the commission in charge of regulating mass media, also adopted a hands-off policy under the chairmanship of Mark Fowler, allowing free enterprise to reign and advertisers to police themselves. In the 1970's, the FCC also had encouraged self-regulation, but it had threatened to impose regulations of its own if broadcasters did not do a sufficient job themselves. The National Association of Broadcasters was set up to create rules and ensure that they were followed. The 1980's saw a reversal in those rules, revoking the previous twelve-minute and ten-minute caps on television commercials per hour. The most drastic change arguably was Fowler's creation of the "children's-program-length commercial," that is, his decision to allow networks to air entire cartoon programs featuring characters who were also available as toys and other products. These characters included He-Man and the Masters of the Universe, the members of G.I. Joe, and the ThunderCats. Suddenly, toy manufacturers had entire thirty- and sixty-minute programs with which to advertise to children every day, although they could not place advertisements for a cartoon's products in commercial time during the program. The FCC under Fowler also loosened the restrictions on underwriting for public television. This relaxation of underwriting rules offered corporations
a new advertising avenue and the ability to reach a desired demographic to which they previously lacked direct access. On average, consumers who watched public television were highly educated and earned high household incomes. They were also a more difficult demographic to reach, since they watched less commercial television. Underwriting public television programming provided a much-sought-after way for advertisers to deliver their messages to this demographic. The Reagan administration promoted less government interference with broadcasting and the arts and decreased federal funding of public broadcasting. This decrease caused public broadcasters gladly to accept generous corporate sponsorships. In 1973, public broadcasters received approximately 3.3 percent of their funding from private sources; by 1991, 16 percent of their funding came from corporate underwriting. Finally, since cable television did not broadcast over the public airwaves, cable channels were immune to many of the advertising rules and restrictions imposed by the FTC and the FCC, including the restrictions on commercial minutes per hour. Cable television therefore witnessed an explosion of infomercials, or program-length commercials, which promoted products for whatever amount of time the advertiser could afford and the channel had to offer. Suddenly, people could watch hourlong infomercials that were not held accountable to broadcast-advertising guidelines. By virtue of narrowcasting, cable was also able to fragment its audience into valuable niche demographics, supplying special-interest programming to viewers and creating specialized audiences for specific types of advertisers. Such a strategy previously had been available only through magazines. New Advertising Outlets As Americans in the 1980's acquired more disposable income and technologies in the home, advertisers had to find new avenues through which to deliver their message. As consumers discovered their new entertainment options, they began listening to Walkmans, watching videotapes, and spending more time and money on recreation rather than watching television. Unfortunately for advertisers, the consumers whose financial resources made it possible to replace television with other luxuries were precisely the consumers that the advertisers most wanted to reach. Realizing the problem, advertisers began focusing more on
sponsorships of events, rather than relying primarily on television commercials. Sporting events, rock festivals and tours, and art exhibits became venues for corporate messages. Sporting-event sponsorships proved most valuable to advertisers: Sports audiences were loyal, and sponsorships at the events themselves allowed advertisers to reach the fans they had begun to lose to the remote control, which allowed sports fans to watch events on television all day without ever viewing a commercial. Advertisers began pouring huge sums into sponsorship of sporting events and other activities surrounding professional sports. Budweiser created the Bud Bowl; logos for Nike and other companies began appearing on team jerseys; tobacco companies cleverly appeared on television, circumventing the ban on cigarette commercials, by sponsoring NASCAR drivers. Every inch of playing fields and stadiums was suddenly for sale, and there were plenty of advertisers ready to buy. Corporations and companies also began sponsoring music and art events, which enabled them not only to deliver their messages but also to replace government sponsorship of the arts, allowing arts programs to continue despite federal budget cuts. As a result of the 1980's distrust of big government, some Americans did not mind corporations, rather than the National Endowment for the Arts, sponsoring cultural events. Impact The 1980's witnessed a pervasive apparent empowerment of the consumer. Consumers had more entertainment choices, new hardware to help them enjoy their entertainment, and more disposable income with which to entertain themselves. Advertisers in the 1980's had to meet and overcome the challenges posed by the expansion of entertainment options first by redefining their demographics and then by changing their practices in response to these newly defined target audiences. Changes in FCC and FTC regulations, along with an increase in media options, provided advertisers with more ways to spread their message, even as they prevented them from relying on any single medium or method. The rise of narrowcasting made possible by cable television led to an explosion by the end of the decade in both niche marketing and the use of targeted advertising to reach extremely lucrative demographics with greater efficiency. The 1980's continued a trend long established in
American advertising of selling a lifestyle rather than a specific product. The lifestyles sold were distinctive of the decade, emphasizing luxury items aimed specifically at youthful professionals. There was also an increased recognition by advertisers that form was at least as important as content in ads, which led them, for example, to incorporate youth-oriented aesthetic styles—especially MTV-inspired editing—in advertisements designed to reach younger consumers.
Further Reading
Berger, Arthur Asa. Ads, Fads, and Consumer Culture: Advertising's Impact on American Character and Society. Lanham, Md.: Rowman & Littlefield, 2000. Explores the importance of advertising with respect to the economy, industry, society, and the individual. Cross, Mary. A Century of American Icons. Westport, Conn.: Greenwood Press, 2002. A look at America's relationship with iconic advertising imagery. Gold, Philip. Advertising, Politics, and American Culture: From Salesmanship to Therapy. New York: Paragon House, 1986. Argues that advertising is instinctual in American society and that its pervasive and manipulative aspects are a basic part of the American communication process. Laird, Pamela Walker. Advertising Progress: American Business and the Rise of Consumer Marketing. Baltimore: Johns Hopkins University Press, 1998. Explores the effects that the modernization of industry and advertising have had on American society. McAllister, Matthew P. The Commercialization of American Culture: New Advertising, Control, and Democracy. Thousand Oaks, Calif.: Sage, 1996. An in-depth look into the factors that motivated advertisers internally and externally throughout the 1980's and surrounding decades. Mierau, Christina. Accept No Substitutes: The History of American Advertising. Minneapolis, Minn.: Lerner, 2000. Aimed at young readers, Mierau's book puts advertising's purpose and power into terms that middle- and high-school-aged readers can understand. Sivulka, Juliann. Soup, Sex, and Cigarettes: A Cultural History of American Advertising. Belmont, Calif.: Wadsworth, 1998. A comprehensive account of the history of advertising in America. Examines the relationship between marketing and American society decade by decade, with separate chapters for each era. Strasser, Susan. Satisfaction Guaranteed: The Making of the American Mass Market. New York: Pantheon Books, 1989. A complete history of mass marketing and the ways in which modern advertising has transformed America from a land of independent shopkeepers to one of mass corporations and consumers. Sara Vidar
See also
Apple Computer; Business and the economy in Canada; Business and the economy in the United States; Cable television; Consumerism; Demographics of Canada; Demographics of the United States; FOX network; Home video rentals; Infomercials; MTV; Reaganomics; Slang and slogans; Television; Yuppies.
■ Aerobics Definition
Form of strenuous exercise designed temporarily to increase one’s respiration and heart rate
Aerobic exercise, which is designed primarily to condition the respiratory system and muscles generally, can take many different forms as long as the exerciser keeps moving. It thus lent itself to more enjoyable exercise regimens based on dance, coaxing many previously sedentary people to take up exercise and generating significant changes in the fitness industry during the 1980’s. Aerobics was part of a larger fitness movement during the 1980’s that fueled a dramatic increase in the number of fitness clubs and club memberships in the United States. The physical-training concepts used in aerobics originated in 1968 with the publication of physician Ken Cooper’s book Aerobics, which explained how to improve one’s cardiovascular health through regular prolonged physical exertion. Aerobics became a best seller and inspired a number of exercise instructors with dance backgrounds to integrate Cooper’s principles of cardiovascular training into choreographed routines to create a new form of exercise, aerobic dance, which was later shortened to aerobics. The two women credited with starting the aerobic dance movement were Judi Sheppard Missett, founder of Jazzercise, and Jacki Sorensen, founder of
Aerobic Dancing, one of the first fitness instructors to publish a book of aerobics routines. Missett and Sorensen had worked steadily during the 1970's teaching classes and popularizing aerobics, but it was actress Jane Fonda who made aerobics the exercise trend of the 1980's. Fonda's 1981 book, Jane Fonda's Workout Book, was number one on the New York Times best-seller list for nearly a year. An enormously popular series of exercise videotapes followed, at the rate of one a year, for the remainder of the decade. While Fonda was the best known of the decade's celebrity fitness experts, many other stars—including Raquel Welch, John Travolta, Linda Evans, Victoria Principal, Jayne Kennedy, Marie Osmond, and Debbie Reynolds—also took advantage of the interest in fitness to publish books, videos, or both. In a reversal of the trend, some fitness instructors became stars. Peppy exercise guru Richard Simmons became famous, as he encouraged overweight Americans to eat less and exercise more on The Richard Simmons Show. Changes in the Fitness Industry
The explosion in fitness clubs helped expand the popularity of aerobics. In the past, group fitness classes in gyms had
taken second place to weights and machines. As aerobics entered gyms, it made exercise classes an important source of revenue and drew significant numbers of women into gyms, many of them exercising for the first time. By 1986, 24 million Americans were doing aerobics, 90 percent of them women. Jane Fonda's mantra, "Feel the burn," was evidence of the trend toward intense workouts seen in 1980's exercise. Aerobics classes often featured jumping, running in place, high kicks, and other potentially damaging moves. Many participants suffered injuries from repetitive stress on the feet, knees, and back. One study found that more than 75 percent of instructors and 43 percent of participants had exercise-induced injuries. As a result, low-impact aerobics classes (in which one foot was kept on the floor at all times) and classes incorporating elements from other types of exercise, such as martial arts and yoga, were developed toward the end of the decade. The development of the aerobic step, launched in 1989 by Reebok, was part of the trend toward vigorous but safe workouts. The large number of injuries was also related to a lack of training and certification among fitness instructors, and especially among celebrities portraying themselves in videos as fitness instructors. To teach aerobics in the early 1980's, one needed little more than a shapely body and an upbeat personality. By the end of the decade, a number of professional fitness organizations, such as the American Council on Exercise and the Aerobics and Fitness Association of America, had formed, calling for standards and certification measures.
Actress Jane Fonda, who helped popularize aerobics, works out in her Beverly Hills exercise salon in 1979. (AP/Wide World Photos)
Cultural Influences of the Fitness Movement The widespread popularity of aerobics, as well as the larger fitness movement, greatly influenced popular culture. The physical ideal of the decade called for fat-free bodies with visible musculature, for both men and women. In fashion, trendy workout clothing became suitable street wear. Leg warmers, large T-shirts worn off the
shoulder, sweatbands, and Reebok’s Freestyle—the first shoe designed specifically for aerobics—all became part of a popular look that referenced exercise culture. Films such as Personal Best (1982), Flashdance (1983), The Toxic Avenger (1985), and Perfect (1985), which included dance or exercise sequences or took place in gyms, reflected the decade’s preoccupation with exercise and fit bodies. Impact The proliferation of aerobics during the 1980’s added new momentum to a fitness movement that had been growing for a decade. Because it involved general movement, which could be made fun by turning it into dance, aerobics became an exercise method of choice for those who did not want to deal with the more laborious weight-training approach, which required isolating specific muscle groups. Moreover, because the only equipment required was a video, or even just music, aerobics was much cheaper and easier to do at home than was weight training. As an activity that appealed primarily to women, aerobics helped bring about gender parity in health clubs and contributed to a new physical ideal that made it acceptable for women to sweat and develop muscle tone. The demand for qualified aerobics instruction transformed the fitness industry with the creation of new professional organizations that introduced certification standards for aerobics, as well as other fitness activities. Further Reading
Glassner, Barry. Bodies: Why We Look the Way We Do (And How We Feel About It). New York: G. P. Putnam, 1988. A critical look at the fitness movement of which aerobics was a part. Argues that increased attention to one’s body does not result in improved quality of life. Kagan, Elizabeth, and Margaret Morse. “The Body Electronic: Aerobic Exercise on Video.” TDR: The Drama Review 32, no. 4 (1988): 164-179. A critical, feminist analysis of aerobics videos with an emphasis on Jane Fonda’s videos. Leepson, M. “Physical Fitness: Has the Fitness Boom of the 1970’s and 1980’s Run Out of Steam?” CQ Researcher 2, no. 41 (1992): 953-976. Factual overview of 1980’s fitness trends with statistics and time lines. Luciano, Lynne. “Binging and Buffing Up.” In Looking Good: Male Body Image in Modern America. New York: Hill and Wang, 2001. Provides an overview of men’s fitness concerns during the decade.
Rader, Benjamin G. "The Quest for Self-Sufficiency and the New Strenuosity: Reflections on the Strenuous Life of the 1970's and the 1980's." Journal of Sport History 18, no. 2 (Summer, 1991): 255-266. Views interest in exercise as part of a middle-class quest for success and improved appearance. Sabol, Blair. The Body of America. New York: Arbor House, 1986. A first-person account of fitness trends of the 1980's. Seid, Roberta Pollack. "Obsession Becomes Religion: The Fitness Epidemic." In Never Too Thin: Why Women Are at War with Their Bodies. New York: Prentice Hall, 1989. A critical overview of health and fitness trends during the 1980's. Shelly McKenzie See also Diets; Fashions and clothing; Flashdance; Leg warmers; Martial arts; Simmons, Richard.
■ Affirmative action Definition
Programs in employment and education that attempt to increase participation of underrepresented minorities and women
During the 1980's, employers, government agencies, and competitive universities expanded the number of affirmative action programs designed to benefit members of groups that had historically been victims of discrimination. Affirmative action programs first appeared on a large scale in the 1970's. Because these programs usually included either numerical quotas or limited preferences, critics charged that they constituted "reverse discrimination" against white males. Legal challenges eventually reached the U.S. Supreme Court, which was called upon to decide whether the programs violated the Fourteenth Amendment's equal protection requirement or the Civil Rights Act of 1964. In Regents of the University of California v. Bakke (1978), the Court allowed admissions policies of competitive graduate schools to include limited preferences for disadvantaged minorities, while disallowing absolute quotas. In United Steelworkers v. Weber (1979), the Court permitted employers to institute some quotas to "eliminate manifest racial imbalance in traditionally segregated job categories." Expansion of Programs In the case of Fullilove v. Klutznick (1980), the Supreme Court for the first
time approved an affirmative action program containing a racial quota. The issue was the constitutionality of a provision in the federal Public Works Employment Act of 1977 that required that 10 percent of all public works grants by the Department of Commerce be awarded to minority business enterprises. The Court upheld the statute and rejected a white contractor's claim that the law violated the Fourteenth Amendment. The decision effectively authorized Congress to exercise broad discretion to legislate racial preferences based on the principle of proportional representation for racial groups. The Supreme Court almost always approved of quotas ordered as remedies for proven instances of illegal discrimination. In Local 28 Sheet Metal Workers International v. Equal Employment Opportunity Commission (1986), the Court upheld a lower court's imposition of a 29 percent membership quota on a union found guilty of racial discrimination. Likewise, in United States v. Paradise (1987), based on a finding of egregious discrimination, the Court upheld the constitutionality of a lower court's order that 50 percent of promotions in the Alabama state police be awarded to African Americans until their representation among officers corresponded to their percentage of the population. The threat of legal suits based on statistical disparities induced employers to institute preferential policies, but such policies exposed employers to claims of reverse discrimination. In Johnson v. Santa Clara County (1987), the Court examined an instance in which a white woman had been promoted over a white male with slightly higher qualifications. Going beyond the idea of affirmative action as a remedy for illegal discrimination, the Court held that preference was justified by the existence of a "manifest imbalance" in the numbers of women "in traditionally segregated job categories." The decision presented employers with a green light to continue their self-instituted preference programs. Limits on Preferences Although endorsing most race-conscious programs until late in the 1980's, the Supreme Court always recognized some limits on how far the programs might proceed, particularly when the issue was labor contracts that instituted seniority systems. In Firefighters Local Union v. Stotts (1984), the Court ruled that the lower courts did not have the authority to overturn seniority agreements in order to protect the jobs of recently hired black
workers. Likewise, in Wygant v. Board of Education (1986), the Court struck down an affirmative action program that protected minority teachers from layoff by requiring the layoff of more senior white teachers. In a 5-4 decision, the majority concluded that the program violated the principles of equal protection under the Fourteenth Amendment. President Ronald Reagan’s administration attempted to stop the spread of affirmative action programs. Although President George H. W. Bush, elected in 1988, was more moderate, he opposed aggressive programs. While their administrations sought remedies for victims of invidious discrimination, they took a stand against all quotas, as well as most racial preferences. Among their appointments to the Supreme Court, David Souter was the only liberal justice on affirmative action issues, but Associate Justice Sandra Day O’Connor would eventually endorse some limited preferences if they included individual assessments of qualifications. In 1989, the Supreme Court issued two rulings that dismayed proponents of racial preferences. Overturning a city’s mandate for a 30 percent set-aside for minority contractors in Richmond v. J. A. Croson Co., the Court held that the plan violated the constitutional rights of white contractors. Applying “strict scrutiny” review for the first time to an affirmative action program, the decision required that set-asides be justified by a showing of past discrimination. In Ward’s Cove Packing Co. v. Atonio, the Court reversed an earlier decision and required plaintiffs to assume the burden of proof in disparate-impact cases, or claims that unnecessary employment qualifications disproportionately harmed the opportunities of minorities. This decision produced a great political debate, until it was largely undone by the Civil Rights Act of 1991. Impact During the decade of the 1980’s, affirmative action programs became more widespread than at any time before or after. Preferences were common, and some educational and employment opportunities were available only to members of minority groups and women. Given the resentments of many white males, the emergence of a strong backlash was not surprising. Beginning in 1989, the Supreme Court began to show more consideration for claims of reverse discrimination. Although the Court would move in a zig-zag course on preferences during the 1990’s, the general direction was toward
increasing limitations. During the first decade of the twenty-first century, nevertheless, affirmative action programs continued to be a part of American culture, and it appeared that controversies about their fairness would continue for many decades into the future. Further Reading
Anderson, Terry H. The Pursuit of Fairness: A History of Affirmative Action. New York: Oxford University Press, 2004. An excellent and balanced historical account from the beginning of racial and gender preferences in the 1970's until the early twenty-first century. Eastland, Terry. Ending Affirmative Action: The Case for Colorblind Justice. New York: Perseus, 1997. In addition to a historical summary, Eastland argues the case for ending all preferences based on race or gender. Kranz, Rachel. Affirmative Action. New York: Facts on File, 2002. This introductory research guide for students summarizes the political debate and includes a historical overview, as well as a summary of important legal cases. Leiter, Samuel, and William M. Leiter. Affirmative Action in Antidiscrimination Law and Policy: An Overview and Synthesis. Albany: State University of New York Press, 2002. A comprehensive study of various programs, including their origin, growth, impact, and future prospects. Spann, Girardeau. Law of Affirmative Action: Twenty-Five Years of Supreme Court Decisions on Race and Remedies. New York: New York University Press, 2000. A comprehensive chronicle of the Court's rulings from the 1970's until the end of the twentieth century. Weiss, Robert. We Want Jobs: A History of Affirmative Action. New York: Routledge, 1997. An account of the change in the civil rights movement from a demand for equal opportunity to an emphasis on statistical goals and timetables, resulting in a white backlash. Thomas Tandy Lewis See also African Americans; Latinos; Native Americans; O'Connor, Sandra Day; Racial discrimination; Reagan Revolution; Rehnquist, William H.; Supreme Court decisions.
■ Africa and the United States Definition
The state of affairs between the United States and the countries of Africa
In the 1980's, the more humane, liberal policies of the Carter administration gave way to the more aggressive, conservative ones of the Reagan years. George H. W. Bush ended the decade with little or no change to Ronald Reagan's policies, leaving much of Africa confused about the commitment of the United States to the continent's plights and concerns. Relations between Africa and the United States were contentious as President Ronald Reagan swept his way into the White House with a strong social conservative program. Fresh in the memory of most Africans were the smoldering remains of a short-lived era of high hopes for greater U.S. assistance under the liberal administration of President Jimmy Carter. The Reagan Administration In the face of a world of increasing threats to Western democracy, a surge in conservatism in the United States with roots in the Richard M. Nixon administration in the 1970's found its full expression in the Reagan administration of the 1980's. As the decade began, Americans not only were troubled by the ever-present danger of nuclear annihilation by the Soviet Union but also became aggravated by the rise of fundamentalism in the Middle East. The fundamentalist revolution that brought the Ayatollah Khomeini regime to power in Iran was particularly threatening and humiliating to the American psyche, as the revolutionaries seized the U.S. embassy in Tehran and held American diplomats as hostages for 444 days. President Carter's fumbled efforts to free the hostages had added to the state of despair. It came as no surprise that Reagan, with his message of restoring America's pride through the infusion of patriotism and strong fiscal policies, resonated with most Americans' longing to rekindle the country's lost sense of greatness. The Nixon administration had paved the way for tougher foreign policy by emphasizing realpolitik and détente, while at the domestic level it maintained wage and price controls. The Reagan administration took these policies to new heights by emphasizing stronger measures in the Cold War containment policies of an earlier era.
Famine in Ethiopia threatened millions of lives in the 1980’s, and media images of the famine shaped American attitudes toward the entire African continent during the decade. (AP/Wide World Photos)
In South Africa, where the moral justifications for those policies had become too controversial, the administration substituted a policy of "constructive engagement" with the apartheid regime. Elsewhere on the continent, the Reagan administration continued to embrace policies motivated by strong national interest: selecting and favoring African leaders or countries based largely upon their commitment to the renunciation of communist ideals. That most of these leaders led draconian governments that brutalized their populations—a strong contradiction of the larger democratic ideals that the United States professed—made little or no difference.
Africa’s ability to fashion a foreign policy toward the United
States has always been determined by its ability to deal with some of its intractable domestic problems. Africa's problems in the 1980's became compounded by the spread of acquired immunodeficiency syndrome (AIDS), the impasse over burgeoning debts owed to foreign lenders, and the rise of dictatorships or authoritarian regimes masquerading as democracies. Furthermore, Africa's fragmented loyalties to its various colonial authorities complicated its ability to respond to any undesirable foreign policy; indeed, this fragmentation made it nearly impossible for Africans to agree on what an undesirable policy was. From most accounts, the three issues that linked Africa to the United States were the apartheid policies of the South African regime, the attitude toward the overwhelming effects of AIDS, and, perhaps most important, the debt crisis that was strangling most governments of the continent. Given the commitment to its social conservative position, the U.S. stance on apartheid in South Africa under the Reagan administration was almost predictable. As it appeared that the apartheid regime of South African president Pieter W. Botha was resistant to communist ideology, it gained support from the Reagan administration. To make matters worse, the embrace of communism by the neighboring Angolan government founded by President Agostinho Neto provided more justification for urgent U.S. support for the apartheid South African regime. To prevent the spread of socialist ideology in the region, the Reagan administration beefed up its support for the rebellion led by Jonas Savimbi against the legitimate Angolan government, in complete disregard for the declaration of the Organization of African Unity (OAU). The Reagan administration then preferred to deal on a unilateral basis with individual governments and leaders despite their blatant records of human rights violations, political corruption, and outright disregard for the rule of law. Mobutu Sese Seko of Zaire (now the Democratic Republic of Congo), Ibrahim Babangida of Nigeria, Omar Bongo of Gabon, and Teodoro Obiang Nguema Mbasogo of Equatorial Guinea are a few examples. Africa's fledgling administrations had in the 1960's and 1970's borrowed extensively from overseas lenders based on the belief that they could spur their economies toward development. Regrettably, both the well-meaning and the dubious soon found out that the path to development was more daunting than mere
loans could cure. By the 1980's, most African countries were so sunk in debt that the entire continent could be said to be steeped in a debt crisis. As the Third World nations gained more clout in the United Nations in the 1970's, they assumed that their strength in numbers would translate into greater authority and control over the dynamics of global economics. They erroneously believed that the debts most countries had inherited in the 1960's and 1970's could be glossed over by lending institutions in the developed nations of the world. As most of these countries had also assumed membership in the International Monetary Fund (IMF), a specialized agency of the United Nations, they also mistakenly thought that they could sway the course of events in their direction and demand fairer treatment on their loans. The United States, with its budget increasingly devoted to military buildup and the containment of communism, could not have proven to be a worse ally to Africa on this subject in the 1980's. In addition to the ravages of debt, broken infrastructure, and unfavorable trade relations with developed nations, Africa was hit by the AIDS pandemic. Whole communities were wiped out; governments were in chaos as the continent was left to grapple with the interminable effects of the deadly disease. Looking to the United States, African nations expected help but instead received the well-rehearsed message of self-help that was the mantra of social conservatism. When in their desperation they turned to the United Nations for help, they again met with stiff resistance and came to realize that the multinational institution was largely manipulated by the key players in the international community. Impact As Africa rapidly became the theater of extremely difficult problems during the decade, it constantly looked beyond its boundaries for aid and assistance from the United States and the rest of the developed world but was met with an unfavorable dynamic still dominated by the Cold War. Further Reading
Berkeley, Bill. The Graves Are Not Yet Full: Race, Tribe, and Power in the Heart of Africa. New York: Basic Books, 2001. The book draws an uncanny parallel between the evil intentions that motivated Adolf Hitler to kill six million Jews and the outrageous atrocities committed in Zaire, Rwanda, and South Africa under racist regimes.
Duignan, Peter, and Lewis H. Gann. The United States and Africa: A History. New York: Cambridge University Press, 1984. Traces more than four centuries of relations between Africa and North America. Challenges prevalent assumptions of the benefits of colonialism, stressing instead the valuable contributions of Africa to North America in the course of the relationship. Gordon, David F., David C. Miller, Jr., and Howard Wolpe. The United States and Africa: A Post-Cold War Perspective. New York: W. W. Norton, 1998. Offers a penetrating look at the moral and practical aspects of U.S. relations with nations of Africa and calls for a fresh approach to fill what appears to be an ideological void. Huband, Mark. The Skull Beneath the Skin: Africa After the Cold War. Boulder, Colo.: Westview Press, 2001. The author contends that, since colonial times, the West has had a negative and debilitating effect on most of Africa, and he urges the West to "leave Africa alone." Mokoena, Kenneth, ed. South Africa and the United States: The Declassified History. New York: New Press, 1993. A piercing review of the secret relations between the South African apartheid regime and the United States. Schraeder, Peter J. United States Foreign Policy Toward Africa: Incrementalism, Crisis, and Change. New York: Cambridge University Press, 1994. A theoretical analysis of U.S. foreign policy toward Africa in the post-World War II era. Austin Ogunsuyi See also
Cold War; Foreign policy of the United States; Reagan, Ronald; Reagan Doctrine.
■ African Americans Definition
U.S. citizens of African descent
As a result of affirmative action programs and legislation passed prior to the 1980's, some African Americans experienced greater access to education and employment during the decade. However, African Americans as a group still remained at a disadvantage economically, educationally, and socially relative to their white counterparts. The affirmative action programs and legislation put into place in the 1960's and 1970's finally began taking root in the 1980's. African Americans' educational attainment improved substantially during the decade. By 1980, more than 1.3 million African Americans were in college. By 1989, two-thirds of African American adults aged twenty-five years or older had completed high school, and 12 percent of them had college degrees. These statistics represented a vast improvement over those of earlier decades. The 1980's also witnessed the rise of a number of prominent African American politicians. Harold Washington became the first African American mayor of Chicago in 1983. Six years later, David Dinkins became the first mayor of African descent to be elected in New York City. Thirty-one African Americans were in mayoral positions in the United States in 1984, representing many of the nation's largest cities, such as Philadelphia, Charlotte, Los Angeles, Detroit, New Orleans, Birmingham, and the District of Columbia. In 1989, L. Douglas Wilder of Virginia was elected as the first African American governor of a state. African Americans were also appointed to several high-profile government positions. Among them, General Colin Powell at age fifty-two became the youngest person and the first African American to be named chairman of the Joint Chiefs of Staff, the highest office in the nation's military. The Reagan Years Despite African American gains in representation and education, political conservatives were often hostile toward African Americans as a group, and they actively cultivated and exploited such hostility in the electorate. African Americans traditionally voted for Democrats, and the Republican Party, rather than court African American votes, chose instead to associate African Americans with welfare, crime, and "reverse discrimination" by the affirmative action system, in an effort to appeal to disgruntled white voters. The 1980 presidential election of Ronald Reagan brought these conservative views to the White House. Reagan was opposed to most entitlement programs, and he attempted to reduce federal spending on such programs, as well as on other social programs that aided African Americans and other minorities. In 1981, more than 260,000 marchers participated in a Washington, D.C., rally known as Solidarity Day to protest Reagan's policies toward organized labor and his reductions in social programs. Despite the conservative backlash against African
American progress, the nation took significant steps to honor the life and contributions of Martin Luther King, Jr., the key civil rights proponent of the twentieth century. The Martin Luther King, Jr., Library and Archives opened on October 19, 1981, in Atlanta, Georgia. Coretta Scott King, King’s widow, led the efforts to establish the facility to house King’s many written speeches and other private and public documents connected to his life and work. In 1983, President Reagan signed into law a bill creating Martin Luther King Day as a national holiday to be observed each year on the third Monday in January. African American unemployment grew during the 1980’s, and by the end of the decade, more than one in every four adult African American men between the ages of twenty-four and fifty-four was out of work. The rate was much higher for young African American men in the inner cities, and the overall African American unemployment rate was two and one-half times higher than the white unemployment rate. In 1983, African American unemployment stood at a record high of almost 21 percent. Rising unemployment had significant economic and social consequences for many African Americans. The percentage of families headed by single women increased, and single-parent households were almost twice as likely to fall below the poverty line as were two-parent households. Not only did poverty and unemployment increase for African Americans, but the income gap between African Americans and white Americans also grew dramatically. That gap had decreased during the 1960’s and early 1970’s, but by 1984 the disparity had returned to its 1960 level. A small number of middle-class African Americans did become more economically secure, however, as the proportion of African American households earning high incomes rose by 46 percent during the 1980’s. Debate continued throughout the decade concerning the appropriateness of employment affirmative action programs and court-ordered compensatory remedies for historically rooted patterns of discrimination. Despite the significant backlash against such programs, civil rights activists and others prevented them from being eliminated. In 1981, the Morbidity and Mortality Weekly Report, a journal of the Centers for Disease Control, featured a report by doctors Michael S. Gottlieb and Wayne Shandera on five gay men diagnosed with Pneumocystis carinii pneumonia
(PCP). Over time, the medical community began to realize that such deaths from relatively rare diseases were symptomatic of a larger new epidemic that was emerging, which was named acquired immunodeficiency syndrome (AIDS). Early U.S. media coverage of AIDS focused primarily upon gay men and Haitian immigrants. Soon, however, connections began to be made to African Americans as well. At the 1985 AIDS and Minorities conference, it was announced that $7 million would be given to minority organizations to use for prevention and education programs. Contributions to American Culture African Americans continued to have a profound impact on American culture. Distinctively African American cultural production remained as vital as ever in the 1980’s, and it continued both to exist as a separate subcultural phenomenon and to influence the development of mainstream American culture. Rhythm and blues, funk, rock and roll, soul, blues, and other American musical forms had all originated in African American communities in earlier decades, and their later development was shaped by African American artists as well as white artists. The 1980’s were most notable in this regard for the emergence of hip-hop culture, which included rap music and break dancing, as well as distinctive fashions and slang. Hip-hop emerged as a cultural movement initiated by inner-city youth, primarily African Americans and Latinos living in New York City. By 1979, hip-hop had become a commercially popular music genre and had begun to enter the American music mainstream. By the end of the 1980’s, a form of hip-hop called gangsta rap became a major part of American music, causing significant controversy over lyrics that were perceived as promoting violence, promiscuity, and drug use. Michael Jackson, another controversial singer and performer, recorded the album Thriller (1982), which became the best-selling record in U.S. history. In 1984, the album won eight Grammy Awards, and it sold more than 30 million copies worldwide. In 1980, Black Entertainment Television (BET), the first cable television network with an African American target audience, began broadcasting from its headquarters in Washington, D.C., under the leadership of Robert L. Johnson. Rap music played a prominent role on BET, and in 1988, music video channel MTV added a hip-hop show, Yo! MTV Raps, to its lineup as well.
The twenty-four-hour music network had been criticized earlier in the decade for neglecting African American artists in its music video rotation. By launching Yo! MTV Raps, it not only addressed this criticism but also exposed hip-hop music, videos, and culture to its wide, mainstream audience. Hip-hop gained greater legitimacy by finding a home on a network that had become an arbiter of musical taste; in addition, specific African American musical artists found larger audiences and greater success. Vanessa Williams became the first African American woman to win the coveted crown of Miss America on September 18, 1983, in Atlantic City, New Jersey. However, in July, 1984, she had to give up her crown after Penthouse magazine published nude pictures of her. Suzette Charles, Miss New Jersey, assumed the crown, becoming the second African American Miss America in the process. Alice Walker’s The Color Purple (1982) won the American Book Award and the Pulitzer Prize in 1983. The novel’s depiction of African American men dominating African American women in the South was met with criticism from many such men, who felt her depiction promoted racial stereotypes. Steven Spielberg directed a popular film adaptation of the novel in 1985. In sports, Magic Johnson led the Los Angeles Lakers to five National Basketball Association (NBA) championships and won the Most Valuable Player Award in 1987. Doug Williams, one of the first African American quarterbacks in the National Football League (NFL), led the Washington Redskins and was named the Most Valuable Player in Super Bowl XXII on January 31, 1988, in San Diego, California. Mike Tyson in 1986 won the World Boxing Council’s heavyweight championship, becoming the youngest boxer ever to hold the prestigious title. Bobsledders Jeff Gadley and Willie Davenport became the first African Americans to take part in the Winter Olympic Games in 1980. Hank Aaron, the home run king of the National League, was elected to the Baseball Hall of Fame in 1982. At the 1988 Olympics, Florence Griffith-Joyner won three gold medals and one silver medal. At the Calgary Winter Olympics of the same year, Debi Thomas became the first African American woman to win an Olympic medal for figure skating. Persisting Problems and Attempted Solutions Civil rights advocates continued to press with some success for the implementation of policies for group advancement.
The Supreme Court in 1980 ruled that Congress could impose racial quotas to counteract discrimination against African Americans in federal and state laws. Other court rulings supported affirmative action as a way to counteract years of racial discrimination. Some African Americans improved their socioeconomic standing significantly, as the 1980’s witnessed the expansion of a robust African American middle class across the United States. Despite these social and economic advances, persistent challenges for many African Americans remained, including inadequate health care access, discrimination in housing, and high levels of unemployment and poverty. Crime rates continued to escalate across the United States, and their effects were magnified in poor African American communities. As a result, racial tensions also increased during the 1980’s. On July 10, 1980, the U.S. Civil Rights Commission released a study indicating that police brutality was still a serious problem and a major cause of urban turmoil. During the 1980’s, the Ku Klux Klan increased its white supremacist activities in the South, engaging in more marches and cross burnings. That pattern was followed throughout the country, as hate groups became more active and more such groups sprang into being. The decade’s racial tensions were exemplified by two particularly high-profile incidents. First, the city of Philadelphia attempted to evict members of the radical group MOVE from a home in an African American residential neighborhood. Authorities dropped a bomb on MOVE’s rooftop bunker, killing eleven people and destroying more than sixty row homes, at a loss estimated at more than $8 million. In a second high-profile case that dominated national news for several weeks, Bernhard Goetz, a white man traveling alone on a New York subway, was approached by four African American youths. Later claiming that the youths threatened him, Goetz shot all four, paralyzing one. After much delay and many lawsuits, Goetz was acquitted of attempted murder and assault. The polarizing reaction to the Goetz shooting was one of the many stark racial incidents that marked major cities in the 1980’s. Impact The 1980’s were a mixed decade for African Americans. Many saw their lives improve, but many others remained trapped by persisting institutional structures of racism.
The success stories were often used to argue against the need for affirmative action and other programs designed to eliminate those structures. Moreover, as personal statements of racism became less common among whites (either through a decrease in the sentiment or through a decrease in its public acceptability), the importance of institutional racism in the absence of racist intent was questioned. During the 1980’s, political conservatism increased as a force in American public discourse as well as in electoral politics. To a certain extent, this conservatism entailed direct racial discrimination. More often, however, race was used to symbolize class. The decade witnessed an increasing gap between the haves and have-nots, and discussions of poverty focused with great frequency on poor African Americans, who, while a significant proportion of the poor, did not constitute a majority of that category. Nevertheless, urban poverty, unemployment, and welfare were often discussed, whether implicitly or explicitly, in racial terms, and public attitudes toward race and class became mutually imbricated. As Spike Lee pointed out in Do the Right Thing (1989), however, African Americans received a disproportionate amount of attention, not only as symbols of impoverishment but also as symbols of success. Many of the most successful cultural icons of the decade, including Michael Jackson, Eddie Murphy, Magic Johnson, and Oprah Winfrey, were African Americans. As a result, many Americans adopted the odd double standard evinced by the character of Pino in Lee’s film: They were hostile toward African Americans as a group while adulating individual African American performers and sports heroes. Thus, both as positive and as negative figures, African Americans were featured prominently in the public discourse of the decade.
Further Reading
George, Nelson. Post-soul Nation: The Explosive, Contradictory, Triumphant, and Tragic 1980’s as Experienced by African Americans. New York: Viking Books, 2004. Offers a year-by-year accounting of the major political, sports, and entertainment events that had an impact on African Americans in the 1980’s.
Hampton, Henry, Steve Fayer, and Sarah Flynn. Voices of Freedom: An Oral History of the Civil Rights Movement from the 1950’s Through the 1980’s. New York: Bantam Books, 1991. Provides unique insights into the Civil Rights movement by collecting first-person accounts of the fight for civil rights from those who participated in it.
Kitwana, Bakari. The Hip Hop Generation: Young Blacks and the Crisis in African American Culture. New York: Basic Civitas, 2002. In-depth discussion of the hip-hop cultural movement with particular emphasis on its meaning to African American youth. Also focuses on the negative stereotypes promoted in the images associated with the music.
Mary McElroy
See also
Affirmative action; Atlanta child murders; Basquiat, Jean-Michel; Beloved; Bonfire of the Vanities, The; Brawley, Tawana; Central Park jogger case; Color Purple, The; Cosby Show, The; Crack epidemic; Do the Right Thing; Elections in the United States, 1988; Goetz, Bernhard; Griffith-Joyner, Florence; Hawkins, Yusef; Hip-hop and rap; Holmes, Larry; Horton, William; Houston, Whitney; Howard Beach incident; Jackson, Bo; Jackson, Jesse; Jackson, Michael; Johnson, Magic; Kincaid, Jamaica; Leonard, Sugar Ray; Lewis, Carl; Marriage and divorce; Martin Luther King Day; Minorities in Canada; MOVE; Mr. T; MTV; Murphy, Eddie; Nation of Yahweh; Prince; Public Enemy; Racial discrimination; Reaganomics; Rice, Jerry; Richie, Lionel; Run-D.M.C.; Thomas, Isiah; Turner, Tina; Tyson, Mike; Washington, Harold; Williams, Vanessa L.; Wilson, August; Winfrey, Oprah.
■ Age discrimination Identification
Unequal treatment of a person based on age
During the 1980’s, the laws protecting workers from age discrimination were significantly expanded, both by court rulings and by statutory amendments. By the end of the decade, federal law provided more protections to more people than ever before, and it created a correspondingly larger number of obligations on the part of employers. The most important law banning discrimination based on age in the United States is the Age Discrimination in Employment Act (ADEA) of 1967, which initially covered employment discrimination in the private sector for those aged forty to sixty-five. As later amended, it empowers the Equal Employment
Opportunity Commission (EEOC) to investigate allegations of age discrimination, issue rulings, and negotiate with errant employers on behalf of victims. Originally, the ADEA had many exceptions. Employers with fewer than twenty employees were exempt, as were elected officials when choosing their personal staffs or appointing policy makers. Moreover, the law did not protect legal advisers on specific cases, firefighters and law-enforcement officials, or corporate executives eligible for pensions of $27,000 or more. However, coverage expanded in the 1970’s to include employees of federal, state, and local governments; mandatory retirement was abolished for federal workers, and the highest age covered was raised to seventy. Nevertheless, courts have allowed “reasonable factors” to permit employers to favor younger over older workers. For example, it is permissible to favor younger workers if an employer can demonstrate that age is a “bona fide occupational qualification” (BFOQ) for a given job. In Geller v. Markham (1980), the U.S. Court of Appeals for the Second Circuit ruled that age discrimination occurred when a school district hired a twenty-six-year-old teacher instead of a fifty-five-year-old teacher because the school district sought to save money. The case was appealed to the Supreme Court as Markham v. Geller (1981), but the high court refused to review the decision, which effectively barred the West Hartford, Connecticut, school board from cutting costs by hiring less experienced teachers. During the 1980’s, the ADEA was amended on several occasions. In 1982, the health-benefits guarantee was extended to age seventy, and mandatory retirement for tenured teachers was repealed. In 1984-1985, the health-benefits guarantee was extended to spouses of employees up to the age of seventy, coverage was extended to overseas employees of American corporations, and mandatory retirement of corporate executives was disallowed unless their annual pensions were at least $44,000. Amendments in 1986 eliminated mandatory retirement for private-sector workers and required employers to extend health-insurance benefits to workers beyond age seventy. In 1987, Congress banned denial of accrued pension benefits for those working after the age of sixty-five. A 1988 amendment extended the time limit for filing EEOC complaints. States’ rights advocates, who wanted to prevent
private parties from suing state governments on the basis of federal laws, were narrowly in the minority when the Supreme Court ruled 5 to 4 in EEOC v. Wyoming (1983) that the federal law constitutionally trumped a Wyoming state law mandating retirement at the age of fifty-five for a Game and Fish Department supervisor. In Johnson v. Mayor & City Council of Baltimore (1985), however, the Court unanimously held that the mere fact that federal firefighters were required to retire at age fifty-five did not establish being younger than fifty-five to be a BFOQ for all state and local firefighters. In other words, the Court left it to state and local governments to provide evidence supporting their claim that age is a relevant factor for firefighters in the performance of their duties. Although the Federal Aviation Administration (FAA) continued to require pilots and co-pilots to retire at age sixty, in Western Air Lines v. Criswell (1985), the Supreme Court unanimously disallowed involuntary retirement of flight engineers (those who monitor the cockpit’s side-facing instrument panel) at age sixty, because the company refused to provide specific medical evidence of unfitness. Trans World Airlines (TWA), meanwhile, gave younger pilots and co-pilots medical fitness exams and reassigned those who flunked as flight engineers. In TWA v. Thurston (1985), the Court unanimously ruled that TWA could not require those who passed their fitness exams, upon reaching sixty, to await reassignment as flight engineers on the basis of the seniority of their application. During the 1980’s, corporate and university downsizing seemed imperative, as personnel costs mounted even as shareholders demanded higher and higher profits. Accordingly, employers offered “golden handshake” plans with early-retirement incentives. As a condition of such plans, employees were asked to waive various rights, including the right to sue in the event that the incentives benefited some employees more than others. These plans continued through the end of the decade, but in 1990, Congress passed the Older Workers Benefit Protection Act. The act provided procedural protections, including ample time to consider an early-retirement plan and an interval of time to cancel a decision to accept a plan.
Impact Age discrimination became the most frequent type of discrimination complaint handled by the EEOC and the courts in the 1980’s. Successful litigants won lucrative settlements, and businesses were forced to alter their practices if they wanted to avoid lawsuits of their own. The age-discrimination laws applied primarily or solely to employment discrimination, however. Age discrimination in public accommodations (such as buses, shopping malls, and theaters) and public facilities (such as government offices and public parks) was not prohibited. Still, the high profile of age-discrimination litigation combined with statutory expansions to give the issue of elderly rights a prominent place in the public consciousness. The 1980’s was thus a crucial decade in the acceptance and expansion of those rights. Further Reading
Eglit, Howard. “Health Care Allocation for the Elderly: Age Discrimination by Another Name?” Houston Law Review 26 (October, 1989): 813-900. A discussion of the practice of providing transplanted organs to younger rather than older persons.
Issacharoff, Samuel, and Erica Worth Harris. “Is Age Discrimination Really Age Discrimination? The ADEA’s Unnatural Solution.” New York University Law Review 72 (October, 1997): 780-840. Argues that the American Association of Retired Persons perverted the ADEA by securing amendments enabling rich executives to obtain lucrative awards by suing employers after being forced to retire because of their high salaries.
U.S. Equal Employment Opportunity Commission. Age Discrimination. Washington, D.C.: Government Printing Office, 1998. A comprehensive review of the Age Discrimination in Employment Act, as amended.
Whitton, Linda S. “Ageism: Paternalism and Prejudice.” DePaul Law Review 46 (Winter, 1997): 453-482. Reviews social and psychological bases for age discrimination.
Michael Haas
See also Affirmative action; Mandatory retirement; Supreme Court decisions.
■ Agriculture in Canada Definition
The raising and preparation of crops and livestock for Canadian and foreign markets
Crisis characterized Canada’s agriculture in the 1980’s: Tens of thousands of Canadian farmers lost their land, homes, and way of life, as the agricultural economy collapsed. The 1980’s was a decade of crisis for Canadian farmers. A host of problems arose that combined and continued throughout the decade, devastating Canada’s agricultural sector. Farmers across Canada suffered immensely, although those in the Prairie Provinces of Alberta, Saskatchewan, and Manitoba experienced the greatest hardships. Many of the older, better-established farmers endured and survived the decade. Younger farmers, however, especially those who began farming in the 1970’s, were decimated by the economic events of the 1980’s. Revenues Fall as Expenses Rise At its most fundamental level, the Canadian agricultural crisis of the 1980’s was precipitated by overproduction, which resulted in low commodity prices and abysmally low farm incomes. As world markets were flooded in the early 1980’s with surplus commodities, including wheat, barley, oats, and canola, prices fell sharply. Early in the decade, the net farm income fell to between $10,000 and $12,000 annually, a paltry amount for farm families to provide for life’s necessities for a year. As the agricultural recession intensified in the middle and end of the decade, farm income plummeted further, reaching one-half the 1970’s level. By 1987, net farm income in Canada had fallen below zero. It was simply impossible for many farm families to sustain themselves, given these deplorable conditions. While the prices Canadian farmers received for commodities experienced this drastic decline, production costs accelerated sharply. The 1980’s were marked by dramatic increases in the cost of chemicals, fertilizer, pesticides, herbicides, and seeds, which were the mainstays of modern Canadian agriculture. Of perhaps more importance was the onerous increase in the cost of borrowing money, necessary for most farms to continue operating. In the early 1980’s, interest rates on farm operating loans increased to 20 percent. Interest rates remained high throughout the decade. These exorbitant
interest rates combined with farmers’ negative net incomes forced many farmers to abandon their enterprises altogether. In short, inflation devastated Canada’s farmers. The Canadian farmer’s existence was further jeopardized by an acute decline in the value of farm capital, comprising such assets as livestock, machinery, buildings, and land. Most damaging was the decline in the value of land, especially for younger farmers who had purchased farms in the 1970’s when land prices were high. Canadian land values declined by $40 billion in the 1980’s, while the value of farmland and buildings dropped nearly 50 percent. In some areas, declining land values exceeded these averages. In Saskatchewan, for example, in the five years between 1983 and 1988, agricultural land prices declined from $300 to $80 per acre. Most banks and other lending institutions refused to extend operating loans to farms on greatly devalued land. Many farmers were simply unable to acquire the capital necessary to continue operating. In addition to the agricultural recession and the rise in interest rates, a prolonged and severe drought struck much of Canada, especially the Prairie Provinces. Between 1984 and 1988, Prairie-Province farmers suffered immense losses caused by the unrelenting drought, which was the worst in sixty-five years. To complicate matters, the hot, dry weather was ideal for grasshoppers, which repeatedly ravaged crops. When the sparse storm clouds finally brought some rain to the prairies, they were often accompanied by strong winds and intense hail that left crops ruined in their wake. Impact For individual farmers, the agricultural recession of the 1980’s was devastating. Burdened by enormous debt, low prices, and excessive expenses, many were compelled simply to abandon their operations. The Canadian farmers who were driven from business in the 1980’s typically disposed of their assets at farm auctions, where they often received a pittance for their land, machinery, equipment, and other assets. These farmers abandoned their land, homes, and way of life, realizing the futility of continuing, given the severity of the agricultural recession. Some received sufficient proceeds from their sales to start life anew in another job or profession. Many other farmers struggled on until they were forced into bankruptcy or were foreclosed on by banks and other lending institutions. These farmers
often received little or nothing when their assets were disposed of by forced sales. In some areas, farmers protested mass evictions, but these protests were ineffective, only occasionally even delaying the inevitable dispossessions. In addition to the ruin experienced by individual farmers and their families, hundreds of small rural Canadian communities—located in areas where farming was a mainstay of the local economy—were devastated as well. When farmers lost their homes and farms, they typically migrated to large cities some distance from their former homes. The exodus of hardworking farm families often tore apart the social fabric of the small, isolated farm communities. Thousands of businesses, schools, churches, and other social institutions closed their doors. For many of the communities, whose existence was in jeopardy even prior to 1980, the recession was catastrophic. Further Reading
Boyens, Ingeborg. Another Season’s Promise: Hope and Despair in Canada’s Farm Country. Toronto: Penguin Books, 2001. Anecdotal study of the Canadian farm crisis that reveals the human costs and tragedy associated with the calamity.
Lind, Christopher. Something’s Wrong Somewhere: Globalization, Community, and the Moral Economy of the Farm Crisis. Halifax, N.S.: Fernwood, 1995. Examines an array of problems associated with the farm crisis, with special attention to the moral issues it raised. Revealing in regard to the impact the catastrophe had on individuals and communities.
Wilson, Barry K. Farming the System: How Politics and Farmers Shape Agricultural Policy. Saskatoon, Sask.: Western Producer Prairie Books, 1990. Analyzes the Canadian agricultural crisis from both national and global perspectives. Demonstrates how agricultural policy evolves, especially in regard to the political environment, while revealing the impact the process and decisions have on individual farmers.
Robert R. McKay
See also
Agriculture in the United States; Canada-United States Free Trade Agreement; Farm Aid; Farm crisis; Globalization; Income and wages in Canada; Inflation in Canada; Natural disasters.
■ Agriculture in the United States Definition
The raising and preparation of crops and livestock for U.S. and foreign markets
Throughout the 1980’s, U.S. farmers faced difficult economic conditions. Both the government and private organizations sought to aid them, but the decade witnessed the widespread failure of small- and medium-scale farms, the collapse of rural communities that depended on them, and the consolidation of American agriculture in the hands of large-scale farming corporations. As the 1980’s dawned, the population of the United States surpassed 227 million people. Of those individuals, 6,051,000 considered themselves to be farmers and ranchers, representing a mere 3.4 percent of the nation’s population. They labored on 2,439,510 farms, which averaged 426 acres in size. Farmers awoke on January 1, 1980, hoping that the decade that lay before them would be better than the closing years of the decade they had rung out the night before. U.S. agriculture in the late 1970’s could only be described as difficult. The opening years of the 1970’s had been relatively halcyon ones for the nation’s farmers. With trade barriers lowered and record purchases of American grain by the Soviet Union, farm exports soared to new levels. Accordingly, farm life improved dramatically for most families: as commodity prices rose, so did incomes. With a seemingly unquenchable foreign appetite for American grains, the Federal Land Bank, its lending restrictions recently removed, allowed farmers to incur substantial debt, as did other lending institutions. This practice caused land prices to rise dramatically, as farmers sought to cash in on a perpetually rising market. Farm incomes rose above the national average in nine of the decade’s years. The dreams of the 1970’s, however, were not borne out by the realities of the 1980’s. Problems Facing Farmers These realities started to become clear as early as January 4, 1980, when President Jimmy Carter announced a series of sanctions against the Soviet Union in retaliation for that nation’s invasion of Afghanistan. While some bemoaned the fact that the United States would not be participating in the 1980 Olympics, farmers took note of the strict embargo of grain sales to the Soviets.
[Photo: Creditors auction off the farm machinery of farmer Roger Escher, center left, as Escher asks his fellow farmers not to bid during a forced property sale in Washington County, Iowa, in 1985. Many farmers lost their farms during the 1980’s. (AP/Wide World Photos)]
However, over the course of the 1980’s, the loss of the Soviet market would prove to be the proverbial drop in the bucket compared to the larger economic crisis facing farmers. Throughout the 1970’s, developing nations made large purchases of American grains, but over the early years of the 1980’s, the picture changed in a variety of ways. Some nations that had relied almost exclusively on the United States for their agricultural imports turned their attention to other sources, fearing that their own imports might someday be
embargoed. Other developing nations had deeply indebted themselves to the United States and, unable to meet repayment schedules, had to stop purchasing from the nation altogether. Moreover, the value of the American dollar rose relative to other currencies in the early 1980’s. This made U.S. commodities, including grain, more expensive for other countries to buy, lessening demand and contributing to a surplus of grain on the American market. Thus, during the early 1980’s, the once buoyant American agricultural economy started to sink. With
lessening demand for grain, the price of farmland began to falter, slowly at first and then in a free fall. Some federal officials estimated that land prices throughout the Midwest dropped by nearly 60 percent between 1981 and 1985. Prices for farmland did not reach their nadir until 1986. As farmland’s purchase price declined, its value as collateral fell as well. American farmers collectively owed an estimated $215 billion in early 1984, double their debt in 1978. Accordingly, lenders throughout the United States were called upon to reduce farmers’ indebtedness so that the value of their land might serve as sufficient collateral. Farmers across the United States were struggling to pay their debt’s interest, let alone repay the principal. Bankers and lenders, once popular as the source of cash that would make a farmer’s dreams come true, became the symbol of personal failure, as they sought to recoup the losses their banks faced. The disdain for bankers occasionally resulted in violence. In Hills, Iowa, farmer Dale Burr killed not only his banker, John Hughes, but also his neighbor, his wife, and then himself, once he realized that there was no way out of the debt he had incurred in the 1970’s. Near Ruthton, Minnesota, banker Rudy Blythe fell before the gunfire of James and Steven Jenkins, a father and son who had lost their farm to the bank’s foreclosure. In Union County, South Dakota, a Farmers Home Administration official killed his wife, son, daughter, and dog before turning his gun on himself. According to his suicide note, the pressures of foreclosing on his friends and neighbors had become too much to bear. Nongovernmental Responses
Farmers did not face their troubles in isolation. The American Agriculture Movement, founded in the 1970’s, briefly stirred to life to sponsor “tractorcades” that descended on America’s largest cities, including Washington, D.C., encouraging a grassroots response to the crisis. Similarly, the North American Farm Alliance coalesced in Ames, Iowa, to raise awareness of agricultural conditions among governmental officials and urban dwellers. Funds raised by country music star Willie Nelson in a series of Farm Aid concerts helped the United Farmer and Rancher Congress. Elsewhere, farmers protested at bank-required foreclosure sales, occasionally with violence. It was not only the farmers themselves who felt the effect of their economic woes. As farmers had
less and less disposable income, agribusiness felt an immediate impact: Implement dealers, seed houses, and elevators closed their doors in the face of the farmers’ financial difficulties. Shortly thereafter, grocery stores, furniture stores, banks, and hardware stores, among others, followed, forced out of business by the economic decline of their regions. Even harder hit were the churches, schools, homemaker clubs, and other population-based organizations that dotted the countryside. As more and more farmers left the countryside for urban occupations, their children no longer attended rural schools, their families no longer attended rural churches, and their wives found jobs in town. Rural women’s clubs disbanded, and baseball diamonds, football fields, and 4-H clubs dwindled and then fell silent, no longer needed in regions with dwindling populations. Governmental Responses
The federal government was not idle in the face of the agricultural crisis. Congress passed some laws designed to aid the nation’s farmers. Some actions taken by the Department of Agriculture, however, deepened the disaster. For example, in 1980, the Department of Agriculture, facing its own budget constraints, determined to cut entitlement programs, including school breakfast and lunch programs; prenatal nutrition programs such as Women, Infants, and Children (WIC); food stamps; and Commodity Food Distribution. These cuts hurt not only the recipients of government aid but also the farmers who produced the products that the entitlement programs distributed. The tight financial policy practiced by the Ronald Reagan administration did not particularly aid farmers. With 60 percent of the nation’s food and fiber production consumed within the United States, the farmers’ welfare depended in large part upon the purchasing power of American consumers. Tight money meant less disposable income for all, but as urbanites trimmed back their meat, breadstuffs, and vegetable consumption, farmers felt the sting. Some farmers suffered as a result of being deemed too small for federal assistance. By the early 1980’s, 17 percent of farmers received 60 percent of all agricultural subsidies paid by the federal government. Farms receiving subsidies generally were large-scale operations, and the government’s
agricultural plan generally left small- and medium-scale operations without the resources to continue. More and more farmers gave up farming for urban occupations. Much of the farmland sold in the 1980’s to satisfy mortgages left the hands of small- and medium-scale farmers and was acquired by foreign investors, nonagriculturalists, or large-scale farmers. The Farm Credit System’s rules stated that the land it had foreclosed upon could be purchased at 4.9 percent interest, with 40 percent of the purchase price paid immediately. Large and well-financed entities were the only ones capable of meeting these conditions. Accordingly, insurance companies more than doubled their land holdings from 1985 to 1986, and investor-owned farm-management companies increased their holdings by 36 percent between 1979 and 1987. In order to face the exigencies forced upon them, many farmers turned to alternative products. Artichokes, catfish, wildflowers, herbs, crayfish, honey, and garden truck were all touted as the solution to the farmers’ woes, as were llamas and alpacas. Still, most farmers maintained their traditional crops, because they were the ones the federal government would subsidize. This preference for subsidized, traditional crops, however, added to the surplus of grains already on the market. Also adding to the surplus was the introduction of biogenetic seed stock. Grain stock was modified to be more resistant to disease, as well as to tolerate the chemical herbicides, fungicides, and insecticides used during the growing season. Superior yields resulted, often rendering impotent the government’s programs to reduce grain surpluses. Genetically modified livestock became available to farmers. These livestock could be raised on less feed, inoculated to encourage growth, and marketed sooner. Again, these innovations decreased costs, but they also increased supply, driving down the prices farmers could receive for their products. The federal government sought other avenues of surplus reduction throughout the 1980’s, including Secretary of Agriculture John Block’s Payment in Kind (PIK) endeavor. PIK, referred to by President Reagan as a “crop swap,” allowed farmers to take land out of production in return not only for cash payments but also for payments in grain, which they could then sell on the open market. As a tool to aid farmers, the program was accepted at first as an emergency measure.
As land was taken out of production, however, tenant farmers and farm laborers lost their livelihoods, implement and fertilizer dealers were bereft of customers, and taxpayers were left with ever-increasing taxes in order to support grain prices, even with PIK in place. Replacing PIK as the nation’s premier farm policy in 1985 was the Food Security Act (FSA). Generally considered a failure, the five-year plan was intended to maintain farm income while reducing the costs of production. Target prices were projected to drop closer to the market price, saving the federal government money. If agricultural surpluses became excessive, the legislation had a clause to pay farmers for reducing their acreage. The FSA also allowed for the retirement of 45 million acres under the Conservation Reserve Program, designed to retire highly erodible land from production. Despite these measures, overproduction and low prices prevailed for the remainder of the 1980’s. Public Responses Despite the failure of measures intended to elevate farm prices, the image of the farmer rose during the 1980’s. As the decade wore on, more and more non-farmers came to support higher taxes as a way to preserve a viable agricultural economy. Slightly more than 50 percent of Americans declared that they would pay more in taxes to help farmers keep their land. Clearly, a corner had been turned with regard to the image of the farmer and farming. While negative images of farmers had prevailed throughout much of the twentieth century, the image now shifted to a more positive view. As the 1990’s dawned, it became clear that American agriculture was changing. Federal and state laws and regulations, the introduction of biogenetics, enhanced and enlarged farm equipment, a declining and aging farm population, larger farms, fewer independent operators, and prices below the cost of production all would be taken into consideration in the decade to come. Impact Agricultural conditions during the 1980’s changed the face of the American countryside. As debts rose and profit margins fell, many farmers were forced from their farms by foreclosures. Foreclosed land was often acquired by large-scale farming corporations, insurance companies, and investors. Farmers and their families left the countryside for the cities, leaving churches, schools, businesses, and other entities that depended on them to close as well.
Further Reading
Amato, Joseph. When Father and Son Conspire: A Minnesota Farm Murder. Ames: Iowa State University Press, 1988. History of one family’s farm foreclosure and the murder it inspired.
Bonanno, Alessandro, et al., eds. From Columbus to ConAgra: The Globalization of Agriculture and Food. Lawrence: University Press of Kansas, 1994. Study of the globalization of agricultural commodities that combines theoretical analysis with concrete case studies and emphasizes the extent to which the constantly changing nature of global markets results in different groups benefiting or suffering from globalization at different times.
Davidson, Osha Gray. Broken Heartland: The Rise of America’s Rural Ghetto. New York: Free Press, 1990. Examination of the impact of the farm crisis upon the nation’s farmers, including the rise of the radical right.
Hurt, R. Douglas. American Agriculture: A Brief History. Ames: Iowa State University Press, 1994. Concise overview of American agriculture from prehistoric times to the 1990’s.
Nordin, Dennis S., and Roy V. Scott. From Prairie Farmer to Entrepreneur: The Transformation of Midwestern Agriculture. Bloomington: Indiana University Press, 2005. A positive exploration of large-scale farms and the entrepreneurial farmers who operate them.
Raeburn, Paul. The Last Harvest: The Genetic Gamble That Threatens to Destroy American Agriculture. New York: Simon & Schuster, 1995. A detailed, scientific examination of the impact of biogenetics on American food supplies.
Kimberly K. Porter
See also Bioengineering; Business and the economy in the United States; Farm Aid; Farm crisis; Income and wages in the United States; Reagan, Ronald; Soviet Union and North America.
■ AIDS epidemic Definition
Appearance and spread of an infectious immunodeficiency syndrome
The appearance of rare opportunistic infections among populations of gay men and intravenous drug abusers led to the discovery of a previously unrecognized agent, now
called HIV. By the end of the decade, thousands of Americans had been infected, and the disease itself, AIDS, had begun to spread throughout the world.
While the presence of a disease subsequently known as acquired immunodeficiency syndrome (AIDS) was initially recognized in 1981, the disease’s etiological agent, the human immunodeficiency virus (HIV), had entered the human population several times during the previous decades. Computer-generated data measuring the rate of mutation of a simian virus into a human one have supported the theory that penetration into the human population may have occurred as early as the 1930’s. Medical historian Jonathan Engel has suggested that between 1950 and 1972, infection may have occurred at least nineteen times. The oldest confirmed infections took place in 1959. Antibodies against HIV were found in blood collected in 1959 from a Bantu man in Leopoldville, Belgian Congo, who succumbed to an immunodeficiency disease. That same year, another man died in Manchester, England, exhibiting the same immunodeficiency defects. Retrospective analysis of his stored blood confirmed infection by HIV. Beginning of the Pandemic Recognition of an immunodeficiency syndrome was first reported in the June 5, 1981, issue of Morbidity and Mortality Weekly Report. The story, originating from the Centers for Disease Control (CDC) in Atlanta, described an unusual and rare parasitic lung infection, Pneumocystis carinii pneumonia (PCP), in five homosexual men in Los Angeles. The outbreak came to the attention of the CDC because the only known treatment, a drug called pentamidine isethionate, was available only from that agency. Later that summer, the CDC reported that an unusual epidemic among gay men was more widespread than had earlier been thought: More than 140 previously healthy young men had been diagnosed with either PCP or a rare form of cancer called Kaposi’s sarcoma (KS). Previously observed generally only among Italian or Jewish men of Mediterranean origin, KS was unheard-of in the age group now being observed. Furthermore, the newly detected form of KS was much more aggressive than previously known instances. Because the disease had until then been reported only in homosexuals, it was initially referred to as gay-related immune deficiency (GRID).
AIDS Cases, Deaths, and Case-Fatality Rates in the United States Through December, 1989

                    Adults/Adolescents          Children Under 13
Interval            Cases Diagnosed    Deaths   Cases Diagnosed    Deaths
Before 1981                  78            30             6             1
1981: Jan.-June              91            38             8             2
1981: July-Dec.             194            83             5             6
1982: Jan.-June             385           151            13             9
1982: July-Dec.             664           276            14             4
1983: Jan.-June           1,249           507            33            13
1983: July-Dec.           1,611           902            40            16
1984: Jan.-June           2,515         1,362            47            24
1984: July-Dec.           3,303         1,895            61            23
1985: Jan.-June           4,722         2,695            97            43
1985: July-Dec.           6,092         3,667           127            68
1986: Jan.-June           7,956         4,811           131            64
1986: July-Dec.           9,528         6,089           162            82
1987: Jan.-June          12,157         7,035           205           110
1987: July-Dec.          13,386         7,351           239           150
1988: Jan.-June          14,704         8,439           210           120
1988: July-Dec.          14,581         9,401           266           141
1989: Jan.-June          14,626         8,793           233           132
1989: July-Dec.           7,944         5,551            98            68
TOTAL*                  115,786        69,233         1,995         1,080

*Death totals include 157 adults/adolescents and children known to have died but whose date of death is unknown.
Source: Centers for Disease Control and Prevention, HIV/AIDS Surveillance Report, January, 1990.
Although the initial belief was that transmission of the disease, whose cause was still unknown, was somehow related to homosexual behaviors, it soon became apparent that other means of transmission were also likely—most notably through contaminated blood. By the end of 1982, at which time more than six hundred cases had been reported, it was clear that intravenous (IV) drug abusers were at risk; cases were also observed in several hemophiliacs, whose only possible exposure had been through their use of Factor VIII (blood-clotting) products obtained from donated blood. The name of the illness was also changed, reflecting its more widespread nature, to acquired immunodeficiency syndrome, or AIDS.
The range of opportunistic infections associated with the immune disorder was also widened to include illnesses such as fungal and other rare parasitic infections. If there was any fortunate aspect associated with the outbreak at the time, it involved a growing understanding of the unknown etiological agent’s method of transmission. While it clearly could be transmitted through sexual behaviors, as well as in contaminated blood, it was not transmitted through the air. Victims were classified by the CDC as falling into four specific categories, including homosexual or bisexual males (75 percent of known victims), IV drug abusers (13 percent), and hemophiliacs or transfusion recipients
(around 0.3 percent). Since a number of Haitians who did not then appear to fall within the other categories had been diagnosed with the disorder, Haitians were included among the risk groups. Isolation of the Etiological Agent
Speculation within the general public, and even among some medical professionals, as to the cause of AIDS initially focused on homosexual behaviors, such as the use of amyl nitrite to enhance sexual pleasure or even the practice of anonymous sex with multiple partners. Among some evangelicals, the belief was that the disease represented a punishment from God. Since semen itself was felt to have some immunosuppressive properties, “sperm overload” was suggested as a possible cause. The demographics of the disease, however, did not fit. Increasing numbers of cases were observed among hemophiliacs, women, and even infants and young children, twenty-six of whom had been diagnosed with AIDS by late 1982. Furthermore, the specific cause of the immunodeficiency had become apparent: a loss of a class of lymphocytes called T cells, named for their site of maturation in the thymus. Researchers began to narrow their focus in the search for a cause, believing that it likely was a virus. Suspicion by 1983 began to focus on a group of viruses known as human T-lymphotropic viruses (HTLVs), which had the ability to infect lymphocytes. HTLV-1 and HTLV-2, the two initial suspects, were in a group known as retroviruses. Retroviruses are viruses containing ribonucleic acid (RNA) that also carry an enzyme called reverse transcriptase, a protein that copies their RNA into deoxyribonucleic acid (DNA) following infection. Ultimately, two laboratories laid claim to isolation of the etiological agent associated with AIDS, one in Paris, the other in Bethesda, Maryland. Among the leading researchers in this field was Robert Gallo at the National Institutes of Health. Gallo was already well known for his development of a method to grow lymphocytes in culture. In retrospect, the timing of this procedure turned out to be critical to the hunt for the cause of AIDS, since the ability to grow HIV in the laboratory and to develop an effective method for testing blood supplies was the result of Gallo’s work. In April, 1984, Gallo announced the isolation and identification of a virus that he felt was the cause of AIDS and that he named HTLV-III. However, the issue of priority quickly introduced politics into the science.
In January, 1983, Luc Montagnier at the Pasteur Institute had also isolated a virus that he felt was the etiological agent of AIDS and that he called the lymphadenopathy-associated virus (LAV). The two viruses were later shown to be identical. The issue of priority was never completely settled, though the evidence is that Montagnier was probably first, while Gallo is credited with developing the blood test for the virus’s detection. To eliminate the confusion over names, the virus was given the name HIV. In 1985, a second, similar virus was isolated in West Africa; the original virus was named HIV-1, while the newer isolate became HIV-2. Widening Epidemic Though the initial recognition of the growing epidemic was focused primarily in the United States, it became clear by 1984 that the outbreak was taking place in much of the world. What had been known as “slim disease” in Africa was identified as AIDS and was seen in hundreds of patients there. By 1985, the disease had been found in more than fifty countries. More than seven thousand persons with AIDS were diagnosed in the United States, though likely many more were actually HIV-positive. The impact of the disease on Americans was made particularly poignant by coverage of two high-profile cases. In 1984, a thirteen-year-old Indiana student named Ryan White acquired AIDS from contaminated blood products used to treat his hemophilia. Fear of transmission resulted in his removal from the school system, forcing him to be schooled at home. The issue was brought to court, which resulted in a ruling that he be allowed to return to school. Despite a five-year fight to educate the public on how AIDS can, and cannot, be transmitted, he was frequently harassed by other students and their parents and eventually moved to another town where he was accepted. In his last years, White frequently spoke to other students, explaining his illness and philosophy of life. White died in 1990. In his honor, the Ryan White Comprehensive AIDS Resources Emergency Act was passed by Congress in 1990 to provide health care for persons with AIDS who had no insurance coverage. The second high-profile case was that of movie star Rock Hudson, who was diagnosed with AIDS in 1984, although the information was not released until the following year. Though it was an open secret in the movie community, the public was unaware that Hudson was gay. Despite a courageous fight,
Hudson died in October, 1985. Another prominent victim was the well-known entertainer Liberace, who died from AIDS in 1987. The response to the AIDS epidemic by the administration of President Ronald Reagan was largely one of neglect during the early years of the outbreak. Despite the fact that AIDS had clearly expanded far beyond the homosexual community, conservatives largely ignored the problem or simply blamed a “choice” of lifestyle. The gay communities in larger cities did respond, attempting to close the bathhouses that facilitated the spread of the disease and educating their members on how the disease could be avoided. The effort was successful, and new infections began to level off within the community. By the end of the decade, several events served to bring the problem of AIDS to the general public. In 1986, the National Council of Churches established an ecumenical task force, which soon met with the U.S. surgeon general, C. Everett Koop. The group later produced a pamphlet that attempted to educate the public about the disease. In December, 1988, the first annual World AIDS Day was held, with the goal of bringing the issue to the attention of the world at large. The first effective treatment for AIDS appeared in 1987. Azidothymidine (AZT), originally developed as an anti-cancer drug, was shown to be effective in inhibiting the replication of the virus. Although HIV would develop resistance to the drug, it did provide a means to extend the life of affected individuals. Impact When AIDS surfaced in Western countries in the early 1980’s, it was treated primarily as a disease limited to those exhibiting certain social behaviors. Scientists, medical professionals, activists, and other advocates struggled throughout the decade to educate the populace as to the epidemic’s severity, HIV’s ability to spread to anyone, and the specific, limited number of methods of transmission. By the end of the decade, AIDS was widely recognized as a problem of global significance requiring global resources to combat. Indeed, AIDS went on to become a worldwide pandemic that would create particular havoc in developing nations. Lack of proper medical facilities in these countries, poor education in presenting the means to avoid the disease, and the difficulty of altering long-held sexual mores all contributed to the problem. As a result, the middle class in much of central and
southern Africa was devastated, creating a generation of orphans and taxing the economies of these countries. Much the same scenario developed in portions of Asia as well. Debate continues as to whether the scope of this tragedy could have been limited by a swifter, more decisive response on the part of the U.S. government and President Reagan, who refused to address the crisis in public until May, 1987.
Further Reading
Diamond, Jared. “The Mysterious Origin of AIDS.” Natural History 101, no. 9 (September, 1992): 25-29. One of the earlier presentations about the likely origin of HIV as a simian virus. The author includes speculation as to how the virus may have jumped species.
Engel, Jonathan. The Epidemic: A Global History of AIDS. New York: HarperCollins, 2006. The author, a medical historian, provides a history of the outbreak, from its first recognition in the early 1980’s to the situation as of 2006. An extensive bibliography is included.
Gallo, Robert. Virus Hunting. New York: Basic Books, 1991. Autobiography of one of the scientists considered to have discovered HIV. Includes his own (albeit arguably biased) description of the discovery.
Gallo, Robert, and Luc Montagnier. “AIDS in 1988.” Scientific American 259, no. 4 (October, 1988): 40-48. Description of the recognition of HIV and its association with the disease, by the two scientists most closely linked with the discovery.
Montagnier, Luc. Virus. New York: W. W. Norton, 2000. Autobiography of the other scientist with a claim to having discovered HIV.
Shilts, Randy, and William Greider. And the Band Played On: Politics, People, and the AIDS Epidemic. New York: St. Martin’s Press/Stonewall Inn Editions, 2000. Updated description of the outbreak of the AIDS epidemic and how the lack of recognition by agencies contributed to its spread. Shilts was a newspaper reporter who later succumbed to the illness.
Stine, Gerald. AIDS Update, 2007. San Francisco: Benjamin Cummings, 2007. Yearly update on research into the AIDS virus, as well as information about biological events that follow infection. Discussion about the progress of treatment is also included.
Richard Adler
See also
ACT UP; AIDS Memorial Quilt; Cancer research; Fetal medicine; Genetics research; Health care in the United States; Homosexuality and gay rights; Hudson, Rock; Medicine; White, Ryan.
■ AIDS Memorial Quilt
Identification
A community art project honoring those killed by AIDS in the United States
Date Begun in 1987 by the NAMES Project Foundation
The AIDS Memorial Quilt was conceived by a group of San Franciscans to honor and remember the citizens of San Francisco who had died of AIDS since 1981. The project became much larger, as people all over the country contributed to, viewed, and were memorialized by the quilt.

The AIDS Memorial Quilt was conceived during the November, 1985, candlelight vigil marking the anniversary of the 1978 assassinations of San Francisco
mayor George Moscone and openly gay San Francisco supervisor Harvey Milk. That year, vigil organizer Cleve Jones asked participants to write on large placards the names of friends and partners who had been claimed by the acquired immunodeficiency syndrome (AIDS) epidemic. More than one thousand San Franciscans had perished from the disease since it was first identified in 1981 by American medical scientists. Like many others, Jones was concerned that these people would be forgotten because of their homosexuality and the public fear of AIDS. Additionally, many of those who had died of AIDS had been abandoned by their biological families, and the remains of some had even been refused by mortuaries for proper burial and memorial services.

Resembling a patchwork quilt when posted together on a wall, the memorial placards inspired a larger project of connected, sewn, quilted panels that was subsequently administered by the nonprofit NAMES Project Foundation.
On June 25, 1988, nearly fifteen hundred panels of the AIDS Memorial Quilt are assembled in New York’s Central Park. (AP/Wide World Photos)
The quilt was composed of individual blocks; each block encompassed 144 square feet and comprised eight quilted panels measuring 3 feet by 6 feet. Made by friends, families, partners, or acquaintances, virtually all the panels honored the memory of an individual claimed by AIDS. They were generally displayed separately, with discrete groups of panels traveling simultaneously to different locations across the country. On October 11, 1987, the quilt was first displayed to the public in its entirety, on the National Mall in the District of Columbia. At the time, it comprised 1,920 panels and covered an area the size of a football field. This spectacular display, to be followed by larger Washington, D.C., displays in 1988, 1989, 1992, and 1996, effectively demonstrated on both an emotional and an intellectual level the magnitude of the global AIDS pandemic. The 1987 display was viewed by more than 500,000 people during a single weekend. Its popularity dramatically highlighted the official indifference of the Ronald Reagan administration to AIDS awareness, research, and treatment. Other large displays at various locations in the United States and Canada followed, supported by numerous affiliated chapters of the NAMES Project, including the Blue Light Candle Project in San Antonio, Texas, although Washington, D.C., remains the only place the quilt has been displayed in its entirety.

Impact
Administered by the NAMES Project Foundation in Atlanta, Georgia, the AIDS Memorial Quilt was recognized as the world's largest community art project. Through its public displays, the quilt has been effectively used to memorialize the deceased victims of AIDS while focusing global attention and awareness, for the living, on AIDS, HIV, intolerance, human rights, and medical services. Nominated for a Nobel Peace Prize in 1989, the quilt was the subject of a major film, Common Threads: Stories from the Quilt (1989), which was awarded the Academy Award for Best Documentary Feature in 1990.

Further Reading
Brown, Joe, ed. A Promise to Remember: The NAMES Project Book of Letters. New York: Avon Books, 1992.
Jones, Cleve, and Jeff Dawson. Stitching a Revolution: The Making of an Activist. San Francisco, Calif.: HarperSanFrancisco, 2000.
Remember Their Names: The NAMES Project Quilt, Washington, D.C., October 7-10, 1988. San Francisco, Calif.: NAMES Project, 1988.
Hayes K. Galitski
See also
ACT UP; AIDS epidemic; Health care in the United States; Homosexuality and gay rights; Reagan, Ronald; Torch Song Trilogy; White, Ryan.
■ Air India Flight 182 bombing
The Event Terrorist attack on a civilian airliner
Date June 23, 1985
The bombing of Air India Flight 182, from Canada to India, killed all 329 people on board. The event caused Canada to tighten airport security and reevaluate prevention policies on terrorism.

The bombing of Air India Flight 182 resulted from a religious and political power struggle between India's government and Sikh separatists. In the 1960's and 1970's, Indian Sikhs had appealed unsuccessfully to their government to create an independent state in Punjab, India, which would be called Khalistan. Later, in 1984, the Golden Temple of Amritsar, the Sikhs' holiest temple, was raided by the Indian army. Four months later, Sikh bodyguards retaliated by assassinating Indira Gandhi, India's prime minister. The movement for an independent Sikh state was supported by Sikhs around the world, especially in Canada, the United States, the United Kingdom, and Germany.

Check-in
The Air India Flight 182 bombing took place on June 23, 1985. The conspirators were also responsible for bombing Canadian Pacific Flight 003 to Tokyo on the same day. On June 20, 1985, two airline reservations were made in the names of M. Singh and L. Singh. M. Singh was ticketed on Canadian Pacific Flight 060, departing Vancouver for Toronto. He was wait-listed on Air India Flight 181/182 from Toronto to Delhi via Montreal. L. Singh was to board Canadian Pacific Flight 003 from Vancouver to Tokyo and to connect with Air India Flight 301 to Bangkok.

On the morning of June 22, 1985, a Canadian Pacific Airlines reservation agent received a phone call from an individual claiming to be M. Singh, who wanted to know if his flight was confirmed and if his luggage could be sent to Delhi on Flight 182 even if he remained on the waiting list. The reservation agent informed the caller that according to airline regulations, luggage could only be sent on confirmed flights.
The crew of the Irish patrol ship Emer searches in vain for survivors of the Air India Flight 182 bombing on June 24, 1985. (AP/Wide World Photos)
Around 7:50 a.m. Pacific daylight time, an individual arrived at the check-in desk in Vancouver with a ticket for M. Singh. The airline agent later recalled that the man was very insistent that his luggage be sent to Delhi even without a confirmed flight from Montreal to Delhi. After several heated words, the agent finally agreed to transfer the luggage from Canadian Pacific Flight 060 to Air India Flight 181/182. Later that day, an individual claiming to be L. Singh checked into Flight 003 to Tokyo. It was later established that the identifications of the individuals were not verified and that neither of the passengers boarded his flight.

The Explosion
The explosion occurred off the coast of Ireland at 7:14 a.m. Greenwich mean time. The aircraft relayed no distress signals. As soon as the plane went off radar, air traffic control initiated a marine rescue mission. The plane fell from an altitude of thirty-one thousand feet and sank about
sixty-seven hundred feet into the ocean. In theory, even if some passengers survived the detonation, they would have drowned once they entered the water. A total of 307 passengers and 22 crew members were killed.

In Tokyo, Canadian Pacific Airlines Flight 003 arrived at 2:37 p.m. local time (5:37 a.m. Greenwich mean time) from Vancouver. An hour later, approximately fifty-five minutes before the Flight 182 explosion, luggage handlers were removing baggage from Flight 003 when a bomb detonated, killing two baggage handlers and injuring four others. It was later established that the same conspirators were behind both explosions.

The Suspects
The Canadian police found four men to be the primary suspects in the bombings. They believed Talwinder Singh Parmar to be the leader of both operations.
Parmar was the Canadian leader of a militant Sikh separatist group called Babbar Khalsa. The focus of this violent organization was to establish Punjab, India, as an independent state for Sikhs. In 1983, Parmar was indicted in India for killing two Indian police officers. After he spent a year in a Canadian jail, however, the Canadian government refused to extradite him to India. Parmar was closely watched by the Canadian Security Intelligence Service (CSIS) as a result of various suspicious activities. Less than a month before the bombings, CSIS observed Parmar and Inderjit Singh Reyat enter the woods on Vancouver Island. The agents then heard a loud noise, which they assumed to be the pair firing guns. Shortly after the bombing, the Royal Canadian Mounted Police searched Parmar's and Reyat's dwellings and charged the men with possession of weapons and explosives, as well as conspiracy. Both men were released after paying fines. In 1992, Parmar was killed in a reported gunfire exchange in India. There was never sufficient evidence to charge Parmar with the Air India bombings, despite the Canadians' belief that he was the leader of the conspiracy.

Out of the four main suspects in the Air India bombings, Inderjit Singh Reyat was the only one convicted of the crimes. Reyat's involvement began when he started receiving phone calls from Parmar and others who were known to be Sikh extremists. In 1990, he was charged with manslaughter for the Tokyo Airport bombing and was found guilty and sentenced to ten years in prison. Reyat was later charged with murder in the Flight 182 case; he pleaded guilty to a lesser charge and was sentenced to five years in prison. Ripudaman Singh Malik and Ajaib Singh Bagri were also charged in the Air India bombings, but they were found not guilty on all counts.

Impact
The Air India Flight 182 bombing alerted people around the world to the potential implications of religious disputes in seemingly distant countries. It confirmed that in a decade witnessing the rise of globalism, terror too was becoming global, and it illustrated the dangers posed by terrorist organizations to neutral or third-party nations. The bombing forced the Canadian government to increase its security precautions at airports and elsewhere in the country, and it gave pause to other nations as well.

Subsequent Events
A new investigation of the Air India bombings was launched in 2005 by John Major,
a retired Canadian Supreme Court justice. The investigation's primary objectives were to evaluate the circumstances of the incident, to determine if justice was served by the trials of those accused, and to conclude whether the incident could occur again despite modern precautions.

Further Reading
Dobson, Christopher, and Ronald Payne. "The Golden Temple (India)." In The Never-Ending War: Terrorism in the 1980's. New York: Facts on File, 1987. Examination of Indian Sikh terrorism in the 1980's, especially in the wake of the raid on the Golden Temple; part of a general study of terrorism in the 1980's.
Laqueur, Walter. Terrorism: A Study of National and International Political Violence. Boston: Little, Brown, 1977. A comprehensive look at the intersections of nationalist and international politics and terrorism.
Netanyahu, Benjamin. Fighting Terrorism. New York: Farrar, Straus and Giroux, 2001. This overview of terrorism in the 1980's and 1990's provides details of terrorist activity in the Middle East, Canada, and the United States.
Tusty Zohra
See also
Air traffic controllers’ strike; Foreign policy of Canada; Immigration to Canada; Minorities in Canada; Pan Am Flight 103 bombing; Religion and spirituality in Canada; Terrorism.
■ Air pollution
Identification
Contamination of the air with human-generated gases or particles
Air pollution continued to be a serious environmental problem during the 1980's. The issue aroused a good deal of political debate, as the Reagan administration relaxed or failed to enforce some standards.

During the 1970's, the United States made progress in dealing with air pollution and other environmental problems. The Clean Air Act of 1970 gave the Environmental Protection Agency (EPA) authority to define and implement environmental standards. For the rest of the decade, the country made measurable progress, and air quality had improved by 1980. Nonetheless, much remained to be done.
Factories and power plants continued to emit pollutants such as sulfur dioxide (SO2) and nitrogen oxides. Automobile-induced smog contained nitrogen oxides as well as ozone and heavy metals.

Continuing and New Air-Pollution Issues
Several air-quality issues remained unresolved during the 1980's. Acid rain generated by SO2 and nitrogen oxide emissions from electric power plants remained an issue throughout the decade. The issue gradually captured public interest, as the damage acid rain was causing became apparent. In addition, automobile emissions continued to degrade air quality. In some areas, such as Los Angeles, air quality showed little improvement during the decade, although catalytic converters had some impact on automobile emissions. Increasing traffic volume combined with local topography in Los Angeles and elsewhere to make improving air quality a difficult task to accomplish.

During the 1980's, two additional aspects of air pollution started to become more noticeable. The harmful impact of chlorofluorocarbons (CFCs) on the ozone layer in Earth's stratosphere first became evident in the 1970's. By 1985, a hole in the ozone layer over the Antarctic was noted. The ozone layer intercepts ultraviolet solar radiation, which would otherwise harm many living creatures—including humans—when it reached the surface of the planet. In 1987, the United States and other nations signed the Montreal Protocol, which phased out CFCs.

Some scientists also noted the impact of carbon dioxide (CO2) emissions from the burning of fossil fuels in causing global warming. CO2 is often not classified as a pollutant, so global warming is technically not a species of air pollution. Without CO2 in the atmosphere, there would be no life on Earth. Plants would be unable to survive, because they depend on CO2 as a component of photosynthesis, and animals would run out of oxygen, because there would be no plants to replenish its supply in the atmosphere. Earth, moreover, would probably be too cold to be inhabited. However, the increased amount of CO2 in the atmosphere helped create a blanket effect known as the "greenhouse effect" that led to a potentially dangerous increase in global temperatures. During the 1980's, scientists were only beginning to raise the question of global warming, but it would become an increasingly important environmental issue in the years ahead.
The decade's two Republican administrations both denied the existence of global warming, so no governmental action was taken concerning this issue during the 1980's.

Republican Initiatives
Members of the Ronald Reagan administration sought to roll back environmental regulations they saw as harmful to competition or that concentrated too much power in the hands of the federal government. James G. Watt, the secretary of the interior, and Anne M. Gorsuch (later Burford), the head of the EPA, were critical of environmental laws and regulations and worked to weaken their impact. Reagan himself tried to appear more neutral than his appointees, a wise approach given that most Americans continued to be supportive of measures to improve air quality.

The Reagan administration applied strict cost-benefit standards to environmental regulations, including those governing air quality. For example, regulations designed to reduce SO2 emissions from coal-fired electric power plants were evaluated in terms of the potential costs to a company of implementing the regulations, including potential job losses that might result from higher operating costs. The cost-benefit analyses did not take into account environmental costs, such as the damage caused by SO2 when it mixed with water vapor in the atmosphere to produce acid rain. Nor did they take into account long-term costs to American business resulting from environmental damage or the potential of environmentally friendly policies to create new jobs and new business opportunities—an idea that was unheard-of in the 1980's, when environmentalism was widely seen as simply opposed to corporate success. Its narrow definition of costs and benefits led the Reagan administration to oppose all forms of command-and-control environmental regulation.

Instead of command-and-control environmental regulation, Reagan appointees to the EPA advocated market-based incentives to improve the environment. Using this approach, a company that decreased its emissions of regulated substances such as SO2 was eligible for tax abatements. This incentive approach was seen as more cost-effective than command and control. The effectiveness of an incentive-based approach to air pollution continues to be debated, but it did produce some successes. Overall, however, the Reagan administration's approach to air pollution was to ignore the problem as unimportant.
This approach changed somewhat after the resignation of Gorsuch as head of the EPA in 1983 and her replacement by William Ruckelshaus. For example, in September, 1983, Ruckelshaus proposed new regulations to tax SO2 emissions from power plants in New York, Pennsylvania, Ohio, and West Virginia to reduce SO2 emissions by 3.5 million tons. The proposal was defeated in the Cabinet Council on Natural Resources, as Watt, Energy Secretary Donald Hodel, and Office of Management and Budget Director David Stockman opposed the regulation as producing small environmental benefits at great cost. Lee Thomas, who succeeded Ruckelshaus in the second Reagan administration, also did more to enforce air-quality regulations.

George H. W. Bush, who succeeded Reagan after winning the election of 1988, portrayed himself as an environmentally friendly president. William Reilly, his appointee as head of the EPA, adopted a pro-environment stance that would soon be tested. Revisions of the Clean Air Act had been under consideration for more than a decade. During the 1980's, the inability of the government to deal with acid rain had become a symbol of the environmental failure of the Reagan administration. A set of amendments to the Clean Air Act was passed in 1990 specifically addressing SO2, as well as other longstanding problems of air pollution in urban areas.

Impact
Air pollution's pervasive and harmful nature was recognized increasingly in the 1980's, as pollution became both more intense and more widespread. Alongside this recognition, however, the belief persisted that strong regulations and enforcement would improve air quality only at the expense of the United States' corporate bottom line. Thus, neither the Reagan nor the Bush administration did much to clean up the air or to prevent its further pollution. Air pollution remained an unresolved environmental problem at the end of the decade.

Further Reading
Bryner, Gary C. Blue Skies, Green Politics: The Clean Air Act and Its Implementation. Washington, D.C.: CQ Press, 1995. Details the enforcement of the Clean Air Act over time.
Freedman, Barry D. Regulation in the Reagan-Bush Era: The Eruption of Presidential Influence. Pittsburgh, Pa.: University of Pittsburgh Press, 1995. Provides an analysis of the Reagan-Bush environmental record.
Peirce, J. Jeffrey, Ruth F. Weiner, and P. Aarne Vesilind. Environmental Pollution and Control. 4th ed. Boston: Butterworth-Heinemann, 1998. Several chapters detail the science of air pollution.
Turco, Richard P. Earth Under Siege. New York: Oxford University Press, 1997. Excellent, comprehensible introduction to the topic of air pollution.
John M. Theilmann
See also
Environmental movement; Ozone hole; Reagan, Ronald; Water pollution; Watt, James G.
■ Air traffic controllers' strike
The Event
An unlawful strike by government employees
Date August 3, 1981
Almost thirteen thousand air traffic controllers went on strike, violating the terms of their contracts as well as Civil Service laws. When more than eleven thousand of them refused a presidential ultimatum to return to work, the striking workers were fired and replaced. The president's strategy weakened other government-employee unions and contributed to the weakening of organized labor in general during the 1980's.

Collective bargaining between the Professional Air Traffic Controllers Organization (PATCO) and the Federal Aviation Administration (FAA) began in February, 1981, shortly after President Ronald Reagan was inaugurated. At issue were three major concerns: high-stress working conditions, wages, and retirement benefits. In 1946, Congress had banned strikes by federal employees, and in 1955 it had passed laws that made such strikes a crime punishable by fine or one year in prison. However, in 1968, 1969, 1970, 1974, 1975, and 1978, PATCO had circumvented the law and pressured the government by conducting slowdowns—in which workers intentionally decreased their efficiency—and sick-outs—in which a majority of workers simultaneously called in sick. This strategy had secured for the union's members increased pay and benefits, ongoing training, and new equipment.

In early negotiations in 1981, PATCO president Robert E. Poli presented a list of ninety-six demands that collectively would have cost the government more than $1 billion to satisfy.
FAA administrator J. Lynn Helms rejected the proposal as too costly. Negotiations continued until May 22, 1981, when Poli submitted PATCO's final proposal, calling for an annual wage increase of $10,000 for all controllers, plus cost-of-living increases of 1.5 percent above the rate of inflation every six months and a reduction in the workweek from forty to thirty-two hours with no corresponding reduction in pay. PATCO sought higher pension benefits as well. The existing benefit was a payment equal to 50 percent of base pay for controllers who retired at age fifty or older after twenty years of employment, or at any age after twenty-five years of employment. PATCO wanted an increase to 75 percent of base pay for workers who retired after twenty years regardless of age. Poli informed Helms that PATCO members would strike within thirty days if the government did not offer an acceptable package. Secretary of Transportation Andrew L. Lewis, Jr., replaced Helms as the government's chief negotiator, in the hope that he could mediate the stalemate.
Just prior to the June 22 deadline, Lewis offered PATCO a $40 million package, including a shorter workweek, an across-the-board raise of $4,000, a 10 percent pay hike for controllers who served as instructors, a 20 percent pay differential for nighttime work, guaranteed thirty-minute lunch-period breaks, and increased retirement benefits. After intense bargaining, Lewis also agreed to provide retraining benefits for medically disqualified controllers and a time-and-a-half pay rate for all hours beyond thirty-six in a forty-hour workweek. After these concessions, Poli agreed to present the settlement to PATCO members for a vote. The package was rejected by 95 percent of the union's 17,500 members. New talks began on July 31, 1981, with PATCO proposing a package that Poli claimed would cost $500 million. The FAA's computations placed the package's cost at $681 million, seventeen times that of the earlier settlement that union members had voted down.
Air traffic controllers picket an air traffic control center in Ronkonkoma, New York, on August 5, 1981. (AP/Wide World Photos)
Negotiations reached an impasse, and on August 3, during the peak summer travel season, when union members believed that the government would have no choice but to yield in order to save the economy, about 12,660 PATCO members went out on strike.

Response to the Strike
President Reagan declared the air traffic controllers' strike illegal and ordered the controllers to return to work within forty-eight hours or face permanent loss of their jobs. Only 1,260 returned to work, so the government fired the other 11,400 strikers and began accepting applications for new workers and trainees. More than 45,000 applicants responded. Training centers increased enrollments, offering classes in two or three shifts per day. While the new recruits were trained to replace the terminated controllers, the government needed to employ temporary replacements to keep the airlines running safely. The ranks of the two thousand nonstriking controllers were supplemented with three thousand supervisors, nine hundred military controllers, and one thousand controllers from sixty small airports that were closed temporarily. The FAA ordered airlines at major airports to reduce scheduled flights by one-half during peak hours and to maintain five miles between all aircraft, instead of the ordinary one-half mile, to ensure safety. Within five days, the airports were operating at 80 percent of their normal capacity.

Federal judges ordered the arrest of the PATCO leaders who had ignored federal court injunctions against the strike, and they levied fines of $1 million per day against the union. The Justice Department brought indictments against seventy-five controllers. On October 22, 1981, the Federal Labor Relations Authority (FLRA) decertified PATCO. Later, on June 19, 1987, the FLRA certified the National Air Traffic Controllers Association as sole bargaining union for air traffic controllers.

Public response overwhelmingly favored the government. While sympathetic to the stress of the controllers' jobs, taxpayers were aware that air traffic controllers' pay in 1981 was two to three times the national average salary. The public also realized that if the controllers' strike succeeded, U.S. Postal Service employees would expect similar concessions, as would other federal employees, which would add billions of dollars to the federal budget.
The strikers drew support from AFL-CIO labor leaders, but airline pilots and machinists did not join the strike. Some foreign unions supported PATCO by causing delays in flights to the United States. Canada closed Gander International Airport, in Newfoundland, to U.S. aircraft but reopened it immediately when President Reagan threatened to end U.S. air service to Gander permanently.

Impact
The success of President Reagan’s response to the air traffic controllers’ strike decisively shifted the balance of power in labor disputes between the federal government and its employees. The controllers’ union was iconic, because its members’ jobs both required highly specialized skills and were manifestly essential to the nation’s infrastructure. If their strike could be broken, it seemed that any strike could be. The major airlines reported losses of up to $30 million per day during the strike, but the FAA implemented a strike contingency plan that lessened potential losses and allowed commercial and military planes to remain in the air. Many labor leaders and historians have said that the failure of PATCO’s strike contributed to the decline in the power of labor unions over the next two decades.
Further Reading
Noonan, Peggy. When Character Was King: A Story of Ronald Reagan. New York: Viking, 2001. Sheds light on the effect of President Reagan's decisive handling of the strike on foreign relations.
Nordlund, Willis J. Silent Skies: The Air Traffic Controllers' Strike. New York: Praeger, 1998. Discusses the power of labor unions to affect the economy and analyzes the relationship between unions and public policy.
Round, Michael A. Grounded: Reagan and the PATCO Crash. Rev. ed. New York: Routledge, 1999. Examines federal statutes, particularly the no-strike laws, and the relevance of President Reagan's rhetorical background in relation to PATCO and FAA negotiations and the strike.
Marguerite R. Plummer
See also
Canada and the United States; Reagan, Ronald; Reaganomics; Unions.
■ Airplane!
Identification Disaster film spoof
Directors Jim Abrahams (1944- ), David Zucker (1947- ), and Jerry Zucker (1950- )
Date Released July 2, 1980
Airplane! gleefully attacked the underlying Hollywood conventions of narrative realism in addition to parodying specific cinematic genres. Its success confirmed the box-office potential of such spoofs.
Written and directed by the team of Jim Abrahams, David Zucker, and Jerry Zucker, Airplane! was a comedy hit. The movie is a parody of airplane disaster movies from the 1970's, such as Airport (1970) and Airport 1975 (1974). Though most viewers at the time recognized those references, the movie actually uses the 1957 movie Zero Hour for most of its source material and plot. A young man who flew a disastrous mission in a war has to take over as pilot for a commercial jetliner when the crew succumbs to food poisoning. Though the plot is potentially serious, the movie's handling of the material embraces an over-the-top comedic approach.

The movie fills almost every minute of screen time with some type of joke. These range from offbeat visual images, such as a ticket for a seat in the plane's smoking section (a "smoking ticket") actually smoking, to stupid verbal jokes like "Surely you can't be serious!" "I am serious . . . and don't call me Shirley." "Don't call me Shirley" became one of the many running gags used throughout the movie. Viewers also enjoyed catching references to a variety of other movies throughout the film, such as Jaws (1975) and Saturday Night Fever (1977).

The movie starred Robert Hays and Julie Hagerty as the young pilot and the stewardess who must overcome their relationship problems and past history to land the plane safely. However, the movie also had an all-star list of actors known for their dramatic roles, particularly on television. Lloyd Bridges, Peter Graves, Robert Stack, and Leslie Nielsen all had prominent roles in the film, and part of the movie's humor was seeing these serious actors doing comedy and making fun of themselves. The movie created a new career for Leslie Nielsen, who went on to star in the parody television series Police Squad! and the spin-off Naked Gun movies. Los Angeles Lakers basketball star Kareem Abdul-Jabbar also starred as the co-pilot—and broke character during the movie to play himself playing the co-pilot.

Kareem Abdul-Jabbar, left, and Peter Graves in a scene from Airplane! (AP/Wide World Photos)

Impact
Airplane! established a viable market for outrageous parodies that made fun of specific movies or genres while also rejecting any semblance of realism. The trend continued with movies such as Top Secret! (1984), The Naked Gun: From the Files of Police Squad! (1988), Scary Movie (2000), and others. The movie was nominated for a Golden Globe Award in the best musical or comedy category and won a Writers Guild of America Award for Best Screenplay Adapted from Another Medium. In 2000, members of the American Film Institute voted Airplane! as number ten on the list of the one hundred funniest movies of the twentieth century. In 2005, the American Film Institute also voted the "Don't call me Shirley" line number seventy-nine of the one hundred best movie quotes.

Further Reading
Gehring, Wes D. Parody as Film Genre: "Never Give a Saga an Even Break." Westport, Conn.: Greenwood Press, 1999.
Karnick, Kristine Brunovska, and Henry Jenkins, eds. Classical Hollywood Comedy. New York: Routledge, 1995.
Rickman, Gregg, ed. The Film Comedy Reader. New York: Limelight Editions, 2001.
P. Andrew Miller
See also
Comedians; Film in the United States.
■ Aliens
Identification Science-fiction film sequel
Director James Cameron (1954- )
Date Released July 18, 1986
Aliens capitalized on the success of the 1979 film Alien, cementing James Cameron's reputation as a reliable director of science-fiction films and beginning his interest in using women as action heroes in his movies.

The original Alien, directed by Ridley Scott, was a surprise hit in 1979. With no well-known stars and a clichéd premise—space travelers menaced by an alien monstrosity—the movie impressed audiences with its realistic depiction of the life cycle of an extraterrestrial species and surprising plot twists. Characters who usually functioned as the heroes of such stories—the stalwart captain, the no-nonsense science officer—died early in the film, and a character tangential to the early scenes, the protocol-conscious second-in-command Ellen Ripley, emerged as the film's heroine. This skillful inverting of science-fiction tropes by screenwriter Dan O'Bannon delighted viewers, and Sigourney Weaver's portrayal of Lieutenant Ripley immediately established her as a star.

When the decision was made to make a sequel, director James Cameron and his scenarists, David Giler and Walter Hill, were faced with a difficult task: to fashion a worthy successor to a science-fiction film that had succeeded by confounding its viewers' expectations. They audaciously decided not to replicate the genre-bending tendencies of the original but to do the reverse: make a conventional science-fiction action picture in which Ripley would return to the lair of the alien with a squad of Marines to fight the monsters with a staggering array of futuristic weapons. However, Cameron wisely maintained two striking elements of the original film: the dual themes of female empowerment and parenting.
The cover of the July 28, 1986, issue of Time magazine featured Sigourney Weaver and the alien queen, from Aliens. (Hulton Archive/Getty Images)
Weaver's Ripley in Aliens is even bolder and more resourceful than she was in the original. She is wiser and more realistic in her outlook, having internalized the lessons of her first alien encounter. She is joined by other strong women among the Marines, especially Private Vasquez (Jenette Goldstein). In the original Alien, Ripley was portrayed as a positive maternal figure, while the ship's computer, called "Mother" by the crew, failed its "children," particularly Ripley herself, at key moments. In contrast, Ripley risked her life to save the ship's cat and in the final scene was depicted in a Madonna-and-child pose with the feline. Similarly, in Aliens, Ripley fights for the safety of the child Newt against a "queen-mother" alien intent on spawning an ever-increasing number of predatory offspring.

A perennial problem for scriptwriters of science-fiction and horror sequels is motivating the protagonists to return to a situation in which they previously suffered incredible dangers.
Why would an astronaut return to a planet where she knows hideous monstrosities lurk? Cameron and his colleagues astutely tied this tricky plot element to the delineation of Ripley's character in Alien. Ripley agrees to return to the aliens' world only after she learns that the planet has been colonized by settlers who are in danger. Her decision to save them is wholly consistent with the courage and sense of self-sacrifice Ripley displayed in the original.

Impact
Aliens illustrated two important cinematic trends of the 1980's. It demonstrated Hollywood's growing tendency to turn any successful picture into a franchise, whereas sequels in the past were typically associated primarily with inexpensive "B-movies." Also, it helped establish that the viewing public would accept women as leads in action films by proving clearly that Sigourney Weaver's success in the original was no fluke.

Further Reading
Cameron, James. Aliens: Foreword and Screenplay. London: Orion, 2001.
Clute, John, and Peter Nicholls. The Encyclopedia of Science Fiction. New York: St. Martin's Press, 1995.
Hardy, Phil. The Overlook Film Encyclopedia: Science Fiction. Woodstock, N.Y.: Overlook Press, 1994.
Thomson, David. David Thomson on the "Alien" Quartet. New York: St. Martin's Press, 1999.
Thomas Du Bose
See also
Action films; Feminism; Film in the United States; Science-fiction films; Sequels; Special effects; Terminator, The; Weaver, Sigourney.
■ Alternative medicine
Definition
Holistic medical practices that address mental, spiritual, and environmental factors, as well as physical ones, to treat and prevent illness
A convergence of political, economic, social, and religious movements caused increasing interest in and a revival of alternative medicine during the 1980's, signaling a change in health care practices. Alternative medicine represented a reaction to the practices of mainstream medicine, which stressed the need to diagnose and treat disease rather than its underlying causes.

More than 60 million Americans have relied on alternative forms of medicine, such as folk healing,
unorthodox fitness and diet programs, acupuncture, chiropractic, and self-help treatments. Public interest first began turning to alternative medical methodology in reaction to bleeding and purging, common methods used by physicians in the eighteenth century. Throughout the next century, a grassroots movement arose that challenged traditional medical practices. The American Medical Association (AMA) began to respond to the practice of alternative medicine by promoting licensure laws in each state by the end of the nineteenth century. The AMA then commissioned Abraham Flexner to conduct a study and issue a report on medical institutions. The Flexner Report (1910) criticized the lax educational standards that prevailed in most of the medical schools and institutions that offered alternative methods of healing. The report recommended imposing rigorous training and establishing uniform guidelines for all schools that offered medical training. The result of rising standards was a backlash against alternative medicine in the United States, and by the end of World War II, the medical profession perceived alternative practices as quackery.

The Rise of Alternative Medicine
The acceptance of alternative medicine began to increase again in the 1960's alongside the countercultural search for inner tranquillity and self-knowledge. As the counterculture movement gained momentum, the public perceived the medical profession as a bastion of the establishment, supporting economic inequality and the greed of corporate America. The social unrest of the time, characterized by the resentment of the military-industrial complex, the Vietnam War, and the stratification of society, coincided with the environmental movement. Environmentalism's concern over the impact of pollution on the planet was transferred to the human body as well, as physical disease came to be linked to environmental hazards.

By the late 1970's and early 1980's, three forces were at work that elevated the status of alternative medicine in the United States. Pentecostalism, the rise in consumerism, and holistic healing emerged as powerful societal forces. Evangelists such as Oral Roberts, Pat Robertson, Jimmy Swaggart, Jim Bakker, and Kenneth Copeland professed divine healing to devoted followers via television and radio. Roberts established the City of Faith Hospital at his university in 1980,
a research hospital that emphasized the healing power of prayer and forms of alternative medicine such as naturopathy, homeopathy, and osteopathy. Oral Roberts University closed the hospital in 1989 because of funding issues. In addition, the concept of treating the person in a holistic manner acknowledged the spiritual and psychic components of healing. Americans began to show interest in Eastern medical practices such as acupuncture.

The rising cost of health care also altered the ways in which medical practitioners and hospitals offered services in the 1980's. The growing concern for patient rights was best illustrated by the rise in family medicine as an area of specialization. Central to this concern was the belief that patients were responsible for their own health care. The new wellness model was one of a dynamic process in which doctors and patients were actively engaged in the prevention of disease through lifestyle change. However, the majority of conventional physicians maintained that the alternative-medicine movement was a fad, stating that wellness educators lacked qualifications and certification.

Impact
Some scholars contend that the alternative-medicine movement represented a return to tribalism; people wanted to converse with nature and create harmony in their lives. Traditional Western medicine was often able to cure disease and prolong life with technological tools, but it removed both illness and cure from the emotional and psychological contexts in which they occurred. This isolation of disease from the broader context of a patient's life prevented Western physicians from treating individuals in a holistic fashion. However, by the end of the 1990's, unconventional therapies began to receive acceptance in the medical profession with the improvement of certification for those practicing alternative medicine. Congress even relented to public demand, renaming the Office of Alternative Medicine the National Center for Complementary and Alternative Medicine in 1998.

Further Reading
Gevitz, Norman, ed. Other Healers: Unorthodox Medicine in America. Baltimore: Johns Hopkins University Press, 1988. Gevitz and eight other writers provide scholarly analyses of the trends, practices, and perspectives in alternative medicine from 1800 to 1985.
Grossinger, Richard. Planet Medicine: From Stone-Age Shamanism to Post-Industrial Healing. Garden City, N.Y.: Anchor Books, 1980. The author explores the psychological, spiritual, and cultural origins of healing and the rise of the alternative medicine movement in the United States.
Novey, Donald W. Clinician's Complete Reference Guide to Complementary and Alternative Medicine. St. Louis, Mo.: Mosby, 2000. Provides information, suggested readings, and Internet resources for sixty-four forms of alternative treatment; written by ninety practitioners of those therapies.
Sobel, David S., ed. Ways of Health: Holistic Approaches to Ancient and Contemporary Medicine. New York: Harcourt Brace Jovanovich, 1979. Twenty essays advocate a holistic approach to healing, contending that technical advances in conventional medicine can be successfully integrated with unorthodox practices.
Whorton, James C. "The History of Complementary and Alternative Medicine." In Essentials of Complementary and Alternative Medicine, edited by Wayne B. Jonas and Jeffrey S. Levin. Baltimore: Lippincott Williams & Wilkins, 1999. Whorton traces the historical developments and movements in American complementary medicine from the eighteenth through the twentieth centuries.
_______. Nature Cures: The History of Alternative Medicine in America. New York: Oxford University Press, 2002. Traces the history of changing medical and popular views toward medicine over the past two centuries.
Gayla Koerting
See also
Health care in the United States; Psychology; Religion and spirituality in the United States; Televangelism.
■ America's Most Wanted
Identification Nonfiction television series
Date Began airing in 1988
FOX network's long-running and popular series profiled missing persons and suspects wanted for committing violent crimes and asked viewers to provide information leading to their recovery or capture.

America's Most Wanted first aired on February 7, 1988, on the fledgling FOX network.
The show used reenactments of actual events to dramatize violent crimes and profiled the crimes' perpetrators with the goal of gaining information from viewers that would lead to the criminals' arrests. Although sometimes criticized for sensationalizing crime and obscuring the distinction between law enforcement and entertainment, America's Most Wanted succeeded in its purpose: In the show's first eighteen years on the air, more than nine hundred of its featured criminals were apprehended. Successful captures were also chronicled on the show, which provided updates on previous cases and interviews with victims, family members, and individuals who knew the fugitives both prior to and after their apprehension.

Law-enforcement officials were initially skeptical about the potential effectiveness of a show like America's Most Wanted. However, within days of the first broadcast, David James Roberts, a fugitive on the Federal Bureau of Investigation's Most Wanted list, was captured as a direct result of information gained through the show. In another well-known case, John Emil List, who murdered his wife, mother, and three children in Westfield, New Jersey, in 1971, was captured in 1989 only eleven days after his case aired on America's Most Wanted. New Jersey authorities had been unable to locate List for eighteen years.

America's Most Wanted was hosted by anti-crime activist John Walsh. Walsh was selected because of his personal passion, dedication, and previous media exposure following the 1981 abduction and murder of his six-year-old son, Adam Walsh. Adam's murder was never solved; the prime suspect, Ottis Toole, died in prison while serving time for different crimes, and he was never conclusively linked to Adam's death. In response to the ordeal of their son's murder, Walsh and his wife, Revé, formed the Adam Walsh Child Resource Center, a nonprofit organization dedicated to reforming legislation regarding missing and exploited children. Adam Walsh's story was the subject of two made-for-television movies, Adam (1983) and its sequel, Adam: His Song Continues (1986).

Impact
America's Most Wanted was one of the first of a new generation of "reality-based" television programs and was directly responsible for the inception and success of COPS, which began airing soon after. It was also FOX network's first show to break into the Nielsen ratings' weekly top fifty, and it was extremely lucrative for the young network, as its production costs were very low.
Further Reading
Breslin, Jack. "America's Most Wanted": How Television Catches Crooks. New York: Harper & Row, 1990.
Walsh, John. Tears of Rage: From Grieving Father to Crusader for Justice—The Untold Story of the Adam Walsh Case. New York: Pocket Books, 1997.
Alan C. Haslam
■ Anderson, Terry Identification
American journalist held captive in Lebanon Born October 27, 1947; Lorain, Ohio Anderson was one of a group of hostages seized by the paramilitary organization Hezbollah during the Lebanese Civil War. He was held the longest—2,455 days—becoming the face of the hostages and a filter through which Americans interpreted the Lebanese conflict. Terry Anderson was the Middle East bureau chief for the Associated Press when he was abducted by terrorists on March 16, 1985. Born in Ohio and raised in New York, Anderson served two tours of duty as a U.S. Marine in the Vietnam War. After his 1970 discharge, he attended college at Iowa State University, graduating in 1974 with a B.A. in journalism and political science. After working in radio and television news in Des Moines, he became an editor for the Ypsilanti Post in Michigan, then state editor, foreign-desk editor, broadcast editor, Tokyo correspondent, South Africa correspondent, Middle East news editor, and chief Middle East correspondent for the Associated Press. Anderson moved to Beirut in 1983. Two years later, he was returning from his regular Saturdaymorning tennis game when he was kidnapped on the street, put in a car trunk, and taken to an unknown location—the first of more than fifteen sites where he would be imprisoned. During his captivity, he was first kept in isolation then jailed with a group of other hostages. He was beaten and tortured then given materials so he could write. He was repeatedly led to believe his release was imminent then moved by his captors. His frustration grew so great that he once banged his head against a wall until it bled.
The Eighties in America
Former hostage Terry Anderson arrives in Wiesbaden, Germany, where he is met by his sister, Peggy Say. (AP/Wide World Photos)
During the 1980's, Anderson was one of seventeen American hostages held in Lebanon by Hezbollah, a radical Shiite group seeking to expel Western and Israeli occupation forces from the country. A number of Europeans, including Anglican Church envoy Terry Waite, were also held at the same time. Anderson never stopped resisting: He created games out of materials at hand, learned French from hostage Thomas Sutherland—an administrator with American University in Beirut—and renewed his faith with help from hostage Lawrence Jenco, a Roman Catholic priest. Many people worked to free Anderson, including family members (especially his sister, Peggy Say), officials with the Ronald Reagan and George H. W. Bush administrations, his employers, and many fellow journalists, hundreds of whom petitioned Iran's Ayatollah Ruhollah Khomeini, who was thought to have influence with the Shiites holding Anderson.

Impact
Perhaps as a result of the worldwide attention paid to Anderson's plight, kidnapping of Westerners in the Middle East seemed to decline a bit or simply to get less media attention. If the former, it could be because of the rise of governmental and corporate task forces to deal with hostage taking, also a possible consequence of the years of attention that Anderson's imprisonment received. The trend continued until the war in Iraq of the early twenty-first century opened a new chapter in Middle Eastern hostage taking. At any rate, Anderson's release, survival, and growth while a prisoner all were remarkable in the 1980's—an era Time magazine called "the decade of hostages."

Subsequent Events
Anderson was released on December 4, 1991, when he faced reporters in Damascus, Syria, and said, "I'll try to answer a few questions, although you'll understand I have a date with a couple of beautiful ladies and I'm already late." He was reunited with his fiancée, Madeleine Bassil, and their daughter Sulome, born three months after his abduction.
Anderson wrote a memoir of his ordeal, Den of Lions (1993), and taught for a time at Columbia University in New York and Ohio University in Athens, Ohio. He later won a lawsuit against the Iranian government, which was thought to have supported Hezbollah, and in 2002 won millions of dollars from frozen Iranian assets held in the United States. With those proceeds, he launched several charities, including the Father Lawrence Jenco Foundation and a program to build schools in Vietnam. In 2004, Anderson ran as a Democrat for Ohio's Twentieth Senate District, losing to an appointed incumbent, Joy Padgett, whose campaign accused Anderson of being "soft on terrorism."

Further Reading
Anderson, Terry A. Den of Lions: Memoirs of Seven Years. New York: Crown, 1993.
Masland, Tom, Jennifer Foote, and Theresa Waldrop. "How Terry Survived." Newsweek, December 16, 1991.
Say, Peggy. Forgotten: A Sister's Struggle to Save Terry Anderson, America's Longest Held Hostage. New York: Simon & Schuster, 1991.
Weir, Ben. Hostage Bound, Hostage Free. Philadelphia: Westminster Press, 1987.
Bill Knight
See also
Beirut bombings; Middle East and North America; Terrorism.
■ Androgyny
Definition
The mixture of traditionally masculine and feminine traits in the same individual
During the 1980's, androgyny became more common as an avenue of expression for popular singers and musicians, as a political statement of equality, and as a fashion statement.

"Androgyny," from the Greek andro (for "man") and gyne (for "woman"), has been most commonly defined as the merging of feminine and masculine gender traits. It also has been equated with sexual ambiguity, gender ambiguity, hermaphroditism, homosexuality, bisexuality, transsexuality, and cross-dressing. Native American cultures have a tradition of reverence for the gender-blending "two-spirit person." In most Western cultures, stable gender identities have been strongly encouraged by the cultures'
dominant ideologies, so androgyny usually has been considered to be deviant, subversive, or both. As a result, several movements in the 1980's—both pop cultural and political—seized upon androgyny as a symbol or expression of alternative culture.

Early Representations
An article in the journal Family Relations in 1980 announced that a trend to watch in the coming decade was the emergence of "more androgynous people." Androgyny had existed in multiple forms for hundreds of years, but the journal had noticed a distinctive increase in the mainstream representation of androgyny and its incorporation into fashion and other aspects of popular culture.

In the early twentieth century, psychologist Carl Jung argued that blending gender characteristics—the female archetype anima with the male animus—was essential to a person's psychological and social growth. Jung's ideas and the ideas of other psychologists continued through the century and were especially relevant to writers and researchers in the 1970's and 1980's who began to argue for an androgynous ideal for women and men. Literary critic and novelist Carolyn Heilbrun, in Toward a Recognition of Androgyny (1973), and psychologist Sandra Bem, in her journal article "The Measurement of Psychological Androgyny" (1974), were among the first to introduce androgyny as a subject worth studying. Heilbrun surveyed literature for examples of androgynous characters and themes, and Bem developed the Bem Sex Role Inventory to measure self-conceptions of femininity and masculinity. Also, Bem showed that androgynous persons blend characteristics of both, and she argued that androgynous individuals are not "sex-typed"; that is, they express their gender based on a given situation rather than on culturally prescribed gender roles.

Political Equality
Some American feminists of the 1970's and 1980's believed that women would be accepted as equals in the workplace if they took on "ideal" masculine behaviors: dressing in power suits, not showing emotions, taking chances, and so forth. Men, too, it was argued, could benefit from taking on what were considered ideal feminine characteristics: empathy and care, nurturance, emotional expressiveness, cooperation, and the like.
ing” of the working women of the decade. Women on both shows wore tailored business suits made with broad, padded shoulders; loud but tasteful jewelry; and hairstyles that required much hair spray to remain in place. Women wore suits to assert their new feminine masculinity—or masculine femininity. By blending gender traits, proponents of the new style argued, an androgynous person would embody the best of both genders and express even a “third” gender. However, the key gender remained the masculine one, a fact that did not slip by critics of androgyny. Many called not for embracing androgyny but for thinking up and then embodying new sorts of gender expression that resisted a choice between “masculine” and “feminine” entirely. Popular Culture Androgyny in the 1980’s—in addition to being embraced by the fashion world, women in the workplace, and many feminists—was represented most profoundly in popular culture. The gender-bending singers and musicians of “glam” or “glitter” rock, whose popularity peaked in the early 1970’s with such performers as Lou Reed, Alice Cooper, Suzi Quatro, Freddie Mercury, and David Bowie, inspired the androgynous fashions of the late 1970’s and early 1980’s. Gender-bending was incorporated by rockers Kiss and early punk and alternative rockers, including Iggy Pop, the Cure, and Siouxsie and the Banshees. The 1980’s saw androgyny epitomized by singers such as Boy George, Adam Ant, Prince, Michael Jackson, Annie Lennox, and Grace Jones (who also helped popularize power dressing and short hairstyles). Following in the footsteps of the early 1980’s gender-bending pop singers were heavy metal bands, including Bon Jovi, Poison, Mötley Crüe, and Twisted Sister, whose members wore tight spandex pants and heavy makeup and had “big” hair, contributing to a highly stylized rock aesthetic that seemed contrary to the genre’s masculine antics and lyrics. This heavy metal aesthetic was not unlike the glam rock of the 1970’s, and its androgynous leanings were often criticized as too feminine. Indeed, groups such as Bon Jovi would be categorized in a genre called “hair metal” because of their prettified hair. Country-folk singer k. d. lang, who arrived on the music scene with her first successful album in 1987, was unapologetically androgynous, even as she faced a country music market of mostly socially conservative consumers. Within a few years, she came out as
lesbian. Folk singer Sinéad O’Connor was shockingly—for the time—bald, and she wore what many considered “formless,” less-than-feminine clothing. In the world of fashion, many supermodels defined the androgynous look by moving away from a more curvaceous body style to a more boyish one. The top designers created a look that was already popular among singers of the time. Because models were becoming celebrities in their own right, what they wore quickly became fashionable. This genderbending caught on in the worlds of fashion and art photography as well and manifested most popularly in advertising—which reached consumers hungry for a new look. Impact By the end of the 1980’s, gender-bending would see a surge in popularity among youth who embraced androgyny as a political identity and began naming themselves “genderqueer” instead of “androgynous.” Among college and university students, gender roles and sexuality became more fluid, and one could argue that the mainstream popularity of androgyny in the 1980’s led to the beginning of a breakdown of personal barriers, allowing some to embrace alternative gender expressions. Androgyny remained a fashion statement as well for a time, although it later subsided. The rise of hip-hop helped eclipse androgynous styles in popular music, and music videos of later decades tended to express a more culturally acceptable hyperfemininity and hypermasculinity. Further Reading
Bullough, Bonnie, Vern L. Bullough, and James Elias, eds. Gender Blending. Amherst, N.Y.: Prometheus Books, 1997. Covers androgyny in its multiple forms, including gender-bending, transgender, transsexuality, and cross-dressing. The editors are well-known researchers in the study of sexuality and gender.
Celant, Germano, and Harold Koda, eds. Giorgio Armani. New York: Guggenheim Museum, 2000. Overview of the designs of Giorgio Armani, with discussion of the play of androgyny and gender in his clothing styles of the 1980's.
Heilbrun, Carolyn G. Toward a Recognition of Androgyny. 1973. New ed. Bridgewater, N.J.: Replica Books, 1997. Heilbrun surveys the literature from classical times to the late twentieth century to find literary and mythical references to androgyny.
Rubinstein, Ruth P. Dress Codes: Meanings and Messages in American Culture. Boulder, Colo.: Westview Press, 1995. Explores the world of "dress codes" and what they mean in American culture. Includes the chapters "The Image of Power" and "Gender Images."
Simels, Steven. Gender Chameleons: Androgyny in Rock 'n' Roll. New York: Arbor House, 1985. A look at the ever-evolving androgyny of rock singers.
Whiteley, Sheila, ed. Sexing the Groove: Popular Music and Gender. New York: Routledge, 1997. Includes chapters on k. d. lang and Sinéad O'Connor.
Desiree Dreeuws
See also Bon Jovi; Boy George and Culture Club; Fads; Fashions and clothing; Feminism; Hairstyles; Heavy metal; Homosexuality and gay rights; Hurt, William; Jackson, Michael; Kiss of the Spider Woman; Mötley Crüe; Music; New Wave music; Pop music; Power dressing; Torch Song Trilogy; Women in rock music; Women in the workforce.
■ Apple Computer
Identification Innovative computer manufacturer
In the early 1980's, Apple created the mass market for personal computers with the Apple II. Faced with competition from IBM, the company marketed the Macintosh, whose graphical user interface changed the way people interacted with computers and whose WYSIWYG and printing capabilities made possible the desktop publishing industry. A paradigm of the technology firm marching from garage to mega-success, Apple Computer (later Apple, Inc.) dominated the early microcomputer market with Steve Wozniak's Apple II series of personal computers (also known as microcomputers). Color graphics, eight expansion slots, and Wozniak's innovative floppy disk drive attracted thousands of developers whose applications enhanced the computer's usefulness over its sixteen-year lifespan. The first spreadsheet, VisiCalc, hugely boosted Apple's sales. Apple became a Fortune 500 company in record time, and its initial public offering (IPO) in 1980 was the biggest since the Ford Motor Company's in 1956. The Apple II, Apple II+, Apple IIe, and Apple IIgs became cash cows, sustaining Apple through costly product development cycles and missteps, but their success also lured industry giant International Business Machines (IBM) into producing its own microcomputers.
The IBM PC, first marketed in 1981 and assembled from off-the-shelf components, garnered tepid reviews and a smug Wall Street Journal ad from Apple reading "Welcome." The smugness soon disappeared, however. With Intel supplying the central processing unit (CPU) and Microsoft providing an operating system called MS-DOS, the PC was easy to "clone" (that is, it was easy for competitors to create and market their own, functionally equivalent computers). The existence of several different clones, all capable of running the same software, created a de facto standardized computing platform and simplified the chaotic microcomputer landscape, allowing IBM PCs and their clones to surpass Apple's personal computers by 1983.
Revolutionary Interface
Apple stumbled twice attempting to create a primarily business-oriented computer to compete with IBM. The Apple III, rushed to market before the company's IPO in 1980, was initially marred by component failures and lacked software that would run on its upgraded operating system. The product never recovered. Apple's 1983 Lisa, overpriced at $9,995 and with similarly limited software, also failed, but it embodied the future with its graphical interface. The graphical user interface, or GUI, was pioneered by mouse inventor Douglas Engelbart and Xerox Corporation's impressive Palo Alto Research Center (Xerox PARC). Indeed, Xerox PARC was an extremely important source of early innovation in personal computing, having invented part or all of the GUI, the laser printer, object-oriented programming, Ethernet, and what-you-see-is-what-you-get (WYSIWYG) text editors. Apple engineers visited Xerox PARC in the early 1980's, and the company gave Xerox shares of its stock in return for permission to imitate aspects of Xerox's projects in Apple products. The GUI simplified human-computer interactions. It replaced tedious command-line interfaces—which required accurately typing arcane text strings to control a computer—with mouse-selectable icons and menus. It was Apple's 1984 product, the Macintosh, that refined and established the superiority of GUIs. Created by brilliant young engineers, the Mac replaced PCs' fuzzy black screens and crude characters with a sharp white, square-pixeled, bit-mapped display supporting superior graphics, multiple proportional fonts, and foreign scripts. Portable, distinctively upright, allowing long natural file names, and
[Photo: Apple Computer founder Steve Jobs with the original Macintosh computer in 1984. (Hulton Archive/Getty Images)]
boasting easy-to-use applications, it also possessed, thanks to its whimsical icons, an endearing personality. Mac users' loyalty became legendary. Marketing the Mac The Mac, however, was hobbled by its price ($2,495), initially meager software, and co-founder Steve Jobs's dismissive rejection of engineers' pleas for hard-drive support and more than 128 kilobytes of random-access memory (RAM). Apple counteracted these drawbacks with its legendary Super Bowl ad, which touted the new computer as an alternative to the mindless conformity of corporate culture, represented by IBM PCs. The ad declared that, thanks to Apple's product, "1984 won't be like Nineteen Eighty-Four" (a reference to George Orwell's novel portraying a nightmarish totalitarian future). Advertising Age named Apple's ad the Commercial of the Decade, but its success in branding Macs as anticorporate may have cost Apple the opportunity to
sell to corporate America, which tended to perceive the new computer as a product for home and hobbyists only. The Macintosh was saved by the creation of a new industry to take advantage of it. With Apple's new LaserWriter printer, Adobe's PostScript language for communicating between printer and computer, and the document-creation program Aldus PageMaker, Macs could be used to publish professional-quality documents without the need for printing presses or professional typesetting services. The desktop publishing industry was born, and Apple found a niche market that it could dominate. When the company introduced the Mac Plus in 1986, sales exploded. The Mac II (1987) sacrificed the Mac's iconic form for color and expansion slots. Regular upgrades increased Macs' power, from the initial 8 megahertz in 1984 to the IIfx's 40 megahertz in 1990. The Mac Portable (1989) featured an innovative active-matrix liquid crystal display, but its lead-acid battery made it an unmanageable 15.6 pounds, belying its portable status. Beyond vital operating-system upgrades, the most interesting new Apple software to be introduced was HyperCard (1987), an ingenious database application whose high-level programming language simplified application development. With non-linear hyperlinks and integration of text, graphics, sound, animation, and video ("multimedia"), HyperCard followed the vision of computing pioneers Doug Engelbart and Ted Nelson and presaged the development of the World Wide Web.
Management Judgment
The failed Apple III and Lisa and the company's tardiness in correcting some of the Mac's shortcomings evoked criticism of Apple's management. Chief Executive Officer John Sculley (who had ousted co-founder Jobs in 1985) tightened operations, but he vacillated on the company's most pressing strategic concern: what to do about the dominant market share of PC-compatible computers. Interestingly, Microsoft's Bill Gates, recognizing the Mac GUI's superiority over MS-DOS and anxious to recoup his spending on Mac applications, in 1985 famously recommended licensing the Mac to other manufacturers, allowing it to compete head to head with PCs as another clonable platform. He believed that the Mac's superiority would make it the new standard, and he even offered to help recruit Mac clone manufacturers.
Another option was to introduce low-cost Macs to compete with the cheaper clone market. A third was to create a version of the Mac GUI that could run on Intel CPUs rather than the Mac's Motorola-made CPUs, so consumers could run Macintosh software on IBM PC clones. This approach would have posed problems, however, because PC clone manufacturers were required to purchase MS-DOS for every machine they made, whether they installed it or not, so installing a Mac operating system would always be more expensive. Recompiling Mac applications for Intel processors would also have required time and money. In the end, Apple dithered, electing to protect its 50 percent margins while its huge ease-of-use advantage eroded via the slow, implacable development of Windows.
The Microsoft Challenge
Apple had provided prototype Macintosh computers to Microsoft so Gates's company could write programs to run on Apple's computers. As a condition of this arrangement, Gates had promised not to market GUI applications that would run on competitors' operating systems. He never agreed, however, not to create an entire competing GUI operating system. Thus, Gates commenced developing Windows, a GUI designed to run on top of MS-DOS. He announced the new product shortly before the Mac's introduction, brashly predicting that 90 percent of PCs would adopt Windows before 1985. Fearing legal action, Gates threatened to stop developing Word and Excel for the Mac if Apple sued Microsoft over the rights to GUI operating systems. Both applications were vital to the Mac's survival. Although advised that Gates was bluffing, in October, 1985, Sculley foolishly agreed to allow Windows to use elements of Mac's GUI. Windows 1 was crude, with tiled windows and ugly fonts. Windows 2 provoked an Apple lawsuit in 1988, based on the claim that Gates was copying the "look and feel" of Apple's proprietary system. Most of the claims were dismissed, however, when the judge found that Microsoft had permission to copy Apple's work as a result of Sculley's 1985 agreement. Apple appealed fruitlessly until 1995, as Microsoft copied its software more blatantly. Windows 3 (1990) was the company's first real success, largely closing the gap in ease of use between Macs and PCs.
Impact Apple drew millions of consumers into personal computing by making it easier and more engaging. The Apple II's success inadvertently simplified the fragmented microcomputer market by attracting IBM. The Mac's hardware innovations and GUI revolutionized computing, but Apple executives—indecisive and transfixed by lucrative profit margins—ceded most of the GUI franchise to Gates, helping make him the world's richest man. Free to innovate hardware and software in concert, however, Apple remained one of the industry's most creative companies.
Further Reading
Hertzfeld, Andy. Revolution in the Valley. Sebastopol, Calif.: O'Reilly Media, 2005. Fascinating anecdotal account of the creation of the Mac.
Hiltzik, Michael. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: HarperCollins, 1999. Thorough coverage of the developments at the Xerox PARC labs.
Levy, Steven. Insanely Great: The Life and Times of Macintosh, the Computer That Changed Everything. New York: Penguin, 2000. Informative account by a well-known technology journalist.
Linzmayer, Owen W. Apple Confidential 2.0: The Definitive History of the World's Most Colorful Company. San Francisco: No Starch Press, 2004. Best history of Apple's triumphs and failures; features coherent topical organization.
R. Craig Philips
See also Business and the economy in the United States; Computers; Information age; Inventions; Microsoft.
■ Archaeology
Definition Systematic recovery and analysis of ancient and historic human cultural artifacts
American archaeology was characterized in the 1980's by increasing refinement of analytical techniques; emphasis on women, racial minorities, and marginal communities; and regulations curbing the activities of amateur fortune hunters. American archaeology entered the 1980's with trends already firmly established that shaped the discipline in the coming decade. Culture history, which emphasized the physical evolution of particular classes of artifacts, had given way to processual archaeology in the 1960's. Archaeologists looked more at function and at how all of the objects at a given site fit together in reconstructing the culture that produced them. Microscopic examination and chemical analysis were used increasingly to reconstruct the paleoenvironment. Improvements in carbon-14 dating made it possible to date small fragments of organic matter, rendering the technique applicable to many more sites.
New Interest in Marginal Communities
The 1980’s saw a great increase in public and academic interest in African American history, the role of women in historic and prehistoric cultures, and Native American cultures after European contact. Thus, sites that previously would have been considered to contain nothing of cultural significance proved to be valuable windows on the lives of people who left few written records despite living in a literate society. Some studies, such as the excavation of slave quarters at Thomas Jefferson’s Monticello estate, were the result of deliberate planning, but many were by-products of conservation efforts, as cities and states increasingly required archaeological surveys prior to development. Whereas unearthed traces of poor urban neighborhoods and rural communities would formerly have been dismissed as insignificant, surveyors now called upon professional archeologists to remove and catalog artifacts before construction proceeded. This proved to be a major task, when a team working under the auspices of New York City’s conservation laws retrieved 310,000 artifacts from a waterfront area built on an eighteenth century landfill. Although often hastily done and inadequate, such mandated studies greatly expanded knowledge of everyday life in eighteenth and nineteenth century America. In 1980, construction on a commuter railroad in Philadelphia halted, when workers unearthed a forgotten cemetery belonging to Philadelphia’s first African American Baptist church. By analyzing the skeletons, anthropologists were able to reconstruct the health status and demographics of a free black population in the early nineteenth century. The remains were then re-interred at another location. In the 1980’s, protection of Native American burial sites from desecration had achieved widespread public support, and a number of states had enacted legislation prohibiting removal of skeletons and grave goods from prehistoric cemeteries, but a federal law
in this area only took effect in 1990. Tension between the desire of the scientific community to study human prehistory as thoroughly as possible and indigenous groups wanting their cultural heritage respected operated throughout the 1980’s and is far from being resolved today. A construction site in downtown Tallahassee, Florida, fortuitously turned up the remains of Spanish explorer Hernando de Soto’s winter camp, the first incontrovertible physical evidence of de Soto’s epic journey across the American South from 1539 to 1542. The discovery spurred excavation of other sites along the route, helping form a more complete picture of dense and sophisticated aboriginal settlement along America’s Gulf Coast. Looters, Private Collectors, and ARPA
In 1979, Congress passed the Archaeological Resources Protection Act (ARPA), which replaced the 1906 Antiquities Act. ARPA established a permit system for excavation on federal land and included regulations requiring that artifacts removed from a site be properly curated and made available to researchers. These requirements were meant to curb fortune hunters, who often used motorized equipment to remove a few highly prized artifacts for sale on the private market, in the process destroying the rest of a site's value for future archaeological inquiry. The requirements were also aimed at entrepreneurial archaeologists, who used standard techniques to preserve sites' integrity but then sold their finds piecemeal to the highest bidders. Archaeology on private land remained unregulated. For artifacts in demand by private collectors, the stakes were high and the pressures enormous. The fate of Mimbres ware, a distinctive pottery type found in southwestern New Mexico, is instructive. Sites that included this pottery, mostly on private land, were bulldozed, yielding hundreds of pots for collectors while obliterating most traces of the culture that produced them. Partly in response to this ongoing rapacity, concerned citizens founded the nonprofit Archaeological Conservancy in 1980. This organization purchased vulnerable sites and preserved them for posterity.
Underwater Archaeology The 1980’s saw a large increase in underwater archaeology—primarily in the investigation of shipwrecks. In the late 1980’s, the Institute of Nautical Archaeology at Texas A&M University instituted a systematic search of the Carib-
bean Sea for remains of Spanish ships from the Age of Discovery, in anticipation of the upcoming Christopher Columbus Quincentennial. Sunken ships in U.S. territorial waters fell under admiralty salvage laws, which were more concerned with the ownership of specific valuables than with the preservation of history. Already in 1978, tensions had surfaced between sport divers and marine archaeologists when a diver located a wreck in a protected natural area on the coast of Florida and filed a claim to it under admiralty law. Believing he had found a treasure ship, he began dynamiting, destroying the ship’s midsection before environmentalists obtained an injunction under the National Park Service Act of 1916. The wreck, a British warship sunk in 1742, contained no gold but much of historical interest. Impact Archaeology can change one’s perceptions of previous cultures. The most significant development of the 1980’s, during which American archaeology produced no spectacular finds, was the endowment of minorities and women with more central places in the historical record and the recognition of the complexity of their lives. Increasing refinement in dating, the thorough analysis implicit in a processual approach, and close investigation of a much broader range of sites would eventually lead to rethinking the canonical picture of human settlement in the Americas prior to the arrival of Europeans. That process, which had just begun at the end of the 1980’s, still continues. Further Reading
Archaeology 42, no. 1 (January/February, 1989). Special issue, "Archaeology in the Twenty-First Century," comprising articles that describe trends that seemed likely to shape twenty-first century archaeology at the end of the 1980's; good coverage of development and looting issues.
Meltzer, David J., Don D. Fowler, and Jeremy Sabloff, eds. American Archaeology: Past and Future. Washington, D.C.: Smithsonian Institution Press, 1986. A collection of papers on the state of the profession in mid-decade. Describes the transition from cultural to processual archaeology.
Scott, Elizabeth M. Those of Little Note: Gender, Race, and Class in Historical Archaeology. Tucson: University of Arizona Press, 1994. Includes an overview of the place of marginalized peoples in archaeological theory and practice, as well as descriptions of a dozen specific archaeological projects.
Skowronek, Russell K. "Sport Divers and Archaeology: The Case of the Legare Anchorage Ship Site." Archaeology 38, no. 3 (1985): 23-27. A good discussion of issues in underwater archaeology.
Martha A. Sherwood
See also African Americans; Native Americans.
■ Architecture
Definition The design and building of structures, especially habitable ones
In the 1980's, a reaction against the previously dominant architectural style of modernism had set in. In its place, late modernism, deconstructivism, and postmodernism became more common, seeking to counteract modernism's cold, sterile structures, minimalist facades, and rejection of historical reference. American architectural practices changed in the 1980's as a result of several converging cultural developments, including the ascendance of computer technology, increased public awareness of architecture as such and of environmental issues generally, continued growth of the suburbs, and a prosperous, consumer-oriented economy. These developments both influenced and amplified the evolution of architectural and art history, contributing to and shaping the meaning of modernist architecture's fall from dominance.
Styles
Late modernism is a somewhat ambiguous term for an architectural style common from the end of modernism’s preeminence to the flowering of postmodernism. It is sometimes referred to as High Modernism, heroic modernism, or late expressionism. It stressed the use of high technology in design and materials, continuing the use of sheet metal and glass structures. However, late modernists used more complex, shaped masses and nonlinear forms to give their buildings more visual appeal. Deconstructivism also enjoyed some popularity in the 1980’s but was more admired by architects than by most of the public. Reflecting the chaos that many saw in society, deconstructivism offered strange, skewed designs with the intent of altering perception, favored fragmentation and unusual forms, and sometimes relied on commonplace materials such as chain-link fencing and bright plastic.
With its disturbing images, celebration of tension and conflict, and convoluted vocabulary, incomprehensible to most people outside the discipline, the movement failed to achieve much popularity beyond California and the East Coast. New York's Museum of Modern Art hosted the exhibition Deconstructivist Architecture in 1988, displaying designs by Frank Gehry, Daniel Libeskind, Peter Eisenman, and Bernard Tschumi from the United States, as well as three European architects. Few of the proposed designs were ever built. Reaction to the sterility of modernism spawned the growth of postmodernism. One of its earliest proponents, Robert Venturi, parodied the modernist aphorism "Less is more," describing the movement with the phrase "Less is a bore." Postmodernists favored traditional and classical forms that invoked the past, rather than the harsh concrete and glass facades of modernism. Three main tenets of postmodernism were ornamentalism, contextualism, and allusionism. Ornamentalism was the use of color, texture, and decoration to add interest and meaning to a building's exterior and interior, in opposition to the modernist belief that decoration is unneeded and therefore inappropriate. Contextualism was an effort to integrate new buildings with their surroundings, including natural features as well as other nearby buildings. It opposed the modernist practice of constructing monolithic designs that ignored the surrounding landscape. Allusionism considered the building's purpose when developing its design, rather than following the modernist principle of using generic styles for all buildings, no matter where they were or what they would be used for. For example, a new postmodern home in a seaside community might be designed to blend in with existing beach cottages. Trends Computer technology, including computer-aided design (CAD), e-mail, and faxes, had a profound effect on architecture. Architects no longer needed pencil, paper, and ruler to create their plans; three-dimensional views could be created and complex engineering problems resolved with computer programs. As environmental concerns became more important to the public, more old buildings were saved and restored or adapted for new uses. Private commissions increasingly required that architects dis-
play a concern for and new skills in energy conservation, as well as a commitment to adapting a building to its site. Along with environmental awareness came a movement sometimes referred to as "new vernacular." More a philosophical approach to architecture than an architectural style, it incorporated the sensual aspects of a site, such as its geography and seasonal variations, into the design of the building. It also paid attention to local traditions and lore. An example of this sensibility is a complex designed in 1982 by Jefferson Riley for Ann Elliott and Peter Gruen in the Appalachian countryside of Pennsylvania. A split-rail fence that Elliott had played on as a child was used to connect the house visually with the adjacent structures for Elliott's sculpture studio and Gruen's office. In the 1980's, the public at large became more aware of architecture as an art form and of individual architects. Two books on architecture became best sellers: Tracy Kidder's House (1985) detailed the challenges a family and their architect faced in building a new home; Tom Wolfe's From Bauhaus to Our House (1981) castigated both modernism and postmodernism. Despite the hostile reaction to his book from many architects of various schools, Wolfe was invited to be the keynote speaker at the 1985 convention of the American Institute of Architects (AIA). Public television hosted two series on architecture in the 1980's. Pride of Place (1986) dealt with architectural history, disparaging modernism but applauding postmodernism. America by Design (1987) discussed the sociological, technological, and political forces shaping the built environment. The growth of suburbs not only resulted in the building of more large developments of very similar homes but also increased the demand for small strip malls and large shopping malls. As the U.S. economy improved and the country became more consumer oriented, shopping malls, hotels, and museums became prized commissions for architectural firms. Festival marketplaces, originated by James Rouse in the 1970's, became increasingly popular in the 1980's. Such marketplaces often were developed in run-down older areas, with historic older buildings restored, redeveloped, and repurposed into retail shops, restaurants, and entertainment venues. Both the dependence on computers and the increase of large-scale projects forced the growth of megafirms, the decline of individual practices, and
the need for architects to work in concert with engineers, planners, and landscape architects to an extent unimagined in earlier times. At the same time, building regulations became more stringent and contracts became more complex, requiring numerous consultants. Notable U.S. Architects and Buildings
One of the decade’s most recognizable buildings was the Crystal Cathedral in Southern California, opened in 1980. Architects Philip Johnson and John Burgee designed the $18 million extravaganza as the church and headquarters for television evangelist Robert Schuller. At Schuller’s insistence, the massive structure was made entirely of glass set into welded steel tubing painted white. The interior was 128 feet high and spanned 200 feet without interior columns. Viewed from the outside, the building reflected the landscape and changing patterns of light throughout the day; from inside, one looked out to the flat, suburban landscape and the massive parking lot. The church was as much a tourist attraction as a house of worship. At the opposite end of the spectrum was the small but dramatic Thorncrown Chapel, designed by E. Fay Jones and completed in 1981. Set in the Ozark Mountains of Arkansas, the chapel was built completely with materials that could be carried to the site by two workers, to prevent damage to the surrounding environment. Although it covered only fourteen hundred square feet, the chapel soared forty-eight feet into the trees. Like the Crystal Cathedral, Thorncrown Chapel was primarily made of glass—more than six thousand square feet of glass, including more than 425 windows. The remainder of the structure was wood, with only a small section of steel. The native flagstone floor was surrounded with a rock wall. The distinctive design received the AIA Design of the Decade Award for the 1980’s and was ranked fourth on the AIA’s list of the top buildings of the twentieth century. Michael Graves, who had embraced modernism early in his career, soon turned away from the style, believing it to be irrelevant to the people who had to work in and use such buildings. One of his bestknown and most controversial commissions was the Portland Public Services Building, completed in 1982, and considered by many critics to be the first postmodernist office building. Its colorful—some argued garish—exterior did complement the many
colorfully restored Victorian buildings in downtown Portland, but its unusual forms and decorative elements, meant to be symbolic, were confusing to most observers. Despite Graves's concerns that buildings should be relevant to their users, the Portland building was criticized for its small, dark offices and poorly designed public spaces. However, part of Graves's original design was altered and simplified to meet budget and time constraints. It was Graves's first commission for a large building; he went on to design less controversial projects and was the recipient of the 2001 AIA Gold Medal for significant influence on the profession of architecture. Another civic building of note was the Thompson Center (also known as the State of Illinois Center) in Chicago, built by late modernist Helmut Jahn and dedicated in 1985. The blue glass building filled and dominated a city block with an unusual rounded shape tapering inward as it rose seventeen stories to the angled dome on top. Each floor opened onto the circular central rotunda, 160 feet across. Primarily a government building with state agencies and offices, it also contained a commercial area of shops and restaurants and the State of Illinois Gallery, Illinois Artisans' Shop, and an impressive collection of contemporary Illinois art. Its assembly hall and lower concourse level were available for public and private events, and the building became a major tourist attraction in Chicago. One of the great failures of modernism was its use for large public-housing projects. After a massive public-housing project in St. Louis was completely demolished because it had deteriorated into a crime-ridden slum, planners looked for ways to provide low-income housing on a smaller, more human scale. Charleston, South Carolina, took a new approach: Some 113 apartments for low-income residents were built in the style of single homes that were common to the area, and the units were distributed among fourteen sites around the city. The project was divided between Bradfield Associates of Atlanta and Middleton McMillan Architects of Charleston, with each firm employing its own contractor. The project came in under budget and won numerous awards, including a U.S. Department of Housing and Urban Development award for innovation and a 1984 presidential award for design excellence. Frank Gehry (born Ephraim Goldberg in Toronto, Ontario, in 1929) is one of the best-known
[Photo: The Crystal Cathedral in Garden Grove, California, was one of the most recognizable architectural achievements of the 1980's. (© Arnold C [Buchanan-Hermit])]
U.S. architects. The Goldberg family changed its name to Gehry and moved to California when Gehry was seventeen years of age. His highly unusual forms are commonly referred to as deconstructivist or deconstructionist, although Gehry himself rejected such classifications. Many of his later commissions were for public buildings and museums, but a guest house he designed in 1987 for clients in Wayzata, Minnesota, gives a taste of his iconoclastic style. The living room was the center of the building, with three rooms and a fireplace alcove extending from the corners as separate structures. Each room was a different shape and of a different material, including sheet metal, galvanized metal, coated plywood, brick, and stone veneer. Another prominent U.S. architect, Robert Venturi, had a varied career, from designing the Sains-
bury Wing of the National Gallery in London, England, to designing home furnishings. Two award-winning houses designed by his firm, Venturi, Rauch & Scott Brown, are prime examples of homes that not only responded to the needs of the clients but also were sensitive to the surrounding environment. The Coxe/Hayden house and studio, situated on a large tract of land on Block Island in Rhode Island, was actually two separate houses, adjacent to each other and similar in style. The larger house contained the kitchen, living and dining rooms, master bedroom and bath, and a writing studio. The smaller house had guest quarters above the large garage and workshop. Both were in an updated version of the style typical of many of the area's nineteenth century buildings, and their gray exteriors blended with the nearby salt pond and boulders that lined the prop-
erty. A later home, in Delaware, was more playful in style, with unusual, broad, flat columns; fanciful decorative arches in the music room; and large windows throughout to enable the family's hobby of bird watching. The firm's most imaginative commission was the Widener Memorial Tree House at the Philadelphia Zoo. Not just an exhibit, the six oversize environments provided spaces and activities that visitors could experience as did the animals who typically lived there. The primordial swamp, milkweed meadow, beaver pond, honeycomb, ficus tree, and everglade each featured the sounds and smells of their real-world counterparts. Both children and adults enjoyed and learned from the interactive tree house exhibits, which changed regularly. Described by some as the best new piece of public architecture in Boston, the Hynes Convention Center was the lively answer of Kallmann, McKinnell & Wood to the question of how to design a convention center that functioned as an attractive and inviting public space. The three-story building opened to the street with generous windows on every level. Its granite facade matched the granite of the nearby Boston Public Library, and its height was similar to that of the buildings across the street. The old convention center, an undistinguished, nearly featureless building, was incorporated into the new center; the rooms of the old building were still used for large product displays, while the new building provided forty-one meeting rooms, spacious auditoriums, and an elegant glass-domed ballroom, all colorfully and tastefully decorated. Canadian Architecture Canada is home to several notable buildings constructed in the 1980's, and Canadian architects made their mark both in Canada and elsewhere. The Olympic Stadium in Montreal was begun in 1973 for the 1976 Olympics; however, it was not completed until 1987. Because the Olympic Games must be played in the open air and an open-air stadium would be impractical most of the year in Montreal, French architect Roger Taillibert attempted to build the world's first retractable domed stadium. The retractable roof and the massive and dramatic tower that supported it were completed in 1987. After years of technical problems with opening and closing the roof, it was closed permanently. The West Edmonton Mall, completed in 1986, was the largest shopping center in Canada, and for
several years was the largest in the world. With more than eight hundred stores, one hundred food outlets, and such entertainment options as an amusement park with a roller coaster, a seven-acre wave pool, a professional hockey rink, and a large bingo hall, the mall quickly became a tourist destination and a major contributor to Edmonton's economy. The Centre Canadien d'Architecture/Canadian Centre for Architecture, designed by Peter Rose of Montreal, was an exceptional example of Canadian postmodernism. It was built around an elegant eighteenth century home that had been threatened with demolition. Instead, Rose designed a U-shaped building around the older structure. The classically styled building incorporated features similar to those of neighboring Victorian homes. Another prime example of postmodernism in Canada (although not designed by Canadian architects) was the Mississauga (Ontario) City Hall and Civic Square. Using simple forms and generally light colors, it both provided an emphatic urban focal point in an otherwise featureless area of rural landscape and newer subdivisions and offered multiple levels of meaning, or coding. For example, casual observers might notice that the council chamber's shape resembled a silo, such as might be found on a nearby farm, and the white-banded brickwork reflected Victorian architecture common in Ontario. Architectural students, on the other hand, might notice the plan's resemblance to an Italian piazza. Impact The rejection of modernism and the move toward postmodernism in the 1980's represented at base a decision that buildings should be designed to be functional and to be a responsible part of the landscape they are to inhabit, both visually and environmentally. Modernist buildings were edifices unto themselves that ignored their environments and required people to adjust to the needs of the building. The buildings of the 1980's, however, sought by and large to anticipate the needs of their denizens and to respond to their surroundings. It may not be coincidental that this change occurred at a time when lay people were developing a greater awareness of architectural styles and of the design issues that affected their experience of and relationship to public and private spaces.
Further Reading
American Institute of Architects. American Architecture of the 1980's. Washington, D.C.: American Institute of Architects Press, 1990. Forty-five articles reprinted from Architecture magazine provide a comprehensive and lavishly illustrated look at major buildings of the 1980's. Floor plans and building plans are included. The book concludes with a dozen essays on the decade by architects, architectural historians, and critics.
Diamonstein, Barbaralee. American Architecture Now II. New York: Rizzoli, 1985. Preservationist and writer Diamonstein interviews twenty-nine practicing American architects from 1982 through 1984.
Handlin, David P. American Architecture. 2d ed. New York: Thames & Hudson, 2004. A brief but thorough survey of U.S. architecture. Well illustrated with photographs and plans. Index and bibliography.
Kalman, Harold. A Concise History of Canadian Architecture. Canada: Oxford University Press Canada, 2000. An extensive treatise on Canadian architecture, beginning with information on the structures of the First Nations peoples prior to European settlement. Notes, bibliography, glossary, index of buildings, and general index.
Kidder, Tracy. House. Boston: Houghton Mifflin, 1985. Kidder is known for writing nonfiction that reads like a novel. He brings the reader into the lives of the clients, the architect, and other players in the building of a couple's dream home. Although not specific to architectural trends in the 1980's, this book gives insight into the struggles of an architect designing a single-family dwelling.
Roth, Leland M. American Architecture: A History. Cambridge, Mass.: Westview Press, 2000. A comprehensive survey of architecture in the United States. Includes discussions on urban planning issues, Native American buildings, and vernacular architecture. Glossary and detailed index.
Wolfe, Tom. From Bauhaus to Our House. New York: Farrar, Straus and Giroux, 1981. Both architectural and social criticism, Wolfe's book gives insight into the modernist tenets that led to the massive monoliths of the 1970's and the postmodern reaction that continued into the 1980's.
Irene Struthers Rush
See also CAD/CAM technology; Deconstructivist architecture; Environmental movement; Gehry, Frank; SkyDome; Vietnam Veterans Memorial; Xanadu Houses.
■ Arena Football League
Definition Professional sports league
Date Began in 1987
Arena football is a version of the game that is played indoors, on a fifty-yard-long playing field, in order to increase its tempo and scores. The modest success of the Arena Football League established that a fan base existed that was eager to watch football twelve months a year. The Arena Football League (AFL) was designed to capitalize on the success of professional indoor soccer in the United States during the 1980's. Organizers believed that if they could duplicate the torrid pace and high-scoring action of indoor soccer in a football game, they could attract fans of the National Football League (NFL) and of National Collegiate Athletic Association (NCAA) football during those leagues' off-seasons. The launch of the new league was delayed by the short-lived United States Football League (USFL), but the initial four teams of the AFL began to play in 1987. The teams were located in Denver, Pittsburgh, Chicago, and Washington. Denver won the league's first championship, defeating a Pittsburgh team that had played its games before substantial crowds. Based on the modest success of the first season, the league embarked on an ambitious expansion program in 1988. This expansion established a nomadic pattern for many of the league's franchises: Over the next twenty years, forty-five American cities would be home to AFL teams, often for just a single season. Despite this constant franchise relocation, the league benefited from its emergence at a time when several cable television channels, especially the fledgling ESPN network, had an urgent need to fill their schedules with sports programming. Later, ESPN would rely on major college and professional sports, but in the early years the AFL provided just the sort of inexpensive programming that ESPN needed at the time. By the 1980's, the NFL was an entirely corporate operation, presenting a slick sports package overseen by billionaire owners. It featured millionaire players, many of whom saw stardom as an entitlement. The AFL marketed itself in a way designed to capitalize on its shoestring budget and underdog image. The league's players were too small to play in the NFL or came from small colleges that NFL scouts ignored. They played for the love of the game, for
a very modest paycheck, and in the hope that— despite very long odds—they might someday be recruited to play in the NFL. The games were high scoring, and the action literally wound up in the stands sometimes, with players landing among the fans on some sideline plays. NFL teams were organized into highly specialized units, but many AFL players played both offense and defense, nostalgic reminders of the blue-collar origins of the NFL. The offensive strategy benefited from Darnell “Mouse” Davis, who reached legendary status as the originator of the “run and shoot” offense. At the NFL and NCAA levels, run and shoot was a fad that faded, but it became the staple of the AFL, driving the league’s popularity. Impact The success of the AFL contrasted with the demise of the World League of American Football and the Extreme Football League (XFL). American football proved unpopular in Europe, and even NFL affiliation failed to save the international league. The XFL tried to blend the best of professional wrestling and football traditions, and it disappeared after one season. The AFL remained stable, won major network broadcasting contracts by 1998, and even set up a minor-league network in smaller markets. The AFL never challenged the NFL, but it thrived based on more modest goals and a solid marketing strategy. Subsequent Events The high point of the AFL came in 2000, when former Iowa Barnstormers quarterback Kurt Warner led the NFL’s Saint Louis Rams to victory in the Super Bowl. Warner’s career established the possibility that other AFL players could make a successful transition to the NFL and could bring their AFL fans with them. Further Reading
Brucato, Thomas. Major Leagues. New Brunswick, N.J.: Scarecrow Press, 2001.
Evey, Stu. ESPN. Chicago: Triumph Books, 2004.
Michael Polley
See also Cable television; Football; Sports.
■ Art movements
Definition Organized or implicit stylistic and ideological trends characterizing art forms of a given time and place
Art in the 1980's is often described as postmodern. With the project of modernism being viewed by some as over, there was a loss of a link with the traditional avant-garde art that defined the beginning of the twentieth century. In the 1980's, then, art can be viewed as situated within a pluralist cultural climate without any agreed-upon goals. Art in the 1980's cannot be easily defined through one style or movement. Many different styles were created and mediums were explored during this decade, including painting, sculpture, photography, and installation art. The common link between them was the rise of a culture characterized by diversity, in which little agreement existed as to whether there were any values held in common by all Americans, much less what such values might be. The decade's art reflected the diversity of its culture, and it often commented explicitly on the lack of shared goals or values underlying that diversity. Nonetheless, artists working in the decade entered into dialogue with one another, and identifiable trends and styles emerged as a result. In New York, America's cultural and economic center, the booming economy of the Ronald Reagan years resulted in a cadre of newly rich businessmen and media stars with disposable income to spend on art. In SoHo, the focus of the contemporary art world during this period, prices rose, and galleries such as the Mary Boone Gallery found that the demand for paintings among collectors was greater than ever before. However, in venues such as the East Village of New York, art also was created apart from this art-market frenzy. Painting Neoexpressionism lay at the heart of the art-market boom in the 1980's. In the early 1980's, galleries in SoHo took an interest in new figurative art from Germany and Italy by artists such as Georg Baselitz, Anselm Kiefer, Francesco Clemente, Sandro Chia, and Enzo Cucchi. These artists' works were characterized by quirky and crude figurative styles on a large scale, a loose handling of paint, and strong color and were a significant influence on the American neoexpressionist artists who followed in their wake.
Julian Schnabel was one of these artists and was promoted by both the Mary Boone and Leo Castelli Galleries. Schnabel used broken crockery and primitive images on a large canvas. Robert Longo was another neoexpressionist favorite of the SoHo art world during this period. Longo painted large scenes in which monuments and figures had ambiguous relationships with one another and with the surrounding space. Artist David Salle promoted choice over invention. As he stated in 1984, "the originality is in what you choose." This attitude was exemplified in his juxtaposition of patterns of rectangles with a seemingly unrelated woman wearing paper cones on her head and breasts. Of these neoexpressionist works, Eric Fischl's portrayal of the human figure is the most traditional subject matter, evincing influences from Édouard Manet, Edgar Degas, and Winslow Homer. Fischl depicted nudes to create narratives of an emotionally bankrupt middle-class America. Neo-Geo was an art movement that propagated the idea of the death of modernism. Peter Halley's paintings exemplify this movement, presenting a few Day-Glo squares and rectangles in multiple panels. Drawing on the post-structuralist theory of Jean Baudrillard, Halley stated that artists can no longer make art but only refer to it. Art can only be a copy without an original, a reference to Baudrillard's notion of the simulacrum. Halley's abstractions signified confinement and interconnection in the media-controlled postindustrialized world. Other artists who attempted to copy or imitate art included Sherrie Levine, Ross Bleckner, and Philip Taaffe, who reduplicated Op art in the vein of 1960's artist Bridget Riley. New York's East Village art scene was purposefully less commercial than were the SoHo galleries, with smaller galleries opening to show art. Developing in this environment were artists who incorporated the style of graffiti into their art. Probably one of the best known of these artists was Jean-Michel Basquiat. Basquiat's graffiti style incorporated vivid colors and images of black heroes. His art communicated the anguish of a black artist in a white world. After being discovered by the commercial art scene, Basquiat began to show in SoHo until his death of a drug overdose in 1988. Keith Haring was another artist who utilized graffiti by first creating white chalk line drawings on covered-up black panels of subway ads from 1981 to 1985. His art progressed to colorful im-
ages of radiant babies, barking dogs, and androgynous figures that were joyful and life-affirming and that became icons of mass culture to which many could relate. Contrasting with Haring, David Wojnarowicz made art that called attention to the ethical state of emergency he found to exist in America in the 1980’s. His paintings were a complex layering of impressions from nature and culture in collage-like forms, which critiqued a consumer culture whose superficial surface masked violence and contradiction. Photography
Art photography in the 1980's subverted the traditional styles of portrait and landscape photography and invented new ones. Photography was no longer a medium for description but for invention. Photographers merged fact and fiction, as in the work of Cindy Sherman. Sherman's series of photographs entitled Untitled Film Stills was a string of self-portraits. In this series, Sherman dressed in many different guises, assuming the roles of different characters in nonexistent motion pictures. She challenged the idea of fixed identity and created open-ended narratives by appropriating representations of prefabricated roles of women from movies and the media. Sherman continued her exploration of the idea of the feminine by wearing false breasts and other prosthetics, thus turning herself into a monstrous grotesque. Feminist issues became the subject of many photographers' works in the 1980's. Photographer Barbara Kruger overtly critiqued and questioned the representation of women in American society. Through the union of the word-based with the photo-based in her photography, Kruger parodied advertisements by undermining the disembodied voice of patriarchal authority and revealing the invisibility of women. By undermining the images and words in magazines, newspapers, and television, Kruger deconstructed the dominant masculine discourse and reframed images within a feminist perspective. Photographer Hannah Wilke also brought feminist issues to the fore in her work when she utilized chewing gum to symbolize the psychological scars of the struggles of women. Another photographer who challenged the dominant discourse of white male culture was Carrie Mae Weems. Through her photography, Weems critiqued racism in the United States and tried to recover a genuine "black experience."
Utilizing the technique of "re-photography," Richard Prince took elements from existing photographic advertisements, such as cowboys from cigarette ads, and blew them up or cropped them, reassembling them into grid-like images. These images in turn highlighted the emptiness at the core of mass-media representation in America. Sherrie Levine also re-appropriated images for her photography by taking classic photographs and re-presenting them as her own. The East Village in the 1980's had its own documentary photographer, Nan Goldin. Photographing the Village's partying drug culture, Goldin portrayed her subjects—often her friends and acquaintances—with a sense of detachment yet revealed both the emptiness and the pathos of their lives. Goldin's pictures of the victims of AIDS, drugs, and violence reflected a bankruptcy of idealism that existed in her world. The 1980's also saw photography expressing controversial themes and images. This trend was exemplified by the work of Robert Mapplethorpe, who created images of nude homosexual males in sexual situations. An exhibition of the artist's work, The Perfect Moment, appeared in 1988-1989 and instigated a culture war led by conservative senator Jesse Helms. The controversy surrounded the funding of potentially inflammatory art by the National Endowment for the Arts (NEA). As a result, by the time the Mapplethorpe exhibition reached the Contemporary Arts Center in Cincinnati, the center and its director faced criminal obscenity charges (of which they were later acquitted), and the NEA began discussing new standards by which to judge future applications for art funding. Another photographer who came under attack by Helms for creating "obscene imagery" was Andrés Serrano. Serrano's "Piss Christ" (first exhibited in 1987) depicted a crucifix submerged in urine. The photograph was seen as blasphemous by both Helms and religious groups, but it also has been read by critics (including the artist himself) to suggest spiritual enlightenment through its soft focus and warm light. The image demonstrated the clashing values of the decade's pluralism, as Serrano claimed that his use of bodily fluids was meant to be sacred, whereas Senator Helms read it as self-evidently profane. In 1984, Serrano had created other sensational images, such as his "Heaven and Hell," a color photograph showing a Catholic bishop standing next to a naked torso of a woman tied up by her hands and covered with blood.
The Eighties in America Sculpture and Installation Art Sculpture in the 1980’s saw artists utilizing a wide variety of materials outside of the traditional realm of sculpture, such as wax, plastics, ceramics, and papier-mâché. Jeff Koons was the sculptor who best exemplified the decade’s coopting of consumer and popular culture into art. Koons heroicized vacuum cleaners as social icons by enclosing them inside Plexiglas cases. His work, which bordered on kitsch, continued the tradition of Neo-Geo’s appropriation of the theory of the simulacrum and its denial of originality. This is exemplified in his 1988 ceramic sculpture, Michael Jackson and Bubbles, in which the pop singer and his monkey are portrayed as slick and shiny yet at the same time absurd and ridiculous. Heim Steinbach also utilized objects from consumer culture, juxtaposing objects as disparate as Nike sneakers and brass candlesticks. Artists such as Louise Bourgeois, Kiki Smith, and Robert Gober created sculptures that commented on the body and sexuality. Smith’s sculptures referred to the interior and the exterior of the body by emphasizing fluids such as blood, semen, and breast milk in sculptures made of materials as diverse as bronze and wax. Bourgeois created sexually charged imagery of autobiographical origin in biomorphic forms made of stone, wood, resin, latex, and bronze. Gober commented on the body politic through his sculptures of sinks that, through their lack of plumbing, implied the impossibility of cleansing, which in turn signified the lack of a cure for AIDS. Two public sculptural projects in the 1980’s caused controversy and debate. In 1981, Richard Serra installed his Tilted Arc in Foley Square at Federal Plaza in New York City. The curving wall of 120foot-long and 12-foot-high raw steel cut the plaza in half. Those who worked in the area found that the sculpture interfered with their daily navigation of the plaza and criticized the work as an “ugly rusted steel wall.” Letters of complaint were written, and in 1985 a public hearing on the work was held, in which the final vote was for its removal. Serra’s appeal of the ruling failed, and Tilted Arc was removed on March 15, 1989. This situation again caused people to question the role of government funding for the visual arts, as well as the role of the public in determining the value of a work of art. Meanwhile, in 1982, Maya Ying Lin won a contest to design the Vietnam Veterans Memorial in Washington, D.C. Her winning design, in which the names of 57,692 Americans killed in Vietnam were inscribed on a massive
black wall, was criticized for being too abstract for such a memorial.

Installation art also flourished in the 1980’s. In 1983, Jonathan Borofsky created a chaotic environment at the Paula Cooper Gallery in SoHo that incorporated painting, drawing, and sculpture in a variety of styles, subjects, and scales. The effect produced was that of entering the intimacy of the artist’s mind. Installation artist Jenny Holzer also came to the fore in the 1980’s. Holzer created art based in words with her “truisms,” one-line adages that reflected a variety of beliefs and biases while in turn critiquing their legitimacy. In 1982, her “truisms” flashed in electronic signage above Times Square in New York. Placed in public areas where normal signage was expected, the work confronted viewers who might not otherwise have ventured into a museum or given art any consideration.

Impact
The pluralistic art of the 1980’s explored diverse ideas of race and gender, disavowed the possibility of originality in art, and both critiqued and celebrated late twentieth century corporate consumer culture. Ties to the traditional modern masters of the early twentieth century were broken, and art of many different styles and ideas flourished side by side. While the traditional media of painting and sculpture continued to prosper, photography and art installations explored visual languages and themes that were new to these media and often controversial in their scope. This climate in turn set the stage for art of the 1990’s and beyond, in which artists continued to explore similar themes and projects.

Further Reading
Fineberg, Jonathan. Art Since 1945: Strategies of Being. Upper Saddle River, N.J.: Prentice Hall, 2000. Excellent overview including art of the 1980’s.
Hall, James. “Neo-Geo’s Bachelor Artists.” Art International, Winter, 1989, 30-35. Good article focusing on this specific group of artists.
Hills, Patricia. Modern Art in the U.S.A.: Issues and Controversies of the Twentieth Century. Upper Saddle River, N.J.: Prentice Hall, 2001. Excellent overview of the many facets of art of the 1980’s, drawing on primary source documents.
Hopkins, David. After Modern Art, 1945-2000. Oxford, England: Oxford University Press, 2000. Brief yet useful overview of art in the 1980’s.
Kuspit, Donald. The Rebirth of Painting in the Late Twentieth Century. New York: Cambridge University Press, 2000. Excellent selection of a variety of essays from one of the most important art critics of the period.
Smith, Joshua P. The Photography of Invention: American Pictures of the 1980’s. Washington, D.C.: National Museum of American Art, Smithsonian Institution, 1989. Very good overview of photography in the 1980’s.
Tomkins, Calvin. Post to Neo: The Art World of the 1980’s. New York: Henry Holt, 1989. Essay written during the 1980’s by this well-known art critic.
West, Thomas. “Figure Painting in an Ambivalent Decade.” Art International, Winter, 1989, 22-29. Excellent article focusing on the neoexpressionist painters.
Sandra Rothenberg
See also Architecture; Basquiat, Jean-Michel; Multiculturalism in education; Neoexpressionism in painting; Photography.
■ Artificial heart
Definition: A mechanical device implanted into a patient’s body to replace a damaged or defective biological heart
Although researchers were unable to develop a fully satisfactory, permanent replacement artificial heart in the 1980’s, the knowledge gained during the decade led to the development of cardiac-assist devices that would allow heart-disease patients to live longer and more fulfilling lives.

In 1982, news that an artificial heart had been implanted in Seattle dentist Barney Clark startled the general public. Cardiologists and bioengineers, funded through the National Institutes of Health (NIH), had been working on an implantable, permanent artificial heart since the 1960’s, but the existence of a working device came as a shock to persons outside the medical research community. The unusual transparency of the procedure, which took place at the University of Utah’s medical center, may have contributed to the high level of interest displayed by the news media. The university press office gave daily briefings, reporting both successes and setbacks, allowing the public a rare opportunity to view a medical experiment in progress.
William DeVries discusses his implantation of the Jarvik 7 artificial heart. (AP/Wide World Photos)
Clark, who was dying from heart disease, volunteered for the procedure expecting to live for only one or two days at most; instead, he survived for 112 days. The device implanted in Clark, the Jarvik 7, was invented by Robert Jarvik and implanted by surgeon William DeVries. Jarvik’s invention was not the first attempt at an artificial heart, but it proved to be the most successful. Heart surgeons and bioengineers had been attempting to develop an artificial heart since the invention of the cardiopulmonary bypass device (heart-lung machine) in the 1950’s. Advances in organ transplants served as an additional impetus, especially following the first successful heart transplant surgery in 1967. While heart transplantation offered hope to heart-disease patients, it was obvious there would never be as many donor organs available as there were recipients waiting for them. In addition, an artificial heart would eliminate the problems associated with tissue rejection and antirejection drug regimens.

The Jarvik 7 had two pumps, analogous to the
right and left ventricles of a natural heart, to keep blood flowing through the circulatory system. It was powered externally using compressed air. The need for an external power source was one of the shortcomings of the Jarvik 7. The break in the skin provided an ideal environment for infections, and the power source itself made patient mobility difficult. The compressor unit was described in the press at the time as “the size of a washing machine” and “not much smaller than a refrigerator,” although the real issue was not so much the size of the power unit as it was the fact that the patient’s mobility was limited by the length of the air hoses.

Clark, who had been in extremely poor health at the time of the procedure, never recovered to the point of being able to leave the hospital. He was fully informed of the risks prior to volunteering to be the first recipient, and he had even visited the research facility to see the calves being used as test subjects, so he was fully aware of the need for an external power source. His first reaction had been to say no to the idea, but he volunteered anyway in the hope that knowledge gained from his case would help others.

The second patient to receive a Jarvik 7 implant, Bill Schroeder, enjoyed a slightly better outcome. Schroeder’s procedure took place at a Humana hospital in Louisville, Kentucky. Dr. DeVries had been recruited by the cardiology research program at Humana and performed a number of artificial heart implants there. Schroeder suffered a setback two weeks after the surgery when he had a major stroke, but he nonetheless recovered sufficiently to leave the hospital. His apartment was equipped with an air compressor and an emergency generator. He was able to travel using a portable system weighing about fifteen pounds. Schroeder visited his hometown, went to a basketball game, and even enjoyed fishing, despite suffering from various complications, such as additional strokes and infections. He lived for 620 days on the Jarvik 7.

Impact
Three other patients received the Jarvik 7 as a permanent replacement heart. One died a week after the surgery; the others lived ten months and fourteen months, respectively. After the first few cases, the mainstream news media lost interest in the artificial heart. Many reporters erroneously believed that the Jarvik 7 had been a failure, when in fact in the years that followed, heart surgeons continued to use the Jarvik 7, as well as later models that evolved
from it. The perception of the Jarvik 7 as a failure may have come from unrealistic expectations the public had about the ability of science to perform medical miracles and of technology to make incredible advances overnight. Thus, when an artificial heart that did not require an external power source was not immediately available, the press lost interest.

Further Reading
Burnette, Martha. The Bill Schroeder Story. New York: Morrow, 1987. Biography of the second artificial-heart recipient.
Elefteriades, John A., and Lawrence S. Cohen. Your Heart: An Owner’s Guide. Amherst, N.Y.: Prometheus Books, 2007. Details the current understanding of cardiac science, including the heart’s functioning, how to keep it healthy, and steps taken by medical professionals to fight heart disease.
Fox, Renée C., and Judith P. Swazey. Spare Parts: Organ Replacement in American Society. New York: Oxford University Press, 1992. Cultural history of organ transplantation, tracing both its portrayal and its reality in American society.
Hogness, John R., and Malin VanAntwerp. The Artificial Heart: Prototypes, Policies, and Patents. Washington, D.C.: National Academy Press, 1991. Useful rundown of the state of the science of artificial heart replacements at the end of the 1980’s.
Shaw, Margery. After Barney Clark: Reflections on the Utah Artificial Heart Program. Austin: University of Texas Press, 1984. A discussion of the University of Utah’s program, written in the wake of the first successful artificial heart implantation.
Nancy Farm Mannikko
See also Baby Fae heart transplantation; Bioengineering; Medicine; Transplantation.
■ Asian Americans
Definition: Americans of Japanese, Chinese, Filipino, Vietnamese, Cambodian, Laotian, or South Asian origin
The largest wave of immigration from any world region occurred from Asia into the United States during the 1980’s, resulting in a major cultural transformation, particularly in coastal U.S. regions. The assimilation process that followed created unique cultural groups among Asian Americans.
While the territory of Asia extends from Turkey to East Timor, in the context of migration patterns and population data, the South and East Asian nations spanning from Pakistan to the Philippines are seen as the primary Asian emigrant regions. During the late nineteenth century, the availability of jobs in the lush farmlands of California’s central valleys created a large immigration movement from East Asia. Angel Island, located in San Francisco Bay, was the gateway for Asians entering America and remains an important national marker for many Asian Americans. Japan and the United States were bitter enemies in World War II, leading to the internment of Japanese Americans. Since then, the relationship between the United States and Asia has evolved, and Asian Americans today contribute greatly to America’s multicultural melting pot as a vital cultural group.

Asian Americans in the 1980’s
The 1980’s proved to be one of the most important decades for Asian Americans, as this period also helped transform the cultural identity of the United States. Prior to that decade, America’s deep political involvement with Vietnam created a connection between the two nations and resulted in significant migration from the region. Pursuit of better standards of living, disenchantment with communist governance, and the growth of a vibrant youth population seeking opportunities were some of the factors in this migration. Furthermore, the Refugee Act of 1980, enacted by the U.S. Congress, allowed people needing asylum from their home nations because of political unrest to seek protection in the United States. The act inspired many Asians to migrate to a freer society in the United States. In particular, the Vietnamese population in the United States saw a giant surge, growing by more than 125 percent between 1980 and 1989 and making it the largest immigrant group landing on American shores. A major factor contributing to this surge was the Amerasian Homecoming Act passed by Congress in 1987, which granted immigrant visas to children born to Americans in Vietnam between 1965 and 1975. The term “Amerasian” is used to denote Asians born in their respective Asian nations with at least one American parent; the same term was also applied to children born to Americans in Korea following the Korean War in the 1950’s. Civil unrest and unstable governments in the 1980’s also prompted other
Asians, such as Sri Lankan Tamils and Filipinos, to seek freedom in U.S. society. The biggest contributors to the Asian American population, however, remained India and China. During the 1980’s, the Asian Indian and Chinese populations in the United States grew in proportions similar to the Vietnamese and Korean ones.

The West Coast of the United States, particularly California, and the Pacific state of Hawaii were key regions of settlement for Asians. In the 1990 census, San Francisco led major cities in the continental United States in Asian American population, with about 29 percent of residents reporting Asian ancestry. It was followed by Los Angeles, with about 10 percent, and New York City, with about 7 percent. The state with the highest percentage of Asian Americans in its population was Hawaii, where more than 64 percent of the people in Honolulu described themselves as Asian.

Japanese Americans
Close geographical proximity to East Asia and mass migration from Japan after the fall of its wartime government following World War II paved the way for Hawaii and California as key migrant destinations. On a political level, several landmark events during the 1980’s transformed the relationship between the United States and Japan and encouraged increased growth of the Japanese American population. Under the leadership of U.S. president Ronald Reagan and through congressional legislation, the United States made several agreements to redress America’s infamous internment of Japanese Americans during World War II. In 1980, for instance, Congress passed the Commission on Wartime Relocation and Internment of Civilians Act, which established a commission to investigate the injustices done to Japanese Americans in World War II internment camps. Similar efforts culminated in 1988, when President Reagan signed the Civil Liberties Act, which offered a formal apology to Japanese Americans for internment as well as monetary reparations.

Japanese Americans witnessed great changes in their population during the 1980’s. Japanese Americans are often described in terms of three generational stages of migration from Japan: Issei (first immigrants from Japan), Nisei (their children), and Sansei (grandchildren of the Issei). The 1980’s witnessed a large Issei population and the maturing of the Nisei and Sansei generations from the post-World War II Japanese immigration. Each of these
generational groups contributed a unique cultural mixture to American society.

The 1980’s also witnessed a resurgence of industrial regions in the United States with strong Japanese ties, which influenced growth in the Japanese American population. In Detroit, growth of the automobile industry spurred higher levels of competition as well as collaboration between American and Japanese automakers. The cultural exchanges that followed resulted in unique cultural cross-fertilization between the two nations, with Japanese Americans in Detroit as key facilitators of these exchanges. American jazz music, synonymous with the U.S. South, was embraced in the Japanese community, resulting in a Japanese American fusion jazz that appealed to both cultural groups. Baseball, regarded as America’s pastime, became quite popular in Japan, and Japanese baseball attracted more than twenty million fans toward the end of the 1980’s. From San Francisco to Detroit, Japanese Americans shared their rich cultural heritage in many ways. With their predominantly Buddhist religion and annual Obon festival commemorating their ancestors, Japanese Americans helped transform the American perception of Japan. A key aspect of this change was in the tourism business: In the 1980’s, Japanese Americans encouraged their fellow Americans to visit their ancestral homeland with the slogan “Exotic Japan,” much different from the simple “Discover Japan” theme of the 1970’s.

Chinese Americans
As the initial Asian migrant group, Chinese Americans share a special relationship with the United States that goes back to about 1850, when the first Chinese migrants landed in California. Migrants of Chinese ethnicity assimilating to American culture were not entirely from mainland China: Many Vietnamese, Cambodian, and Laotian refugees of the post-Vietnam War years were of Chinese ethnicity. The years leading up to the 1980’s witnessed a rapid transformation in the perception of Chinese Americans by the mainland Chinese and consequently in the immigration and subsequent assimilation of the Chinese in the United States. Mao Zedong’s Cultural Revolution of the 1960’s and 1970’s had regarded Chinese Americans as “capitalistic traitors.” This attitude changed after Deng Xiaoping consolidated power as China’s leader by 1981. He viewed Chinese Americans as key players, not rivals, and as a
vital link in facilitating improved business between China and the United States. The ethnic enclave of Chinatown in San Francisco, which first emerged in the mid-nineteenth century, continued to boast the largest Chinese community outside Asia. In 1983, Lily Lee Chen became the first Chinese American mayor of an American city when she took over as mayor of Monterey Park, near Los Angeles. The 1980’s also witnessed enormous growth in the Chinese American population in Southern California.

Toward the end of the 1980’s, Chinese Americans came together to help their counterparts in mainland China. A defining moment came in 1989, when the government crackdown on the student-led Tiananmen Square demonstrations in China and the subsequent unrest in Chinese society left many Chinese students stranded in the United States. That year, by way of an executive order issued by President George H. W. Bush, these students were allowed to stay in the United States.
South Asian Americans
Historically, Punjabi Indians from northwestern India were the first South Asians to immigrate to the United States, arriving in California during the late nineteenth century. The 1980’s, however, were marked by a large Pakistani and South Indian migration into the United States. In the heyday of British colonization, the territory of South Asia, formerly British India, comprised a region extending from modern Pakistan to Bangladesh. In the years following independence from the United Kingdom, India and Pakistan underwent territorial disputes over the Muslim regions, and three regional wars ultimately led to the birth of Bangladesh in 1971. The tumultuous years of the 1970’s prompted many South Asians to seek opportunities in the growing U.S. economy and start new lives. Key factors contributing to the large South Asian migration to the United States in the 1980’s were the pursuit of better standards in higher education and employment in high-technology industries. Several prestigious colleges in New Jersey, California, Texas, and Illinois attracted bright South Asian students, who then assimilated into American society to create a unique South Asian American group in these regions. An advantage for South Asians in the acculturation process was the English language left over from British rule.
The eastern United States became a key region for settlement. The New York metropolitan area, extending into New Jersey, became the largest settlement region for South Asians in the 1980’s, a trend that continues today. Another key region was California, where toward the end of the 1980’s the growth in the computer and microprocessor industries created an employment gold mine that attracted South Asians. Silicon Valley, a popular term for a region of Northern California, saw some of the largest settlements of South Asians and computer firms.

Impact
The growth of the Asian American population in this decade represented a true success story in the ongoing experiment of a multicultural society in the United States. Today, with a population of more than 13 million, or 4 percent of the total U.S. population, Asian Americans represent a vital component of American development, both economically and culturally, and have contributed significantly to technology, politics, and education.

Further Reading
Ancheta, Angelo N. Race, Rights, and the Asian American Experience. New Brunswick, N.J.: Rutgers University Press, 2000. Explores the history of civil rights for Asian Americans, with an emphasis on their contribution to American society.
Louie, Steven G. Asian Americans: The Movement and the Moment. Los Angeles: UCLA Asian American Studies Center Press, 2001. Presents an examination of various Asian American issues in relation to their place in U.S. history.
Shen Wu, Jean Yu-Wen, and Min Song, eds. Asian American Studies: A Reader. New Brunswick, N.J.: Rutgers University Press, 2000. Addresses the evolution of Asian American societies in relation to their development of nationalism and civil rights.
Zia, Helen. Asian American Dreams: The Emergence of an American People. New York: Farrar, Straus and Giroux, 2000. Presents the historical development of Asian American identity across both difficult and welcoming periods, with special reference to Southeast Asians.
Aswin Subanthore
See also China and the United States; Demographics of the United States; Globalization; Immigration Reform and Control Act of 1986; Immigration to the United States; Japan and North America; Multiculturalism in education; Racial discrimination.
■ Aspartame
Definition: Low-calorie artificial sweetener
For those seeking to avoid sugar, aspartame provided a safe alternative to cyclamates, which were banned in the United States in 1969, and saccharin, which had been determined to cause cancer in animals.

Aspartame, marketed under various names including NutraSweet and Equal, is a nontoxic, noncarbohydrate, low-calorie, easily digestible, general-purpose sweetener approved for use in food and beverages by the Food and Drug Administration (FDA). It has a taste similar to sucrose but is 180 to 200 times sweeter. It is used in thousands of products worldwide. Aspartame is considered a low-calorie sweetener rather than a no-calorie sweetener, since it contains four calories per gram. Unlike sucrose, aspartame does not promote tooth decay.

Aspartame was discovered by James M. Schlatter in 1965 while he was working at G. D. Searle, a pharmaceutical company. Schlatter was synthesizing a tetrapeptide (a type of protein consisting of four amino acids) normally manufactured by the stomach. A dipeptide (consisting of two amino acids) created during an intermediate step in the synthetic process contained aspartic acid and the methyl ester of phenylalanine. When Schlatter licked his finger while reaching for a piece of paper, he sensed sweetness. He was able to trace the sweet taste to the dipeptide. Eventually, G. D. Searle was convinced of the potential value of the chemical, which it called aspartame, and the company decided to research its use as a sweetener.

After some initial concerns that aspartame could be linked to brain cancer, the substance was approved for use in dry goods by the FDA in 1981 and for use in carbonated beverages in 1983. By 1996, the FDA had removed all restrictions on the use of aspartame in food products, and it was classified as a general-purpose sweetener for use in food and beverages. Although some still claim that aspartame is not safe for human consumption, many studies since the early 1980’s have reaffirmed its safety. Since persons with phenylketonuria (PKU) have difficulty metabolizing phenylalanine, they must restrict their intake of that substance. As a result, all U.S. products containing aspartame are labeled “Phenylketonurics: Contains phenylalanine.”
Impact
The approval of aspartame provided a safe alternative artificial sweetener to the banned cyclamates and to saccharin, whose safety was in question. Companies manufacturing products that contained aspartame in the early and mid-1980’s touted the superiority of the substance, usually under one of its trade names. They invested considerable sums marketing NutraSweet, Equal, and the diet foods that used them as ingredients, thereby driving both sales and the belief that purchasing diet foods was important to weight loss and obesity control.

Further Reading
Stegink, Lewis D., and L. J. Filer, eds. Aspartame: Physiology and Biochemistry. New York: Marcel Dekker, 1984.
Tschanz, Christian, Frank N. Kotsonis, and W. Wayne Stargel, eds. Clinical Evaluation of a Food Additive. Boca Raton, Fla.: CRC Press, 1996.
Charles L. Vigue
See also Caffeine; Cancer research; Consumerism; Diets; Food Security Act of 1985; Medicine; Science and technology.
■ Astronomy
Definition: Study of extraterrestrial objects and phenomena
Innovative theories and unforeseen discoveries changed astronomers’ conceptions about the structure and development of the universe in the 1980’s, while space probes discovered many new bodies in the solar system.

The universe came to seem a bit stranger during the 1980’s, as astronomers discovered large-scale phenomena that challenged prevalent theories and cosmologists developed a startlingly novel explanation for the structure of the universe. Meanwhile, information gathered by the Voyager 1 and Voyager 2 space probes revealed that the solar system’s small corner of the universe was a busier place than had theretofore been recognized.

The Inflationary Universe
Several important theoretical advances occurred in the 1980’s. In 1981, the physicist Alan Guth of the Massachusetts Institute of Technology proposed a novel idea to account for puzzling observations about the size and structure of the universe.
Guth asserted that, contrary to theoretical expectations, the universe is flat, homogeneous, and uniform in all directions. Soon after its origin in the big bang, Guth argued, the universe must have undergone a phase transition—a period of rapid expansion during which the strong nuclear force was disassociated from the electromagnetic and weak forces. This “inflationary model” quickly became a fundamental part of cosmological theory. In 1987, Eugene Parker explained how the Sun’s corona is heated by energy released in “micro flares” as the solar magnetic field continuously oscillates. A year later, Walter Johnson and Joseph Hollweg published a model of hot, proton-producing coronal holes that accounted for the fast solar wind. Also in 1988, Martin Duncan, Thomas Quinn, and Scott Tremaine solved a puzzle about the solar system’s short-period comets by demonstrating that most of them originate in the Kuiper Belt beyond Neptune, instead of in the more distant Oort Cloud, as had previously been believed.

Several discoveries were made in deep space that were important to theories about the universe’s structure. In 1981, a giant void—100 million light-years of empty space—was found in the constellation Boötes, and in 1987 several American scientists accumulated evidence that a large number of galaxies within 200 million light-years of the Milky Way were being drawn in the direction of the constellations of Hydra and Centaurus toward a point that they called the Great Attractor. In 1989, Margaret Geller and John Huchra mapped a similarly immense structural feature, the Great Wall (later the CfA2 Great Wall), a plane of galaxies occupying a space 200 million by 500 million by 15 million light-years in volume. These bodies called into question existing cosmological theories, which were unable to account for the formation of such large superstructures.

In 1982, astronomers first detected the Sunyaev-Zel’dovich effect, a peculiarity in the cosmic microwave background radiation that had been predicted in 1969. In 1987, Canadian astronomer Ian Shelton discovered a supernova, dubbed SN 1987a, in the Large Magellanic Cloud, adjacent to the Milky Way.

A model of the Voyager spacecraft’s trajectory through the solar system. Voyager 1 and Voyager 2 contributed greatly to astronomers’ knowledge of the solar system in the 1980’s. (NASA CORE/Lorain County JVS)

Space Probes and Observatories
Voyager 1 passed by Saturn in 1980, detecting three new moons and finding new complexities in the planet’s rings. Voyager 2 flew by the next year, probing Saturn’s atmospheric temperature and density, and then went on to Uranus in 1986 and Neptune in 1989, discovering or confirming the existence of ten moons and seven moons, respectively. Voyager 2 also revealed a magnetic field around Uranus and discovered the Great Dark Spot, a hole in Neptune’s cloud cover.

Earth-orbiting observational satellites brought increased variety and power to astronomers. The Solar Maximum Mission, launched in 1980, proved that the Sun’s energy output varies. The Infrared Astronomical Satellite (IRAS), launched in 1983, spent ten months surveying intra- and extra-galactic infrared sources, produced the first images of the Milky Way’s core, and discovered five new comets in the solar system. SPARTAN 1 (1985) and instruments aboard Spacelab 1 (1983) and Spacelab 2 (1985), all deployed from space shuttles, gathered
data on X-ray sources, especially galactic clusters and candidate black holes. In 1989, the Cosmic Background Explorer (COBE) was launched to map variations in the universe’s cosmic microwave background radiation. On Earth, meanwhile, the Very Large Array, twenty-seven radio telescopes near Socorro, New Mexico, became operational in 1980, and the fifteen-meter-wide James Clerk Maxwell telescope atop Mauna Kea, Hawaii, began detecting light from far infrared to microwave frequencies in 1987.

Impact
Several of the astronomical observations made in the 1980’s had significant implications. Supernovas, for example, can provide means of measuring intergalactic distances and of testing cosmological theories, and SN 1987a served these functions in two ways. First, neutrinos emitted by the supernova were detected on Earth, providing the first empirical evidence that gravity affects matter, antimatter, and photons in similar ways. Second, geometrical measurements of light sent out by SN 1987a confirmed the value of the Hubble constant, a crucial component of astronomical theory.

The large-scale clumping and massive voids discovered in the universe’s structure were also important. These structures—including the Great Attractor, the Boötes void, and the Great Wall—suggested that early fluctuations had occurred in the expansion rate of the universe. These fluctuations, predicted by Guth’s inflationary model, were confirmed in 1990 by data that the COBE satellite gathered. These data indicated that the universe went through a period of exponential growth soon after the big bang; their collection is widely regarded as constituting the most significant astronomical observation made during the late twentieth century.

Further Reading
Hartmann, William K. Moons and Planets. 3d ed. Belmont, Calif.: Wadsworth, 1992. This classic text is a moderately technical survey of planetary science; includes discoveries during the 1980’s from space-based and ground telescopes, as well as many photographs and illustrations.
Leverington, David. A History of Astronomy from 1890 to the Present. New York: Springer, 1995. Leverington provides a highly detailed, mostly nontechnical account of modern astronomy, emphasizing the great changes in technology and theory. With illustrations.
North, John. The Norton History of Astronomy and Cosmology. New York: W. W. Norton, 1995. Surveys the scientific history of astronomy and places its concepts in the larger Western intellectual tradition. The final chapter briefly recounts advances made in astrophysics, black hole theory, and cosmology during the 1980’s.
Schorn, Ronald A. Planetary Astronomy from Ancient Times to the Third Millennium. College Station: Texas A&M University Press, 1998. Provides a thorough, nontechnical survey of the history of astronomy as a science. The last chapter discusses the Voyager flybys of Jupiter, Saturn, Uranus, and Neptune and the discoveries made as a result.
Sheehan, William. Worlds in the Sky: Planetary Discovery from Earliest Times Through Voyager and Magellan. Tucson: University of Arizona Press, 1992. This agreeably written, nontechnical narrative provides basic historical background to the progress in astronomy during the 1980’s.
Roger Smith
See also Science and technology; SETI Institute; Space exploration; Space shuttle program.
■ AT&T breakup
The Event: Forced fragmentation of a telephone company’s monopoly
Date: Settlement made on January 8, 1982
AT&T, a government-regulated monopoly providing local and long-distance telephone service in the United States, settled a long-running government antitrust suit by agreeing to divide itself into seven independently owned regional local telephone companies and an unregulated national long-distance company. The new national company would also be free to enter emerging communications and computer markets.

By the mid-1970’s, American Telephone and Telegraph (AT&T) was the sole provider of telephone service and equipment for most consumers in the United States. Early in the twentieth century, the company had eliminated its independent competitors in the telephone industry by refusing to offer long-distance service or equipment to the local independent phone companies, which at the time collectively controlled one-half of the U.S. market. As a result, the independent companies were either
destroyed or acquired by AT&T (also known as the Bell system). In the wake of this development, which was mitigated only slightly by federal antitrust prosecution, state and federal laws were passed to regulate the telephone industry. This regulation limited the prices AT&T could charge for local and long-distance service, but it also largely protected the company from new competition. In 1934, the Federal Communications Commission (FCC) was created to oversee the federal regulations.

When the FCC began to encourage competition for the long-distance market in the 1970’s, AT&T responded by delaying and often refusing to interconnect competitors’ equipment and long-distance calls with its own system, effectively preventing customers from placing and receiving phone calls using any equipment or any services other than those of AT&T. In response, in 1974, the Antitrust Division of the U.S. Department of Justice once again sued AT&T for violating antitrust laws.

AT&T fought the case tooth and nail. For four years, the company filed motion after motion, claiming that it was either exempt from the antitrust laws or subject to the exclusive jurisdiction of the FCC and not the courts. All of these attempts to end the case were unsuccessful. Finally, in 1978, the parties began discovery, the pretrial exchange of documents and the taking of statements under oath from witnesses. In 1981, the most important antitrust trial since the Standard Oil case of 1911 finally began. By 1982, most observers believed that the government was on the verge of winning a dramatic victory. Seeing the handwriting on the wall, AT&T settled the case, agreeing to a voluntary breakup.

Under the settlement, the company would have eighteen months to spin off all regulated local phone services into seven new, independent companies dubbed the “Baby Bells.” The Baby Bells would be prohibited from entering the long-distance telephone business until such time as they could demonstrate that they faced significant competition for local phone service. The new AT&T would be a much smaller company, consisting of the long-distance, equipment, and research portions of the old Bell system. It would be allowed, however, to enter new unregulated markets without prior court approval. In 1984, the Baby Bells were created through a multibillion-dollar stock sale to the public, and the new American telecommunications system was born. Judge Harold
Greene of the U.S. District Court for the District of Columbia, the judge who presided over the trial, also oversaw the administration of the settlement agreement and arbitrated the many disputes that arose over the next twelve years, until the U.S. Congress eventually passed the Telecommunications Act of 1996, setting new rules governing both regulation and competition in the telecommunications industry.

Impact
The breakup of AT&T into the seven Baby Bells and a new, unregulated AT&T changed everything about the American telephone industry. It reintroduced competition into the industry, although little new competition was created for local telephone service until the introduction of cell phones and the beginning of Internet phone service. Perhaps more important, the breakup meant that the nation’s telecommunications infrastructure was in multiple hands. Beginning in the 1980’s, a customer in one region seeking to place a long-distance call to another region necessarily sent signals through multiple companies’ equipment. As a result, standards had to be adopted and maintained that allowed the various companies effectively to interconnect their systems.

Subsequent Events
Over the years following the breakup, many of the Baby Bells were allowed to merge and reenter the long-distance communications market. Ironically, the competition from these matured Baby Bells proved to be more than AT&T could take: In 2005, it was announced that one of the most powerful Baby Bells, SBC Communications, would purchase AT&T for $16 billion. Because the latter company had the more recognizable name, SBC renamed itself AT&T after the merger was completed.

Further Reading
The AT&T Breakup: 20 Years of Confusion. Available at http://consumeraffairs.com/news04/att20.html. Consumer-oriented history of the breakup’s aftermath.
Benjamin, Stuart Minor, et al. Telecommunications Law and Policy. 2d ed. Durham, N.C.: Carolina Academic Press, 2006. Details both the law governing U.S. telecommunications and the public policy decisions behind it.
Forest, Herbert E. After the AT&T Settlement: The New Telecommunications Era. New York: Practising Law
Institute, 1982. Extremely comprehensive practical legal analysis prepared for distribution to lawyers and other professionals attending a 1982 conference on the settlement.
National Association of Attorneys General. The AT&T Settlement: Terms, Effects, Prospects. New York: Law & Business, 1982. Document designed to aid state attorneys general in their responses to the AT&T breakup and oversight of the transformed telecommunications industry.
Spencer Weber Waller
See also Business and the economy in the United States; Cell phones; Information age.
■ Atlanta child murders
Identification: A series of killings in the city of Atlanta
Date: 1979-1981
Place: Atlanta, Georgia
The murders of black children and young adults and law enforcement’s slow response to the initial crimes called attention to the vulnerability of poor, black children in what was considered to be an economically advanced and racially enlightened Southern city. They also drew attention to the phenomenon of the serial killer.

Although the murders began in 1979, they did not come to general attention until 1980, and no suspect was found until 1981. At the end of the 1970’s, the city of Atlanta had one of the highest crime rates in the United States. Much of the crime went unnoticed, however, even by the local newspapers. On July 28, 1979, the bodies of two African American children from different housing projects in different parts of Atlanta were discovered. The deaths of fourteen-year-old Edward Hope Smith and thirteen-year-old Alfred James Evans were followed in September by the disappearance of Milton Harvey, age seven. In October, Yusef Bell, age nine, also disappeared. The body count quickly accelerated throughout the summer of 1980 into a list that would eventually grow to thirty names by 1981.

Parents and Police Take Action
A group of parents whose children had been victims, along with their neighbors and other concerned citizens, formed the Committee to Stop Children’s Murders (STOP) on April 15, 1980. The group set up a hotline, provided
safety education, hired private investigators, and held press conferences. They treated the increasing number of deaths and disappearances of children as related. At first, however, the police investigated each murder individually. As the disappearances continued, the Atlanta police finally recognized that there might be a relationship among the murders. On August 14, 1980, they formed the Missing and Murdered Task Force to deal with the series of murders. In September, the Federal Bureau of Investigation (FBI) entered the investigation at the behest of Attorney General Griffin Bell, after the mayor of Atlanta, Maynard Jackson, asked the White House for help. The official case name assigned by the FBI was ATKID, also known as Major Case 30. Eventually, law-enforcement officials were able to provide a profile of a serial killer who targeted young, black males and to collect enough evidence to solve at least some of the murders.

Arrest and Trial
On May 22, 1981, police in the area stopped the car of Wayne Bertram Williams, a twenty-three-year-old African American man, after hearing a splash off the James Jackson Parkway Bridge. Two days later, they discovered the body of Nathaniel Cater, a twenty-seven-year-old man, in the Chattahoochee River near the bridge. On June 21, 1981, the police arrested Williams for the murders of Cater and another victim, Jimmy Payne, whose killing was considered the last in the series of murders. Jury selection for the trial of Williams began on December 28, 1981, in Fulton County Superior Court, with Judge Clarence Cooper presiding.
Convicted killer Wayne Bertram Williams, center, appears at a 1983 news conference at Atlanta’s Fulton County jail, at which he protests his innocence. (AP/Wide World Photos)
Nine women and three men were chosen. The trial began on January 6, 1982. Williams was represented by Atlanta attorney Mary Welcome and attorneys Alvin Binder and Jim Kitchens from Jackson, Mississippi. District Attorney Lewis Slaton led the prosecution team. The prosecution presented a wide array of blood, fiber, and hair evidence tying Williams not only to the two victims with whose murders he had been charged but to ten other murders as well. In addition, a series of eyewitnesses testified to seeing Williams with some of the victims. On February 27, 1982, the jury found Williams guilty of the two murders. Two days later, members of the task force declared that Williams had killed twenty-one others on the list, and these cases were declared solved. Williams was sentenced to two consecutive life terms in prison.

Aftermath and Controversy
Although there was a certain relief at the conclusion of the Williams trial, many people remained critical both of the way in which the deaths of so many African American children had been handled and of the prosecution’s perceived dependence on circumstantial evidence, which left open the possibility that not all of the murders attributed to Williams had been committed by him. It was hard for some to conceive of a black serial killer, and the kind of profiling that would later become an accepted part of investigative practice was just establishing itself when the Atlanta murders came to public attention. Some even believed that Williams was innocent of all the murders, although they were in the minority.
Impact
The series of youth murders focused attention on the previously invisible poverty and crime that haunted Atlanta and especially on the inadequacy of police protection and investigation in the city’s poor, black neighborhoods. In the end, it became clear that the poor, black residents of Atlanta suffered disproportionately from rampant crime and lack of police protection, a problem that would prove to be endemic—and would only worsen—in urban areas throughout the United States during the 1980’s.

Further Reading
Dettlinger, Chet, with Jeff Prugh. The List. Atlanta: Philmay, 1983. Criticism of the way in which the murder investigation was handled that calls attention to other murders not on the official “list” established by the task force. Coauthored by a
former Atlanta police officer and a journalist and nominated for a Pulitzer Prize. Includes the list, maps, and photographs of the victims.
Douglas, John, and Mark Olshaker. Mindhunter: Inside the FBI’s Elite Serial Crime Unit. New York: Lisa Drew/Scribner, 1995. The chapter “Atlanta” tells the story of the murders from the perspective of one of the FBI’s best-known profilers, who was involved in the investigation of the murders, the arrest of Wayne Williams, and the prosecution’s trial strategy. Dispels myths about the involvement of the Ku Klux Klan and takes the position that Williams killed at least eleven of the victims.
Headley, Bernard. The Atlanta Child Murders and the Politics of Race. Carbondale: Southern Illinois University Press, 1998. Academic study by a professor of criminology and criminal justice. Covers the various reactions to the crimes, the trial of Wayne Williams, and the verdict; takes the position that Williams was guilty of at least twenty-three of the thirty murders. Contains appendixes with a complete list, photographs, and details of the murders, as well as the guidelines of the task force established to solve them.
Lopez, Nancy. “The City It Always Wanted to Be: The Child Murders and the Coming of Age of Atlanta.” In The Southern Albatross: Race and Ethnicity in the American South, edited by Philip D. Dillard and Randal L. Hall. Macon, Ga.: Mercer University Press, 1999. Places the child killings in the context of the economic, political, and racial history of Atlanta.
Susan Love Brown
See also African Americans; Crime; Lucas, Henry Lee; Night Stalker case; Racial discrimination.
■ Atwater, Lee
Identification: Political strategist and chairman of the Republican National Committee in 1989
Born: February 27, 1951; Atlanta, Georgia
Died: March 29, 1991; Washington, D.C.
A pioneer in the business of political consulting and the manager of George H. W. Bush’s 1988 presidential campaign, Atwater was responsible for bringing a highly personal and confrontational style of negative campaigning to the forefront of American politics.
A charismatic political strategist who spent his formative years in the Deep South, Lee Atwater capitalized on his innate ability to understand American cultural sensibilities and political trends to become one of the foremost political strategists of the 1980’s. Following his successes in a number of state campaigns, including serving as the political director for Dixiecrat-turned-Republican senator Strom Thurmond of South Carolina, Atwater achieved national recognition for his work as the southern regional coordinator of Ronald Reagan’s 1980 presidential campaign. Following the Republican victory in that race, President Reagan named Atwater White House deputy political director, a post he held for the first four years of the Reagan administration. In 1984, following the conclusion of President Reagan’s successful reelection campaign, Atwater returned to the private sector, merging his boutique political consulting firm with the larger firm of Black, Manafort, and Stone.

Shortly after Atwater left the Reagan White House, Vice President George H. W. Bush asked him to manage his 1988 presidential campaign. In crafting a campaign strategy for Bush and his running mate, Indiana senator Dan Quayle, Atwater relied on many of the strategies he had used successfully during his earlier campaigns, including embracing a combination of traditional and populist ideas and values to appeal to the swing voters who subscribed to those values. Thus, Bush and Quayle campaigned by advocating for strong defense, limited government, low taxes, and school prayer. Bush and Quayle were opposed in the presidential race by Democrats Michael Dukakis—the governor of Massachusetts—and his vice presidential running mate, Texas senator Lloyd Bentsen. Atwater portrayed Dukakis and Bentsen as captives of an East Coast liberal establishment that was out of touch with ordinary Americans.

The most notable part of the 1988 campaign was Atwater’s tactical use of race to depict Governor Dukakis as being weak on crime. Atwater and his colleagues created a television commercial featuring a convicted African American murderer named William Horton. Horton had been serving a life sentence in a Massachusetts penitentiary when he was granted a weekend-long furlough in 1986 under a controversial Massachusetts state law.
At an October, 1988, presidential campaign rally, Lee Atwater— George H. W. Bush’s campaign manager—holds up a sign proclaiming his support for the candidate. (AP/Wide World Photos)
He did not return to prison at the end of the weekend, and less than a year later, Horton raped a woman in Maryland. Atwater’s commercial blamed Dukakis for the furlough program and intimated that such episodes would recur under a Dukakis presidency. The advertisement proved successful in frightening some voters into supporting Bush. Together with Atwater’s other tactics, it helped the Bush-Quayle ticket win the presidency in 1988. Following his electoral victory, President Bush named Atwater chairman of the Republican National Committee in 1989. Atwater became the first political consultant in the history of either party to be named to lead a national political organization.

Impact
Atwater’s approach to political campaigns proved extremely influential. Not only was it partly
responsible for the victories of Reagan and, especially, Bush, but it also influenced the strategies and tactics adopted in future campaigns. Such campaigns often became more focused on personal attacks than on policy issues, concentrating especially on making voters fear the opposing candidate rather than on cultivating trust in one’s own. This approach was certainly not new, nor was it solely the province of the Right, but it did increase both in frequency and in social acceptability following the 1988 campaign. In addition, Atwater helped modernize the business of political consulting, influencing many of the private-sector political strategists and tacticians who followed him. His use of cultural tastes and trends to influence the voting behavior of specific demographics also set a precedent that would continue into the future.

Further Reading
Alterman, Eric. “GOP Chairman Lee Atwater: Playing Hardball.” The New York Times, April 30, 1989.
Oreskes, Michael. “Lee Atwater, Master of Tactics for Bush and G.O.P., Dies at 40.” The New York Times, March 30, 1991.
Parmet, Herbert S. George Bush: The Life of a Lone Star Yankee. Piscataway, N.J.: Transaction, 2000.
Laurence R. Jurdem
See also Bentsen, Lloyd; Bush, George H. W.; Conservatism in U.S. politics; Dukakis, Michael; Elections in the United States, 1980; Elections in the United States, 1988; Horton, William; Quayle, Dan; Reagan, Ronald.
■ Auel, Jean M.
Identification: American fiction author
Born: February 18, 1936; Chicago, Illinois
Auel’s Earth’s Children series offers a well-researched fictional account of life during the Stone Age. Written with strong feminist undertones, her novels appeared at a time when interest in prehistory was increasing and the role of women in society was becoming heavily debated.

Jean M. Auel went from technical writer and poet to best-selling fiction author with her Earth’s Children series of novels. The first three of the novels appeared in the 1980’s: The Clan of the Cave Bear (1980), The Valley of Horses (1982), and The Mammoth
Hunters (1985). Auel would later continue the series with The Plains of Passage (1990) and The Shelters of Stone (2002). Drawing extensively on archaeological research, she is credited with weaving fact with creative thought to offer a glimpse of what life may have been like for Neanderthals and for Cro-Magnon humans, as well as providing some perspective on the evolutionary process of Homo sapiens. Her award-winning series touches on the history of prehistoric humans, evolutionary processes, matriarchal and patriarchal societies, and the gender roles prevalent in each culture. The series has received both acclaim and criticism relating to its historical accuracy, portrayal of a feminist utopia, and anachronistic mirroring of political and social themes of the 1980’s. Two of her novels were adapted for film: The Valley of Horses in 1984 and The Clan of the Cave Bear, starring Daryl Hannah, in 1986.

Impact
Because Auel’s fiction is so heavily blended with researched facts, her works provide a theoretical basis for understanding prehistoric life and society. Because she details and contrasts both matriarchal and patriarchal societies through the eyes of a female protagonist, her works have been praised for offering a strong feminist perspective for mainstream audiences. Auel’s, moreover, is a recognizably second-wave brand of feminism, distinctive of the 1980’s, in that her main character, Ayla, deals not merely with “legislated” inequality but also with patriarchal ideas and behaviors in ways that model the second-wave feminist slogan, “the personal is political.”

Further Reading
Andrade, Glenna Maire. (Re)Envisioned (Pre)History: Feminism, Goddess Politics, and Readership Analysis of Jean M. Auel’s “The Clan of the Cave Bear” and “The Valley of the Horses.” Kingston: University of Rhode Island, 1998.
“Jean M(arie) Auel.” In Twentieth-Century Romance and Historical Writers. 3d ed. Edited by Aruna Vasudevan. London: St. James Press, 1994.
Wilcox, Clyde. “The Not-So-Failed Feminism of Jean Auel.” Journal of Popular Culture 28 (Winter, 1994): 63-70.
Susan E. Thomas
See also Archaeology; Feminism; Film in the United States; Hannah, Daryl.
B

■ Baby Fae heart transplantation
The Event: The first cross-species heart transplant into a human infant
Date: October 26, 1984
Place: Loma Linda, California
A group of physicians at Loma Linda University Medical Center in Loma Linda, California, performed the world’s first animal-to-human heart transplant in a newborn when they placed a baboon’s heart into the chest of a twelve-day-old infant named Baby Fae. This highly experimental procedure, which ultimately failed, opened a Pandora’s box of ethical, moral, scientific, and legal issues.
indication that cross-species transplants would succeed. Animal-rights activists were troubled by the sacrifice of a healthy animal. Legal scholars objected to the use of a minor in such a highly experimental procedure. The consent obtained from Baby Fae’s parents came under scrutiny by the National Institutes of Health, which concluded that the LLUMC physicians were overly optimistic in regard to the baby’s long-term chances of survival and had failed to discuss the possible use of a human heart to save her. Ethicists condemned the xenotransplantation, because they felt that the procedure was not in the baby’s best interests: Palliative surgery that had recently been developed by William Norwood would have given the baby a 40 percent chance of survival. Animal-to-human organ transplantation did not be-
Baby Fae lies in the Loma Linda University Medical Center on October 30, 1984, four days after her heart was replaced with that of a baboon. (AP/Wide World Photos)
The Eighties in America
Baby Jessica rescue
■
85
come an accepted medical practice after the Baby Fae case, and even research related to such transplantation became heavily regulated by the Food and Drug Administration (FDA). Further Reading
Bailey, Leonard, et al. "Baboon-to-Human Cardiac Xenotransplantation in a Neonate." The Journal of the American Medical Association 254, no. 23 (December, 1985): 3321-3329.
Sharp, Lesley. "Human, Monkey, Machine." In Bodies, Commodities, and Biotechnologies: Death, Mourning, and Scientific Desire in the Realm of Human Organ Transfer, edited by Sharp. New York: Columbia University Press, 2006.
Elisabeth Faase
See also Abortion; Artificial heart; Fetal medicine; Genetics research; Health care in Canada; Health care in the United States; Medicine; Science and technology.
■ Baby Jessica rescue
The Event A toddler trapped in a well is rescued on live television
Date October 14-16, 1987
Place Midland, Texas
The dramatic rescue of Baby Jessica, trapped in an abandoned well, transfixed the nation for several days. The burgeoning field of cable news television enabled Americans to follow this story of peril and triumph around the clock.
On the morning of October 14, 1987, twenty-two-month-old Jessica McClure was playing at her aunt's house in Midland, Texas. While her mother's attention was diverted, Jessica fell into an abandoned well in the backyard and became wedged in the shaft, twenty-two feet below ground. Emergency personnel responded quickly, and within hours dozens of people were on hand to help with the rescue of "Baby Jessica." The story rapidly grew beyond the confines of the small Texas town.
Coverage of the event was unprecedented. Whereas this type of human-interest story had always been considered newsworthy, exposure was generally limited to short segments on local and national evening news broadcasts. However, the fledgling Cable News Network (CNN) provided
Baby Jessica is cradled safely in the arms of a rescue worker, after spending three days trapped in an abandoned well. (AP/Wide World Photos)
twenty-four-hour coverage of the toddler's plight, allowing the entire country to watch as the drama unfolded. As news of Baby Jessica's ordeal spread, volunteers descended on Midland to offer their aid; firefighters, paramedics, police officers, and construction workers all came to lend equipment or a helping hand. Others helped by sending monetary contributions or messages of moral support.
The mouth of the well was only eight inches in diameter, preventing many traditional rescue tools from being used to free the trapped child. After much discussion, rescue crews decided that, instead of trying to widen the shaft, they would dig another hole alongside the well. Through the larger tunnel, the rescuers hoped to break into the well and retrieve Jessica. In an effort to keep the toddler calm, rescuers entertained Jessica by talking and singing to her. Jessica did not seem overly frightened by her situation; throughout the ordeal, she sang nursery rhymes, slept periodically, and only rarely cried.
Rescuers worked tirelessly for almost three days. Finally, after Jessica had spent fifty-eight hours in the well, the world watched as paramedic Robert O'Donnell pulled her to safety at 7:55 p.m. on October 16. Miraculously, Jessica sustained only minor wounds and, for the most part, came out of the ordeal with only superficial scars. Her most significant injury was sustained when gangrenous portions of the toes on her right foot had to be removed.
Impact Baby Jessica exited the national spotlight almost as quickly as she had entered it. With the exception of her participation in a Fourth of July parade in Washington, D.C., and appearances at a handful of small events in her native Texas, Jessica's public appearances were limited. Interviewed a year after the event, Baby Jessica's parents said that the toddler remembered little of the incident and, aside from the physical scars, suffered no lingering effects. Everybody's Baby, a made-for-television movie about the event, aired in 1989. The production not only dramatized the rescue of Baby Jessica but also, through its title, indicated how television had transformed the local, small-town incident into an event that played out on the national stage, capturing the attention of the entire country.
Further Reading
Garner, Joe. Stay Tuned: Television's Unforgettable Moments. Kansas City, Mo.: Andrews McMeel, 2002.
McClure, Chip. Halo Above the City. Flint, Tex.: Still Sprint, 1997.
Matthew Schmitz
See also Cable television; CNN; Journalism; Television.
■ Back to the Future
Identification Science-fiction comedy adventure film
Director Robert Zemeckis (1952-    )
Date Released July 3, 1985
Back to the Future blended lighthearted adventure with a science-fiction time-travel plot, appealing to a broad audience and becoming a major hit. The blockbuster, which spawned two sequels, was the first of director Robert Zemeckis's spectacle-driven, effects-laden films, and it was co-executive-produced by Steven Spielberg.
In Back to the Future, Marty McFly (Michael J. Fox) is a high school senior from a dysfunctional family and friend to eccentric scientist Doc Brown (Christopher
Lloyd), who has been buying nuclear fuel from Libyan terrorists to power the time machine he has built out of a DeLorean car. When the terrorists shoot him, Marty escapes back to 1955 in an attempt to warn him. He meets the younger Doc Brown and also assists his parents in the early stages of their courtship. He initially endangers their relationship, almost erasing himself from existence, but ultimately changes his personal history for the better, causing his family and himself to have a better life in 1985.
The movie won a Hugo Award, awarded by attendees at the annual World Science Fiction Convention, as the year's best science-fiction film. It grossed $210 million in its initial release, the most of any movie that year. The film seemed to strike a chord with 1980's culture, as tales of time travel, especially those in which the protagonists fixed problems in history, were widespread in the decade. These stories formed the subjects of the films Time Bandits (1981), Peggy Sue Got Married (1986), and Star Trek IV: The Voyage Home (1986), as well as the television series Voyagers! (premiered 1982) and Quantum Leap (premiered 1989).
Impact As one of the most widely seen films of the decade, Back to the Future influenced the catchphrases of 1980's American culture, and it was referred to in sources as diverse as television commercials and President Ronald Reagan's 1986 State of the Union address, in which he quoted a line from the movie ("Where we're going, we don't need roads"). The film had not been conceived as part of a franchise, but its success and the popularity of the other time-travel films and television shows resulted in the simultaneous filming of two sequels to be assembled and released separately—Back to the Future Part II (1989) and Back to the Future Part III (1990)—as well as the release of comic books, novelizations, video games, toys, and an animated television series.
Further Reading
Clute, John, and Peter Nicholls, eds. The Encyclopedia of Science Fiction. London: Little, Brown, 1993.
Gipe, George. Back to the Future. New York: Berkley, 1987.
Kagan, Norman. The Cinema of Robert Zemeckis. Lanham, Md.: Taylor Trade, 2003.
Klastorin, Michael, and Sally Hibbin. Back to the Future: The Official Book of the Movie. London: Hamlyn, 1990.
Paul Dellinger
See also Action films; De Lorean, John; Family Ties; Film in the United States; Fox, Michael J.; Science-fiction films; Sequels; Special effects; Spielberg, Steven.
■ Bakker, Jim and Tammy Faye
Identification Evangelical minister and his wife
Jim Bakker
Born January 2, 1940; Muskegon, Michigan
Tammy Faye Bakker
Born March 7, 1942; International Falls, Minnesota
Died July 20, 2007; Loch Lloyd, Missouri
Charismatic televangelist preachers Jim and Tammy Faye Bakker used television to bring their ministry into the public eye, creating an empire that brought in millions of dollars each year.
Jim and Tammy Faye Bakker were hosts of the PTL Club, whose initials stood for "Praise the Lord" or "People That Love." Bakker used his boyish good looks, humor, and righteous anger to send his message and draw in viewers. Tammy Faye was best known for her lavish use of makeup. Mascara often ran down her face as she cried while singing or asking for donations. When their talk show premiered in 1976, it was very different from other evangelical television programs, which offered somber, sometimes threatening, messages. The faithful were ready for something different. Millions of middle-class Christians worldwide found it by tuning in daily to watch the Bakkers. The PTL ministry grew quickly, earning more than $129 million per year.
Bakker preached what was called "prosperity theology." What you gave to God, he said, would be returned to you many times over. He believed that God wanted His followers to have the best of everything, including million-dollar homes and expensive cars and clothes.
At the center of the PTL ministry was the Heritage USA theme park and complex. What began in 1978 as a Christian campground soon turned into the third largest theme park in the nation, complete with hotels, restaurants, and a shopping mall. After attending Bible study, prayer meetings, or weekly services, families could relax at the water park.
As the PTL ministry grew during the 1980's, so did the Bakkers' lavish lifestyle. They spent money faster than their followers could send it in. Bakker came up with the idea of PTL lifetime partners. For a one-thousand-dollar donation, partners earned one free weekend stay per year at Heritage hotels for the rest of their lives. Bakker, however, soon found himself being sued, because there were not enough rooms to accommodate everyone who had purchased a lifetime partnership. The lawsuits and the couple's conspicuous consumption led the mainstream media to turn them into symbols of the worst excesses of televangelism.
In 1987, word leaked about a 1980 sexual encounter between Jim Bakker and a church secretary named Jessica Hahn. The incident became public when it was discovered that Hahn had been paid $265,000 to keep quiet. Bakker temporarily stepped aside and let fellow preacher Jerry Falwell take over until things cooled off. When Falwell discovered that the Bakkers were in bad financial shape, he tried to prevent their return. The Bakkers also found themselves under investigation by the Federal Communications Commission (FCC) for using money raised for specific overseas ministry programs to pay for Heritage USA and their personal expenses. In 1989, Bakker was convicted of fraud and conspiracy for stealing donations from the PTL ministry and sent to prison.
Jim and Tammy Faye Bakker record an episode of their television program on August 20, 1986. (AP/Wide World Photos)
Impact The PTL scandals and the Bakkers' excessive lifestyles eventually affected the reputation of other televangelists. The 1987 scandal happened to coincide with one involving televangelist Oral Roberts, who had shocked followers when he implied that if they did not donate $4.5 million to his ministry, God would kill him. Thus, televangelism became associated with corruption and greed during the late 1980's. The Bakkers' greed also victimized many unsuspecting, faithful viewers, causing them collectively to lose millions of dollars.
Further Reading
Albert, James A. Jim Bakker: Miscarriage of Justice? Peru, Ill.: Carus, 1988.
Hunter, James. Smile Pretty and Say Jesus: The Last Great Days of the PTL. Athens: University of Georgia Press, 1993.
Martz, Larry. Ministry of Greed: The Inside Story of the Televangelists and Their Holy Wars. New York: Weidenfeld & Nicolson, 1988.
Maryanne Barsotti
See also Falwell, Jerry; Heritage USA; Religion and spirituality in the United States; Robertson, Pat; Swaggart, Jimmy; Televangelism.
■ Ballet
Definition A classical, theatrical, narrative form of dance
The 1980's were characterized by a departure from classicism by ballet choreographers who redefined the dance form's language and took it beyond its established boundaries.
After Mikhail Baryshnikov finished working with George Balanchine in The Prodigal Son (pr. 1929)
and The Steadfast Tin Soldier (pr. 1975) in 1980, he returned to the American Ballet Theatre (ABT) as artistic director. He staged elaborate productions of the classics, including Giselle (pr. 1841) and Swan Lake (pr. 1877, rev. 1951), featuring the virtuoso Gelsey Kirkland, who had reached stardom at New York City Ballet. Her performance of the role of Kitri in Don Quixote (pr. 1869), in which she touched the back of her head with her foot in an amazing jeté kick, won for her celebrity and acclaim. Aside from his embrace of the classics, Baryshnikov's greatest contribution was to add eight Balanchine ballets and other works by avant-garde choreographers to the company's repertoire, expanding it in style and vision.
Death of a Master
A child extinguishing a candle as the music died in Balanchine's memorable Adagio Lamentoso from the Pathétique Symphony (1981) anticipated the mourning of the brilliant choreographer's end. With his death in 1983, New York City Ballet faced the arduous task of replacing the genius who had served as the company's ballet master, mentor, and voice. The task of keeping the company together and redirecting its future fell on partners Peter Martins and Jerome Robbins. Robbins assumed the responsibility of creating new ballets, while the company's management and direction fell to Martins.
Robbins was considered the most gifted American-born choreographer and had contributed to the company's repertoire for more than twenty years. During the 1980's, his creative versatility was recognized again in works inspired by jazz (Gershwin Concerto, 1982), by modern composers like Claude Debussy (Antique Epigraphs, 1984) and Aaron Copland (Quiet City, 1986), and by the minimalist music of Philip Glass (Glass Pieces, 1983) and Steve Reich (Eight Lines, 1985).
Martins spent the larger portion of the 1980's maintaining and promoting Balanchine's repertoire. At the end of the decade, however, his own creative works showed a clear departure from his master's style in such pieces as Ecstatic Orange (pr. 1987) and Echo (pr. 1989), both set to the music of Michael Torke. If Balanchine's specialty had been the exploration of movement in the anonymity of dancers, Robbins's movement invention lent itself to communication among dancers, and Martins, with Ecstatic Orange, achieved equality between male and female dancers.
On the West Coast, San Francisco Ballet, considered one of the most prestigious regional companies, underwent deep restructuring. Michael Smuin's greatest accomplishment as artistic director had been to get the company's full-length ballets televised on Dance in America and to redefine the boundaries of ballet by incorporating everything from leather jackets in the television special To the Beatles to live buffalo in Song for Dead Warriors. Criticized for having gone "too far," he was replaced in 1984 by Helgi Tomasson, whose efforts concentrated on reviving the classics, attracting a number of outstanding dancers from world-renowned companies, and transforming San Francisco Ballet into the leading ballet company of the western United States.
Departure from Classicism
Alonzo King, a choreographer with a unique vision who saw ballet as both a science and a language, appeared on the dance scene of San Francisco with his company Lines Ballet in 1982. King’s outstanding contribution to the field was his holistic approach to it. Where dance previously had been seen as imitation of external shapes and forms, King viewed it as self-discovery, bringing to the foreground ballet’s long-ignored spiritual aspect. His work can be said to be primitive and ritualistic, designed to discover the truth of dance and to dig hidden talents out of a dancer’s spirit. Smuin, after leaving San Francisco Ballet in 1984, dedicated his inventiveness, creativity, and theatrical vision of ballet to the creation of Smuin Ballet. Smuin’s work has been called “choreographic theater personified.” Combining his expertise in both theater and ballet, he managed to depart far enough from both genres to give birth to a unique style that differed completely from King’s style, rapidly winning him a name and reputation in San Francisco and among the leading choreographers of the decade. Les Ballets Trockadero de Montecarlo, a transvestite group featuring hairy-chested men on pointe wearing tutus, performing parodies of the work by major choreographers, including Balanchine, Robbins, Martha Graham, Agnes De Mille, Paul Taylor, and Kurt Jooss, among others, brought a humorous tone to the ballet scene. From the outrageous stage names of the company members to the mocking of technique and choreography, their spoofing of classical and modern dance was well received by audiences worldwide. Seen by many as a manifestation of
the gay liberation movement, its humor, inventiveness, and critical approach to an art form that until then had only been taken seriously gained both support and approval.
Ballet on Film
Two ballet films, White Nights (1985) and Dancers (1987), both starring Baryshnikov, helped raise interest in classical ballet. The latter was a melodrama involving ballet dancers who performed Giselle. (Baryshnikov had already danced Giselle on film for a 1977 television broadcast, aired as part of the Live from Lincoln Center series.) In White Nights, Baryshnikov played a Soviet ballet dancer who had defected to the United States and who found himself back in the Soviet Union when his airplane was forced to make an emergency landing there. The film's autobiographical elements combined with Baryshnikov's collaboration with tap dancer Gregory Hines to make White Nights an immediate success. The inclusion in the film of an outstanding staged performance of Roland Petit's Le Jeune Homme et la Mort (1946; the young man and death) raised its artistic merit considerably.
The Dark Side of Ballet
Acquired immunodeficiency syndrome (AIDS) took the lives of many dancers in major ballet companies such as the Joffrey Ballet and Les Ballets Trockadero de Montecarlo. Professional ballet dancers were accustomed to dealing with the fleeting nature of dance and the brief life of their performing careers, but AIDS now added another aspect to their transient profession. Similarly, the use of cocaine and other drugs backstage at ABT came to light in 1986, when Gelsey Kirkland published her book Dancing on My Grave.
Impact The 1980’s witnessed the introduction of external elements from modern dance and culture generally into classical ballet. Ballet was popularized through its serious if melodramatic portrayal in film, even as it was spoofed mercilessly by Les Ballets Trockadero de Montecarlo. The decade ended with a sense of the dance form’s greater possibilities, as the classical companies demonstrated that they could stage traditional ballets alongside more radical or irreverent departures from tradition. Further Reading
Garafola, Lynn. Legacies of Twentieth-Century Dance. Middletown, Conn.: Wesleyan University Press, 2005. An evaluation of the lasting contributions of twentieth-century dancers from the point of view of the early twenty-first century.
Garis, Robert. "Millennium: The Years of Peace." In Following Balanchine. New Haven, Conn.: Yale University Press, 1995. Covers the most relevant events preceding and following Balanchine's death.
Reynolds, Nancy, and Malcolm McCormick. "Ballet's High Tide." In No Fixed Points: Dance in the Twentieth Century. New Haven, Conn.: Yale University Press, 2003. Covers the major developments at ABT, New York City Ballet, and regional ballet companies during the 1980's.
Roseman, Janet Lynn. "Alonzo King." In Dance Masters: Interviews with Legends of Dance. New York: Routledge, 2001. Includes a discussion of King's work and a lengthy interview in which the choreographer expresses his dance philosophy.
Solway, Diane. A Dance Against Time. New York: Simon & Schuster, 1994. Provides a detailed and personal account of the impact AIDS had on members of the Joffrey Ballet.
Sylvia P. Baeza
See also Classical music; Dance, popular; Glass, Philip; Homosexuality and gay rights; Jazz; Music; Theater.
■ Baseball
Definition Professional team sport
Major League Baseball suffered a tarnished image during the 1980's, as a players' strike and a gambling scandal involving star player Pete Rose disillusioned fans. Nonetheless, by decade's end, attendance levels had recovered, and the fans had largely decided to continue watching professional games, albeit with a more jaded attitude than they had had at the beginning of the 1980's.
During the 1980's, Major League Baseball team owners and players enjoyed considerable economic prosperity. Record-setting attendance as well as lucrative media contracts favored owners, whereas increased salaries favored players. In the midst of these prosperous economic conditions, however, conflicts developed between labor and management. Player strikes and collusion on the part of owners came to the surface during the decade. In addition to labor confrontations, baseball's leadership had to deal with drug abuse among athletes and unethical conduct by one of its outstanding players, Pete Rose. Four individuals would serve as baseball commissioner during the decade, and on February 3, 1989, Bill White became the president of the National League, making him the highest-ranking African American sports executive ever.
Major League Baseball recorded several all-time performance records in the 1980's: Pete Rose surpassed Ty Cobb as the all-time hits leader, and two pitchers, Nolan Ryan and Steve Carlton, moved ahead of Walter Johnson to become all-time strikeout leaders. An earthquake on October 17, 1989, forced the cancellation of game three of the 1989 World Series between the San Francisco Giants and the Oakland Athletics. On August 8, 1988, Chicago's Wrigley Field hosted the first night game in its history. On April 3, 1985, the League Championship Series changed from a best-of-five-games competition to a best-of-seven-games competition. On July 13, 1982, the first All-Star Game outside the United States was played in Montreal's Olympic Stadium.
Labor and Management Relations
Free agency, which had become part of baseball’s landscape in the 1970’s, caused considerable disagreement between owners and players in the 1980’s. On three occasions during the decade, negotiations between the Major League Baseball Players Association and the owners of the major-league teams resulted in work stoppages by players. As the 1980 season was beginning to open, players and owners had not come to an agreement over labor issues. As a result, players conducted a walkout of spring training on April 1, 1980. The players agreed to open the season but warned that they would strike on May 23 if their demands were not met. On May 23, players and owners agreed on all but one issue, free agency. The issue of free agency was tabled until the 1981 season. Near the end of the spring of 1981, an agreement on free agency had yet to be reached. The owners wanted to receive compensation if they lost free-agent players to other teams. Specifically, they wanted to institute a rule stating that a team that lost a free agent would in return receive players from the middle of the roster of the team that acquired the free agent. The players found such mandatory trading unacceptable, because they believed it would make teams less willing to sign free agents, who would therefore command less money when they were signed.
On June 11, Marvin Miller—the executive director of the Major League Baseball Players Association, who led the negotiations—announced that the players would commence a work stoppage on June 12. Although there were previous player work stoppages, the 1981 strike represented the first time that players had walked out during the regular season. On July 31, 1981, the owners agreed to a resolution. The 1981 season resumed with the All-Star Game on August 9, and regular season games resumed on August 10. As a result of the strike, 713 games were canceled.
A second work stoppage by players during regular-season play occurred on August 6, 1985. The causes of the second strike included free agency, arbitration salary caps, and salary minimums. The strike ended on August 8, when the owners agreed to drop their demand for a salary cap. In addition, the minimum player salary was increased from $40,000 to $50,000.
Strikes were not limited to players. Major-league umpires struck for one week, between September 30 and October 7, 1984. As a result, the first game of the National League Championship Series was officiated by college umpires.
During the 1980's, a number of baseball's high-performance players, such as Jack Morris, Carlton Fisk, and Andre Dawson, tested the free-agent market. Surprisingly, no teams made an offer for these players. Miller asserted that owners were engaged in a conspiracy not to sign players who opted for free agency; he contended that this constituted unlawful collusion, and a grievance was filed. The baseball establishment of twenty-six owners and the commissioner of baseball were found guilty of engaging in collusion during a three-year period in the 1980's. The owners were required to pay $280 million in lost wages.
Baseball Commissioner Bart Giamatti announces his decision to ban Pete Rose from baseball in August, 1989. (AP/Wide World Photos)
Drugs and Gambling
Baseball was confronted with a public relations disaster in 1985, when twenty-one active players testified that they had used cocaine. In 1986, thirty-one more players were fined for drug use. On September 20, 1985, Curtis Strong, a Philadelphia caterer, was found guilty of eleven counts of selling cocaine to major-league players between 1980 and 1983. To demonstrate that baseball’s leadership would not tolerate drug abuse within its establishment, Commissioner Peter Ueberroth on June 18, 1985, announced that a mandatory drug-testing program would be instituted in July, 1985, for players and umpires in the minor leagues. In August of 1985, drug testing became required of all majorleague managers, coaches, trainers, and umpires as well. The players association voted against mandatory testing of players. They did, however, agree to drug testing as part of a labor contract. One of baseball’s most popular players in the 1980’s, Pete Rose began his baseball career with the Cincinnati Reds in 1963. When he retired as a player in 1986, he had become the all-time hits leader with 4,256 career hits and the all-time leader in at-bats with 14,043. He ranked second in career doubles with 746, and he finished ten seasons with two hundred or more hits in each. After playing for the Philadelphia Phillies for five seasons and one-half year with the Montreal Expos in 1984, he returned to the
Reds in the middle of the 1984 season to assume the role of player-coach. On September 11, 1985, Rose moved ahead of Ty Cobb to become the all-time hits leader. Although he retired as a player in 1986, he continued as manager of the Reds until 1989.
On April 1, 1989, Bart Giamatti began his duties as baseball's seventh commissioner. He was immediately confronted with allegations that Rose had gambled on baseball games, in violation of league rules. An attorney, John Dowd, was hired by Giamatti to investigate the involvement of Rose with gambling. Dowd's report concluded that there was evidence that Rose bet on baseball. According to the league's rules, gambling on baseball was grounds for a lifetime suspension from Major League Baseball. On August 23, 1989, Rose signed an agreement that permanently banned him from baseball. The agreement stipulated, however, that he would be eligible to petition for reinstatement after one year. On September 1, 1989, Giamatti died of a heart attack. On September 13, 1989, Fay Vincent was elected to serve the unfinished four and one-half years of Giamatti's commissionership. The agreement that Rose made with Giamatti resulted in his ineligibility to be inducted into the Baseball Hall of Fame.
Revenues and Salaries
While the decade was riddled with numerous baseball controversies, owners' revenues and players' salaries increased significantly. In 1980, Major League Baseball's annual attendance was 43 million. That attendance plummeted temporarily during the strike, but later in the decade, it recovered to reach an all-time high of 50 million. Baseball had become more competitive as more teams entered postseason play. During the period between 1981 and 1988, eleven different teams won divisional titles in the American League. In the National League, ten different teams won divisional races, and on October 3, 1981, the Montreal Expos clinched a playoff title, becoming the first team from Canada to do so.
Media contracts contributed significantly to increased revenue for baseball. On April 7, 1983, the American Broadcasting Company (ABC) and the National Broadcasting Company (NBC) agreed to pay Major League Baseball $1.2 billion for rights to broadcast the upcoming season. In 1988, gross revenues exceeded $1 billion. Each club received $7 million from television pacts. In January, 1989, a $1.1 billion Columbia Broadcasting System (CBS) television deal was made, along with a $400 million four-year contract with ESPN for the cable rights to 175 games per season and a $50 million four-year radio deal with CBS.
Fueled by these lucrative television contracts, player salaries increased significantly as well. In 1981, the average player salary was $185,000. In 1984, the average salary increased to $300,000, with thirty players making more than $1 million each. In 1989, the average player salary rose to $500,000, with twenty players earning at least $2 million a year.
Individual player contracts were very lucrative in the 1980's. On December 15, 1980, the New York Yankees signed Dave Winfield to a ten-year contract. The deal included incentives that could make his salary anywhere between $13 million and $25 million, making him the highest-paid athlete in the history of team sports to that time. On February 15, 1986, Fernando Valenzuela signed a three-year, $5.5 million contract. The contract stipulated that his annual salary would increase over the three years, and in 1988 that salary surpassed $2 million.
Offensive Performances
On the playing field, the advent of the running game resulted in unprecedented stolen-base records, as six players during the 1980’s recorded one hundred or more stolen bases in a season. On August 27, 1982, Rickey Henderson broke the single-season base-stealing record when he stole his 119th base; he went on to steal a total of 130 bases in the year. In 1987, the records for total home runs hit in each of the two major leagues were broken, when players hit 2,634 home runs in the American League and 1,824 in the National League. Widening of the strike zone, lack of quality pitching, and a “juiced” ball were reasons provided to explain this dramatic increase in home runs.
Pitching Performances
Pitching in the 1980's did not maintain the game-deciding dominance it had enjoyed in the previous two decades. The 1980's recorded a total of thirteen no-hitters, compared to thirty-four no-hitters in the 1960's and thirty-one in the 1970's. Several perfect games were recorded, however. On May 15, 1981, the Cleveland Indians' Len Barker pitched a perfect game against the Toronto Blue Jays. It was the first perfect game since 1968. In addition, Mike Witt in 1984 and Tom Browning in 1988 recorded perfect games. On September 26, 1981, Nolan Ryan of the Houston Astros pitched a no-hitter, becoming the first pitcher to pitch five no-hitters in his career. In 1989, Ryan at the age of forty-two became the first pitcher to record five thousand career strikeouts. During the 1980's, five pitchers reached three hundred career victories.
Impact The 1980's introduced regular-season work stoppages to professional baseball. Combined with the drug and gambling scandals of the decade, the players' strikes led fans to become disillusioned with the sport colloquially referred to as "America's pastime." However, the nation simultaneously maintained idealized and cynical attitudes toward the sport, as demonstrated by the many popular movies—The Natural (1984), Eight Men Out (1988), Bull Durham (1988), and Field of Dreams (1989)—that either romanticized the game, debunked its myths, or sought to do both at once.
Further Reading
Koppett, Leonard. Koppett's Concise History of Major League Baseball. Exp. ed. New York: Carroll & Graf, 2004. Provides an overview and explanation of significant events in baseball since the nineteenth century.
Miller, Marvin. A Whole Different Ball Game: The Inside Story of the Baseball Revolution. Chicago: Ivan R. Dee, 2004. Analyzes the changes that occurred in baseball as a result of player arbitration.
Reston, James. Collision at Home Plate: The Lives of Pete Rose and Bart Giamatti. Lincoln: University of Nebraska Press, 1997. Reviews the controversial decision to ban Pete Rose from baseball.
Solomon, Burt. The Baseball Timeline. New York: DK, 2001. Provides year-to-year accounts of baseball events beginning with 1845.
Thorn, John. Total Baseball: The Official Encyclopedia of Major League Baseball. 7th ed. Kingston, N.Y.: Total Sports, 2001. Includes extensive coverage on baseball statistics.
Tygiel, Jules. Past Time: Baseball as History. New York: Oxford University Press, 2001. Includes a chapter on baseball fantasies in the 1980's.
Voigt, David Q. American Baseball. 3 vols. University Park: Pennsylvania State University Press, 1983. In-depth three-volume history of baseball.
Ward, Geoffrey C., and Ken Burns. Baseball: An Illustrated History. New York: Alfred A. Knopf, 1994. Text companion to the twenty-one-hour video documentary on baseball by Ken Burns.
Alar Lipping
See also Baseball strike of 1981; Brett, George; Hershiser, Orel; Jackson, Bo; Rose, Pete; Ryan, Nolan; SkyDome; Sports; Ueberroth, Peter; Valenzuela, Fernando.
■ Baseball strike of 1981
The Event Fifty-day work stoppage by Major League Baseball players during the regular season
Date June 12 to July 31, 1981
The second strike in Major League Baseball's history resulted in the cancellation of more than one-third of the season. Resulting from a dispute over free-agent player movement and rising salaries, the strike temporarily alienated baseball fans, but attendance soon rebounded later in the decade.
Major League Baseball's Basic Agreement, a collective bargaining agreement between the team owners and the Major League Baseball Players Association, expired in 1980. Despite a one-year cooling-off period, the players' union, led by Marvin Miller, could not concur with the owners on the terms of a new agreement. Owners were determined to cap the players' escalating salaries and to limit free agency by requiring that any team that lost a free-agent player to another team receive significant compensation. Miller and the players argued that teams were still making profits despite their higher salaries and that compensation requirements would inhibit teams from bidding on free agents. When the owners threatened to institute their own interpretation of restricted free agency, the players walked out on June 12.
During the fifty-day strike, the players lost approximately $30 million in salaries but stayed relatively united. The average player forfeited about $50,000. Team owners had a $50 million strike insurance policy, but they still probably lost close to $70 million in revenue. With the imminent prospect of a canceled season, several owners and Commissioner Bowie Kuhn panicked and renewed negotiations. An agreement was announced July 31. Owners would receive minor compensation for lost free agents, but not enough to inhibit free agency significantly. The rise in salaries continued unabated.
Play resumed on August 9 with the All-Star Game in Cleveland, and the regular season began the following
day. Some 713 games had been canceled during the strike. Hoping to generate fan enthusiasm, owners devised a complicated playoff system for the 1981 season whereby the division leaders in each half of the season would play in a preliminary elimination round to determine who would advance to the league championships. Attendance and television ratings, however, fell significantly. Furthermore, the complicated system resulted in the two National League teams with the best overall records, the St. Louis Cardinals and the Cincinnati Reds, being excluded from the playoffs. Meanwhile, the Kansas City Royals won the second-half American League Western Division championship, despite having a losing record for the abbreviated season.
Impact There were no winners in the strike of 1981. Fans were disgusted with owners and players alike. The strike produced a deep-seated distrust between the players' union and ownership. Owners made Kuhn a scapegoat and refused to reelect him to another term as commissioner. The stage was thus set for even more costly battles over free agency and player salaries.
Further Reading
Burk, Robert F. Much More than a Game: Players, Owners, and American Baseball Since 1921. Chapel Hill: University of North Carolina Press, 2001.
Korr, Charles P. The End of Baseball as We Knew It: The Players Union, 1960-81. Urbana: University of Illinois Press, 2002.
Rader, Benjamin G. Baseball: A History of America's Game. 2d ed. Urbana: University of Illinois Press, 2002.
M. Philip Lucas
See also Baseball; Sports; Unions.
■ Basketball
Definition Team sport
The confluence of events that transpired in the National Basketball Association in the early 1980’s saved the league, allowed it to expand, elevated it to equal status with the National Football League and Major League Baseball, and made it an indelible aspect of the American psyche. The unprecedented success of basketball in the 1980’s can be traced to a singular event in the final
year of the 1970's. In the ultimate game of the 1979 National Collegiate Athletic Association (NCAA) tournament, two polar forces of the basketball universe collided to determine the last college basketball champion of the decade. The game, held on March 26 in Salt Lake City, Utah, marked the genesis of the rivalry between Michigan State University's sophomore guard Earvin "Magic" Johnson and Indiana State University's senior forward Larry Bird. Johnson's team prevailed 75-64. The two young stars entered the National Basketball Association (NBA) the following fall—Johnson with the Los Angeles Lakers and Bird with the Boston Celtics. They made immediate impacts on the league and came to symbolize the heightened interest in, and the cultural significance of, basketball in the United States.
The NBA Takes Off
The rivalry between Bird and Johnson and, in a larger context, between the historically dominant organizations that each player represented gave the NBA a marketable product and a sustainable plotline. The styles employed by the two players contrasted immensely. Johnson’s flashy style and up-tempo attitude were a natural fit for Hollywood, while Bird’s combination of a fundamentally sound and fluid style and gritty, blue-collar workmanship melded with his Bostonian surroundings. The trait that Johnson and Bird shared, however, was the deep, existential desire to be the best. “The first thing I would do every morning was look at the box score to see what Magic did. I didn’t care about anything else,” Bird admitted. The two players met three times in the NBA finals; the Lakers won two of the three series. Bird and Johnson both won three Most Valuable Player (MVP) awards and were selected nine times for the All-NBA first team. The NBA needed to capitalize on the heightened interest in the league that the two superstars engendered, and the emergence of cable television coupled with a visionary league commissioner helped basketball flourish like never before. In 1984, the NBA’s executive vice president David Stern became the new NBA commissioner when Larry O’Brien retired. Stern implemented a number of provisions that ensured the lasting success of the league. He encouraged corporate sponsorship of NBA franchises—highlighting the marketability of many of the young players in the league. Stern encouraged the dissemination of the NBA through cable television: In the previous season, the Entertainment and
Sports Programming Network (ESPN) started to broadcast games; Turner Broadcasting System (TBS) took over cable rights in 1984. Other media and technological developments occurred under Stern's tenure: In 1985, the league began archiving all televised games, and, at the end of the decade, deals were struck to broadcast games in Latin America and the United Kingdom. Stern helped renegotiate the league salary cap, implemented in 1983; established an extensive and seminal antidrug policy; and oversaw league expansion—five new teams were added during the decade.
Decade of Dynasties
The 1980's was the decade of dynasties, led by the Lakers, who appeared in eight out of ten finals and won five championships. The pinnacle of the Lakers' success came in 1985, when the team became the first visiting franchise to clinch a championship in the legendary Boston Garden. Led by six-time MVP and career points leader Kareem Abdul-Jabbar, the Lakers exorcized the ghosts of the past by finally closing out a series on the Celtics' home court. Furthermore, in 1988, the Lakers became the first team to repeat as champions—a feat that head coach Pat Riley guaranteed—since the 1969 Celtics.
If the Lakers were the decade's best team, the Celtics finished a close second. Anchored by Bird, Kevin McHale, and Robert Parish—considered the preeminent front court in basketball history—Boston won three championships. The 1985-1986 team ranks with the greatest Celtics teams of all time; the team record of sixty-seven wins and fifteen losses is second-best in franchise history.
The Detroit Pistons, one of the NBA's oldest teams, began the decade inauspiciously, finishing the 1979-1980 campaign 16-66. In 1981, the Pistons drafted Indiana University (IU) point guard Isiah Thomas, and the slow march to the top began. Near the end of the 1980's, the Pistons were legitimate title contenders. After losing a seven-game series to the Celtics in the 1987 Eastern Conference finals and the Lakers in the 1988 NBA finals, the Pistons swept the Lakers in four games in 1989 to capture their first championship. The team repeated as champions the following season.
Other NBA franchises staked a claim to the moniker of dynasty, most notably the Philadelphia 76ers. Julius "Dr. J." Erving had won two American Basketball Association (ABA) championships in the 1970's with the New York Nets. However, he struggled to
bring a championship to his NBA team, the 76ers. Philadelphia reached the NBA finals in 1980 and 1982, losing both times to the Lakers. For the 1982-1983 season, the 76ers added Moses Malone, another ABA holdover and two-time NBA MVP. Malone's extra bulk in the middle of the floor helped the team match up with Lakers' center Abdul-Jabbar and proved to be the missing piece in the championship puzzle. Philadelphia was dominant in the 1983 playoffs and swept the defending-champion Lakers in four games.
The 1984 Draft
In 1985, the NBA implemented the draft lottery; thus, finishing last in the league did not guarantee the first pick in the draft. The 1984 draft is viewed as the finest collection of players to make the transition from college to professional basketball.
Los Angeles Laker Magic Johnson seeks to pass over the head of Boston Celtic Larry Bird. The Johnson-Bird rivalry helped draw new fans to the sport and revitalized the NBA. (AP/Wide World Photos)
It is also the year that the Portland Trail Blazers, who had the second pick in the draft, made perhaps the gravest mistake in the history of sports franchises. In need of a center, Portland selected Kentucky's Sam Bowie, bypassing Michael Jordan, who was quickly snatched by the Chicago Bulls with the third pick in the draft. Though Bowie had a respectable career, Jordan became not only the greatest player of his generation but also arguably the most transcendent player in sports history. The Houston Rockets, after suffering through two consecutive miserable seasons and accusations of self-sabotage, selected Hakeem Olajuwon with the first pick in the draft, altering the course of the franchise for the next two decades. Olajuwon eventually led the Rockets to back-to-back NBA championships in the mid-1990's. Other notable players such as Charles Barkley, John Stockton, Kevin Willis, Otis Thorpe, Jerome Kersey, and Jordan's teammate at the University of North Carolina (UNC), Sam Perkins, were selected in the 1984 draft. The superstars drafted in 1984 ensured the lasting prosperity of the league both on the court—where high levels of athleticism were on display every night—and off the court—through product endorsement and worldwide recognition.
Men's College Basketball
The expansion of the NCAA tournament from the exclusivity of the early years to forty teams in 1979, forty-eight teams in 1980, fifty-two teams in 1983, and finally sixty-four teams in 1985 was, on one hand, decried as an affront to the elite teams and, on the other, hailed as a reflection of the increasing parity in intercollegiate athletics. The NCAA’s foresight to expand the tournament to include not only conference champions but also sometimes fourth- and fifth-place finishers piqued the interest of a large portion of the American population and turned the event into a lucrative and highly anticipated spring ritual. In most cases, higher-ranked teams advanced to appropriate levels in the tournament. The 1982 championship between UNC and Georgetown featured numerous future NBA stars, including Jordan, James Worthy, Perkins, Patrick Ewing, and Eric “Sleepy” Floyd. Both teams entered the tournament as favorites, seeded first in their respective region. The game climaxed when Jordan connected on a sixteen-foot jump shot that gave UNC a 63-62 lead and the championship. The season was the first year
that CBS broadcast the tournament, and the final was credited by broadcaster Curt Gowdy as elevating the event to the status of the Super Bowl.
For fans, the most tantalizing aspect of the tournament's expanded format was the possibility of lower-ranked teams upsetting favorites. In 1983, coach Jim Valvano led North Carolina State—with ten losses on its résumé—into the final game to face the heralded University of Houston squad, dubbed "Phi Slama Jama" for its propensity to slam-dunk, and anchored by Olajuwon and Clyde Drexler. In the minds of most, Valvano had taken his "Cinderella" team as far as they could go; but in the waning seconds of the game, Lorenzo Charles corralled an errant long-distance shot attempt and, ironically, slammed home the game-winner for North Carolina State. What followed was perhaps the most iconic image in tournament history: Valvano sprinted wildly down the court, his arms and hands flailing in celebration.
The tournament saw other momentous upsets in the 1980's, such as Villanova's defeat of defending-champion Georgetown in 1985 and the University of Kansas's victory over the brash and confident University of Oklahoma in 1988. In both instances, the favorite had defeated its underdog opponent twice during regular-season conference play, only to lose the third meeting when it mattered most.
Intercollegiate rule changes, specifically Proposition 48, which defined academic guidelines for sports scholarships, forced some players to hone their academic and athletic skills at the junior-college level. In 1987, two such players, Dean Garrett and Keith Smart, helped IU capture its fifth championship; three of the five belonged to coach Bobby Knight. The proposition was construed by some as specifically targeting African American players.
Buoyed by the success of Johnson and Bird, former NCAA foes, the NBA continued to view the NCAA as a breeding ground for potential professional stars. However, the excitement produced by the expansion of the tournament and the presence of colorful coaches and highly skilled players made the NCAA tournament an event worth watching in its own right.
The Women's Game Emerges
Title IX of Congress's 1972 education amendments forbade the exclusion of opportunity solely based on gender. Though Title IX did not specify athletics as included under its equal
rights legislation, universities took it to mean just that and adjusted scholarship budgets to accommodate the burgeoning women's athletic scene. Women's basketball programs saw an initial upswing in funding and development, only to see it halt with the Supreme Court's 1984 ruling that Title IX did not apply to athletics and, therefore, that universities were not required to fund women's athletics in the same manner as men's. Finally, in 1988, Congress passed the Civil Rights Restoration Act, which reinstated Title IX and marked the start of unprecedented funding and appreciation for the women's game.
Another change within women's college basketball arose because of the NCAA's increased desire to administrate women's college programs. Previously, women's college basketball was controlled by the Association of Intercollegiate Athletics for Women (AIAW). In 1982, the NCAA announced the implementation of an alternate tournament to the annual championship presided over by the AIAW. Universities now had to decide which tournament to join. Though the AIAW had been integral in the promotion and facilitation of the women's game, the NCAA had superior resources. Affiliation with the NCAA was construed by some as an act of betrayal against the AIAW, but, as Pat Summitt—the legendary women's basketball coach at the University of Tennessee—put it, "I knew realistically that the only way the sport could grow . . . was under the umbrella of the NCAA." By the summer of 1982, the AIAW was defunct.
The most dominant women's collegiate basketball force of the 1980's was the University of Southern California Trojans. Though not known as a traditional powerhouse, USC landed twin sisters Pamela and Paula McGee, Los Angeles's own Cynthia Cooper, and, most important, Cheryl Miller in key early-decade recruiting. The team won back-to-back championships in 1983 and 1984 and often drew comparisons to the hometown Lakers for its style and dominance. Miller, nicknamed "Silk" for her smooth shooting, is considered the greatest women's basketball player ever.
Olympics: Waiting for a "Dream Team"
The 1980's was the last decade in which amateur players would suit up for the U.S. Olympic team. In 1989, the International Basketball Federation (FIBA) voted to allow professionals to participate in international
competition. With one act, the United States—with its stable of homegrown NBA talent—became the force to be feared on the international scene. However, at the beginning of the decade, this was not the predominant concern of the U.S. Olympic team. The American team hoped to avenge a controversial loss to the Soviet Union in the 1972 Games. Because of the American boycott of the 1980 Summer Olympics and the subsequent Soviet Bloc boycott of the 1984 Summer Games, the two teams did not meet until 1988, in Seoul, South Korea. The Soviet Union dominated the semifinal match and advanced to beat Yugoslavia—the 1980 gold medalist—to win the gold medal.
Impact In the 1980's, basketball experienced a surge in popularity and prosperity and became a prominent entity in the marketplace and the entertainment-driven society of the subsequent decades. The success of basketball during this time was fueled by players who exhibited superior athleticism and made-for-the-media personalities. The popularity of the college game rivaled the NBA's, while the women's game developed into its own entity—epitomized by the fact that the University of Texas averaged more than seventy-five hundred fans per game by decade's end. The innovations of the era's players, coaches, and executives laid the foundation for what basketball became in the 1990's—a global phenomenon and a financial institution.
Further Reading
Bondy, Filip. Tip-Off: How the 1984 NBA Draft Changed Basketball Forever. Cambridge, Mass.: Da Capo Press, 2007. An extensive analysis of the events leading up to, and the repercussions of, the 1984 draft. Discusses in detail the stories of the drafted players and the motivations of the coaches and executives who helped shape the future of the league.
Giglio, Joe. Great Teams in Pro Basketball History. Chicago: Raintree, 2006. Written for an adolescent audience, this book gives brief overviews of some of the best teams in the history of the NBA. Includes short chapters on four teams from the 1980's: the Boston Celtics, the Detroit Pistons, the Los Angeles Lakers, and the Philadelphia 76ers.
Grundy, Pamela, and Susan Shackelford. Shattering the Glass: The Remarkable History of Women's Basketball. New York: New Press, 2005. A historical overview of the evolution of the women's game, charting the progress of women's participation in basketball from its onset in the late nineteenth century, through the struggle to gain equal status with the men's game in the 1970's and 1980's, to the success of the Women's National Basketball Association (WNBA) in the late 1990's and the early twenty-first century.
Hareas, John. NBA's Greatest. New York: Dorling Kindersley, 2003. Full of colorful photographs and written analysis of the league's top players, coaches, teams, and games. Of particular interest are descriptions of defining moments of the 1980's individual superstars.
_______. Ultimate Basketball. New York: Dorling Kindersley, 2004. A decade-by-decade look at the evolution of the modern game. The book highlights standout players and teams and discusses the pervasive nature of the modern game as it spread its influence globally.
Neft, David, and Richard M. Cohen. The Sports Encyclopedia: Pro Basketball. 2d ed. New York: St. Martin's Press, 1990. Looking at each individual season, the encyclopedia notes league leaders, outstanding moments, and trends within the game. A good overview of the NBA.
Packer, Billy, and Roland Lazenby. College Basketball's Twenty-five Greatest Teams. St. Louis: Sporting News, 1989. Chosen by computer rankings, media opinion, and performance on the court, the teams included in this book are representative of some of the finest seasons in college basketball. Several teams from the 1980's are featured: Georgetown, Houston, Louisville, North Carolina, and Oklahoma.
Shouler, Kenneth. Total Basketball: The Ultimate Basketball Encyclopedia. Wilmington, Del.: Sport Classics, 2003. This massive text is a comprehensive study of basketball and a reliable source for any conceivable facet of the game's history. Includes essays, statistical analysis, explanations on equipment and rules, and analysis of trends on the court. A primary source for any basketball research.
Christopher Rager
See also
Bird, Larry; Johnson, Magic; Olympic Games of 1988; Sports.
■ Basquiat, Jean-Michel
Identification American neoexpressionist artist
Born December 22, 1960; Brooklyn, New York
Died August 12, 1988; New York, New York
Jean-Michel Basquiat, who started his career as a graffiti spray painter in New York City, became one of the most influential neoexpressionist artists of the 1980's. His African American and Hispanic background was incorporated into his raw imagery of urban life.
After dropping out of high school and leaving his middle-class Brooklyn neighborhood, Jean-Michel Basquiat took to the streets, spraying graffiti on buildings in lower Manhattan. He combined his images with cryptic phrases and signed his work with the pseudonym "SAMO." Career-oriented and ambitious, Basquiat transferred his work to canvas. His paintings were well received in several exhibitions in the early 1980's, but it was the review of his work in the December, 1981, issue of Artforum that catapulted the young artist to fame.
Basquiat's unfiltered street energy was retained in his roughly scribbled paintings. He combined abstract and figurative styles. He continued to activate his images with crudely printed words, phrases, and ambiguous social comments. Basquiat combined paint, crayon, and collage on large, unprimed canvases. With his bold colors and compositions, he also incorporated gestural slashes and abstract symbols such as grids, crowns, arrows, and rockets.
As an outsider trying to work within the predominantly white commercial-gallery system, Basquiat must be examined in the context of his multicultural heritage. His work was filled with references to African, African American, and Latino culture. The human figure, especially the black male, was a major subject of his painting, whose prominent themes included alienation, discrimination, intolerance, and violence. Basquiat took his sources from the disparate social worlds in which he lived. With his success, he befriended Andy Warhol and shared celebrity status in the New York City club scene. His work incorporated images from popular culture—such as sports figures, cartoon characters, and symbols of wealth and money—as well as references to Western art history. His subjects also included New York City street life, drugs, and AIDS.
The Eighties in America
Basquiat, Jean-Michel
■
99
Jean-Michel Basquiat, right, poses with Andy Warhol in front of paintings on which the two artists collaborated at a gallery exhibition in SoHo in September, 1985. (AP/Wide World Photos)
figures, body parts, and symbols of death. Basquiat was only twenty-seven when his promising career ended with a drug overdose. During his last seven years, he produced more than one thousand paintings and two thousand drawings, which have inspired varying interpretation and debate. Impact
Basquiat was one of the most successful and controversial artists of the decade. He achieved international fame with solo exhibitions throughout the world. With his rough, energetic, figurative style, he came to be identified with the loosely knit group known as the neoexpressionists. Borrowing from a variety of sources, he combined multiple cultural identities and aesthetic traditions. His subjects, although not new, were treated with a candor that captured important aspects of his multicultural urban environment. His distinctive style incorporated both primitivism and sophistication. Although his work appeared simplistic, it created a powerful reflection of life in the 1980's.
Further Reading
Chiappini, Rudy, ed. Jean-Michel Basquiat. Milan, Italy: Skira Press, 2005.
Emmerling, Leonhard. Jean-Michel Basquiat, 1960-1988. Cologne, Germany: Taschen Press, 2003.
Pearlman, Alison. Unpackaging Art of the 1980's. Chicago: University of Chicago Press, 2003.
Cassandra Lee Tellier
See also African Americans; Art movements; Neoexpressionism in painting; Racial discrimination; Schnabel, Julian; Slang and slogans.
■ Beattie, Ann
Identification American novelist and short-story writer
Born September 8, 1947; Washington, D.C.
During the 1980's, Ann Beattie continued to write fiction that chronicled and examined the baby-boom generation as it developed greater affluence and cultural influence in the United States.
Ann Beattie. (Sigrid Estrada)
Originally known as the voice of the Woodstock generation of the 1960's and 1970's, Ann Beattie produced work in the 1980's that continued to track this generation as it aged and became known by a new name: "yuppies." As members of the baby-boom generation began to marry, have children, and—in some cases—become affluent, they became identified with this stereotype of the young urban (or upwardly mobile) professional. Beattie suggested that as the generation matured, its initial desire for personal liberty and gratification—associated with the youthful counterculture of the 1960's—had developed into a coherent social and moral perspective. Although Beattie's depiction of the narcissistic sense of entitlement of a generation that began to age into power and status in the 1980's was often satiric, she also suggested that this generation's determination to preserve its juvenility had resulted in an emptiness that was as sad as it was amusing. The libertarian principles that came to be a marker of this generation also ran the risk of failing to sustain both family and community; the issue of children became a particular puzzle for Beattie's baby boomers, who chose to base their lives on the premise that the mandates of the untrammeled self must necessarily displace the needs of the weak and the vulnerable.
In addition to tracking the ups and downs of the baby-boom generation in the 1980's, Beattie is identified with that decade's development of minimalism, a style of writing short fiction that she is said to have pioneered. This sober and understated style of writing returned fiction to a realism that had been upended by the more romantic and experimental works of fiction celebrated in the 1960's, and it was meant to be a correction of those works' excesses and falsifications. Beattie was also identified as a leader in what was known as the "short-story renaissance" of the 1980's, a resurgence of interest in short fiction that was a consequence of minimalism's commercial and critical success. Her greatest collection of short stories, The Burning House, was published in 1982; another collection, Where You'll Find Me, and Other Stories, was published in 1986. In addition to her short stories, Beattie developed a greater interest in the novel form, publishing Falling in Place (1980), Love Always (1985), and Picturing Will (1989), all of which featured the problematic personal lives of former hippies newly reinvented as yuppies.
Impact
Beattie’s stories and novels supplied a knowing, topical commentary on what was happening to baby boomers as they moved through the 1980’s. Her realistic fiction earned her a reputation as a major voice of her generation, both as its social historian and as a social critic.
Further Reading
McCaffery, Larry, and Sinda Gregory, eds. Alive and Writing: Interviews with American Authors of the 1980's. Champaign: University of Illinois Press, 1987.
Montresor, Jaye Berman, ed. The Critical Response to Ann Beattie. Westport, Conn.: Greenwood Press, 1993.
Margaret Boe Birns
See also Big Chill, The; Literature in the United States; Minimalist literature; thirtysomething; Yuppies.
■ Beirut bombings
The Event Terrorist bombings of the U.S. embassy in Beirut and the U.S. Marine compound at Beirut International Airport
Date April 18, 1983, and October 23, 1983
Place Beirut, Lebanon
The Beirut bombings resulted in the deaths of hundreds of Americans and Frenchmen and precipitated the withdrawal from Lebanon of U.S. military forces sent to promote stability in war-torn Beirut during the Lebanese Civil War.
The Lebanese Civil War (1975-1990) entered a new phase in the summer of 1982, when the Multinational Force in Lebanon (MNF), a peacekeeping force including U.S. Marines, French paratroopers, and Italian soldiers, deployed in the country. The Marine contingent entered Lebanon in August; its immediate mission was to oversee the evacuation of the Palestine Liberation Organization (PLO) from Beirut.
Wartime Chaos
Beirut, the capital of Lebanon, had become a combat zone where several factions were competing for control of the city. Fighting in Beirut had erupted in 1975, when Yasir Arafat's Fatah, the leading faction within the PLO, joined other armed factions opposed to the Lebanese government. Syrian military forces intervened at the request of the Lebanese government in 1976 without resolving the crisis. Israeli forces invaded Lebanon in 1982 to expel the PLO from its bases of operation inside Lebanon and then drove into Beirut, occupying positions on the west side of the city.
Reacting to the situation, President Ronald Reagan sent special envoy Philip C. Habib to arrange a settlement. In August, 1982, Habib succeeded in bringing about an agreement for the evacuation of PLO fighters from Beirut. The Habib Agreement also called for the deployment of a three-nation force in the city during the period of the evacuation. The Marines stayed in Beirut for only a short while during the withdrawal, departing on September 10, 1982. However, only days later, the Lebanese president-elect, Bashir Gemayel, was assassinated. In the resulting chaos, Israeli forces moved into West Beirut, and the Marines were recommitted to Beirut. In the succeeding weeks and months, the Americans began to ally themselves with the government of Lebanon. Anti-government factions, with
the support of Syria, actively began to harass American forces, engaging them with sniper fire and occasional artillery fire.
The Embassy Bombing
On April 18, 1983, a bomb was detonated at the U.S. embassy in Beirut. The blast destroyed the front portion of the seven-story building, killing sixty-three occupants, seventeen of whom were Americans. The bombing was carried out by a terrorist driving a van carrying a load of explosives. Multinational negotiations in May of 1983 resulted in an agreement for the withdrawal of Israeli military forces simultaneous with the withdrawal of Syrian military forces. However, as the Israeli withdrawal from Beirut began, fighting between local militias resurged, and attacks against American forces worsened.
A small U.S. flag and the Marine Corps flag fly above the ruins of the U.S. Marine compound at Beirut International Airport. This is the gate through which the bomb-laden terrorist truck passed before exploding on October 23, 1983. (AP/Wide World Photos)
The Barracks Bombing
The First Battalion, Eighth Marines, under the U.S. Second Marine Division, had established its headquarters at the Beirut International Airport. In the early morning of October 23, 1983, a truck driver drove into the compound and detonated his load of explosives in a suicide bombing. The American death toll from the explosion was 241 servicemen: 220 Marines, 18 Navy personnel, and 3 Army soldiers. Sixty Americans were injured. Rescue efforts at the U.S. compound continued for days. Rescuers were harassed at times by sniper fire, and some survivors were pulled from the rubble and airlifted to hospitals for treatment.
It remains uncertain who was responsible for the bombing. Several radical Shiite militant groups claimed responsibility for the attacks. In May, 2003, in a case brought by the families of the servicemen who were killed in Beirut, U.S. District Court Judge Royce C. Lamberth declared that the Islamic Republic of Iran was responsible for the 1983 attack. Lamberth found that there was sufficient evidence to conclude that Hezbollah, an organization formed with the assistance of the Iranian government, had conducted the bombing operations.
President Reagan called the attack a "despicable act" and remained firm in his commitment to keep a military force in Lebanon. On October 27, 1983, President Reagan made a televised address to the nation. He declared that the military presence in Lebanon was important to the United States, because "peace in the Middle East is of vital concern to our nation" and "the area is key to the economic and political life of the West." He also stated that U.S. involvement was "a moral obligation to assure the continued existence of Israel as a nation."
Impact
Following the barracks bombing, the Marines were redeployed offshore, where they could not be targeted by terrorist bombing attacks. Unable to sustain the resolve he had expressed months before, on February 7, 1984, President Reagan ordered the Marines to begin withdrawing from Lebanon. On February 26, 1984, the last Marines left Beirut. In despair over the departure of U.S. military forces from Beirut, the Lebanese Army collapsed in February of 1984, with many soldiers deserting to join militias. By April, the rest of the multinational force had also withdrawn from Beirut. The city remained in a state of civil war. Israel did not begin the withdrawal of its military forces until January of 1985. By June of 1985, Israeli forces had withdrawn from Lebanon completely, except for a security zone they continued to occupy in southern Lebanon to protect the northern territories of Israel.
Along with the U.S. embassy bombing, the barracks bombing prompted the U.S. Department of State to review the security of U.S. facilities overseas. The results of this review were published as the Inman Report.
Further Reading
Frank, Benis M. U.S. Marines in Lebanon, 1982-1984. Washington, D.C.: History and Museums Division, Headquarters, U.S. Marine Corps, U.S. G.P.O., 1987. The official account of the U.S. Marines in Lebanon.
McWhirter, James A. A Self-Inflicted Wound: The U.S. in Lebanon 1982-1984. Carlisle Barracks, Pa.: U.S. Army War College, 1989. Critical analysis of U.S. foreign policy in Lebanon and the reaction to the Beirut bombings.
United States. Congress. House. Committee on Foreign Affairs. Subcommittee on Europe and the Middle East. The U.S. Embassy Bombing in Beirut: Hearing Before the Committee on Foreign Affairs and Its Subcommittees on International Operations and on Europe and the Middle East of the House of Representatives, Ninety-eighth Congress, First Session, June 28, 1983. Washington, D.C.: U.S. G.P.O., 1983. The official record of the congressional hearing inquiring into the embassy bombing of April, 1983.
Michael E. Manaton
See also
Foreign policy of the United States; Middle East and North America; Terrorism.
■ Beloved
Identification Pulitzer Prize-winning novel
Author Toni Morrison (b. 1931)
Date Published in 1987
Beloved confirmed Toni Morrison's position as a major American author and, at the same time, furthered American sensitivity to issues of race and the legacy of slavery in the lives of black and white people alike.
Toni Morrison. (Stephen Chernin/Reuters/Landov)
With four novels already under her belt, Toni Morrison was recognized as an important novelist in the mid-1980's, but Beloved (1987) both confirmed and extended her reputation. The Civil Rights movement of the previous decades had begun to teach white America the importance of looking seriously at the lives of African Americans, and 1977's widely watched television miniseries Roots had made Americans look at slavery with fresh eyes. Thus, the nation seemed ready for the lessons of Beloved and for Morrison's portrayal of a group of former slaves just after the Civil War trying to establish new lives and a community in Cincinnati.
The novel's central character is the former slave Sethe; its title character is a mysterious woman who joins Sethe's household and who may be the ghost of the infant daughter whom Sethe murdered in a desperate gesture to keep the child from being taken back into slavery. The power of the past to hold the present hostage thus becomes a central theme of the novel. Just as Sethe and her living daughter, Denver, are trapped by Beloved, other characters are also burdened with the events of their slave-lives, events that are brutally portrayed in a number of graphic flashbacks. Among its other themes, the novel suggests the power of women to forge a family. Beloved is a complex work, incorporating images linked to the whole history of slavery while it tells the story of a small community of slavery's survivors.
Beloved received immediate critical praise and was nominated for both the National Book Award and the National Book Critics Circle Award in 1987. When it won neither prize, forty-eight African American writers were moved to write a letter of protest to The New York Times. The book won the Pulitzer Prize for Fiction in 1988, making Morrison the second African American woman to be so honored, but the controversy left bad feelings in the literary community. In 1998, Beloved was made into a film starring Oprah Winfrey. While championed by some critics, the film failed at the box office.
Impact
Beloved was soon widely recognized, especially within an American academy that had begun to embrace multiculturalism, as one of the most important American novels of the decade, if not the century. The novel spoke to the core of American identity in a new way, deepening the picture of slavery in the minds of both black and white Americans: Among its other themes, it invited readers to consider why a loving mother would be willing to murder her child rather than see bounty hunters carry the child back to the plantation from which the mother herself had escaped. It also invited readers to consider the power of memory, both for the characters and for African Americans of later generations, for whom slavery was as powerful a memory as it was for Sethe.
Further Reading
Eckstein, Lars. "A Love Supreme: Jazzthetic Strategies in Toni Morrison's Beloved." African American Review 40, no. 2 (Summer 2006): 271-284.
Franco, Dean. "What We Talk About When We Talk About Beloved." Modern Fiction Studies 52, no. 2 (Summer 2006): 415-440.
"Novel Suggestions." The Christian Century 123, no. 12 (June 13, 2006): 7.
Weinstein, Philip M. What Else but Love? The Ordeal of Race in Faulkner and Morrison. New York: Columbia University Press, 1996.
Ann D. Garbett
See also African Americans; Feminism; Literature in the United States; Multiculturalism in education; Racial discrimination; Winfrey, Oprah.
■ Bennett, William
Identification Chair of the NEH, 1981-1985; U.S. secretary of education, 1985-1988; and director of national drug control policy, 1989-1991
Born July 31, 1943; Brooklyn, New York
William Bennett was a major figure in the conservative backlash to the multicultural movement in education and an advocate of what he believed were traditional American values in the arts and humanities. He also served as the first cabinet-level drug czar, decisively shaping the administrative structure of the U.S. Office of National Drug Control Policy.
William Bennett, an outspoken conservative, served as chair of the National Endowment for the Humanities (NEH) from 1981 to 1985. As chair, he denied funding to programs that criticized America, called for the abolition of the NEH, and reduced the agency's budget by 14 percent. His To Reclaim a Legacy: A Report on the Humanities in Higher Education challenged the educational trend toward diversity and multiculturalism. He believed that American schools should not be critical of the United States. President Ronald Reagan, who shared Bennett's ideals, appointed him secretary of education, allowing him to focus on education reform. Although traditionally considered the least important cabinet department, the Department of Education gained more influence and visibility under Bennett's control from 1985 to 1988. Bennett was strongly opposed to federally guaranteed student loans, blaming them for the rise in college costs and student irresponsibility. He thought that colleges failed to educate students in citizenship, and he criticized Stanford University for including courses on non-Western cultures. He was a strong advocate for school vouchers, which allow parents to pay for private and religious schools with taxpayers' money. Bennett tried to end the Bilingual Education Act of 1968, a law designed to help students who spoke little English, and he hoped to implement a standard national curriculum. Bennett also wanted to end tenure, tie teachers' salaries to student performance, and implement national teacher competency tests.
In 1989, President George H. W. Bush appointed Bennett to be the first director of the Office of National Drug Control Policy, a position he held until 1991. He became the director (popularly known as the drug czar) just as crack cocaine began to devastate America's inner cities. Washington, D.C., was particularly hard-hit, experiencing escalating drug use and a murder rate seven times the national average. AIDS was also rising among drug users. Bennett, responding to these threats, believed that all users, from first-time offenders to addicts, belonged in prison. He allocated billions of dollars for prisons and enforcement but almost nothing for education or treatment. Bennett championed mandatory sentencing for drug offenses, which helped cause the U.S. prison population to soar. Part of his strategy included denying drug users federally subsidized housing and seizing their property. He resigned as drug czar after nineteen months, having failed, many believe, to win the so-called war on drugs.
Impact
Bennett, a controversial figure, left a lasting impact on American domestic policy in education and drug enforcement. His detractors think that his desire to return to his version of a values-based curriculum was misguided and discriminatory, and they accuse Bennett of pushing a right-wing, racist agenda based in a particular interpretation of Christian values. They perceive his policies as antimulticultural and politically incorrect. His supporters believe that he expressed a strong respect for the country and its founding values. Since retiring from public service, Bennett has written many books, served in conservative organizations, and hosted a popular radio talk show.
Further Reading
Bennett, William J. Book of Virtues: A Treasury of Great Moral Stories. New York: Simon & Schuster, 1993.
_______. From the Age of Discovery to a World at War. Vol. 1 in America: The Last Best Hope. Nashville, Tenn.: Nelson Current, 2006.
Katz, Jon. Virtuous Reality: How America Surrendered Discussion of Moral Values to Opportunists, Nitwits, and Blockheads Like William Bennett. New York: Random House, 1997.
Leslie Neilan
See also AIDS epidemic; Bush, George H. W.; Closing of the American Mind, The; Conservatism in U.S. politics; Crack epidemic; Drug Abuse Resistance Education (D.A.R.E.); Education in the United States; Just Say No campaign; Multiculturalism in education; National Education Summit of 1989; Political correctness; Reagan, Ronald; School vouchers debate; Standards and accountability in education.
■ Bentsen, Lloyd
Identification U.S. senator from Texas from 1971 to 1993 and Democratic vice presidential candidate in 1988
Born February 11, 1921; Mission, Texas
Died May 23, 2006; Houston, Texas
Bentsen was a longtime figure in American politics. As the 1988 vice presidential candidate for the Democratic Party, he provided one of the few moments of real interest in a generally lackluster campaign.
Lloyd Bentsen was born to a prosperous family in Mission, Texas. He attended the University of Texas, where he earned a degree in law. When World War II began, Bentsen enlisted in the U.S. Army. He was later commissioned an officer in the Army Air Corps, rising to the rank of colonel. During the war, Bentsen flew B-24 Liberators and was heavily decorated. After the war, Bentsen embarked on a new career. Like many in his generation, the World War II veteran turned to politics.
In 1946, Bentsen won his first election, becoming a judge in Hidalgo County, Texas. Two years later, he was elected to the House of Representatives. A protégé of powerful Speaker Sam Rayburn, Bentsen was twenty-seven years old and the youngest member of Congress. After six years in the House of Representatives, he left politics and began a career as a successful businessman. He returned to Congress in 1970, winning a seat in the U.S. Senate. Despite his political affiliation, Bentsen was known as an ally of business interests and a fiscal and social conservative. He served as a member of the Senate Committee on Finance, eventually becoming its chair in 1987.
Senator Bentsen was little known outside of Congress when he was chosen by Massachusetts governor Michael Dukakis as his running mate in the 1988 presidential campaign. The handsome, white-haired Texan conveyed a sense of wisdom and experience to voters, and his conservatism was meant to balance Dukakis's reputation as a New England liberal. Bentsen gained notoriety during his vice presidential debate against Republican nominee Dan Quayle. When the forty-one-year-old Quayle declared that he had as much experience as John F. Kennedy had had when Kennedy became president, Bentsen delivered a spontaneous reply that rocked the campaign: "Senator, I served with Jack Kennedy. I knew Jack Kennedy. Jack Kennedy was a friend of mine. Senator, you're no Jack Kennedy." Despite Bentsen's impressive résumé and telegenic qualities, the Dukakis-Bentsen ticket lost the election to George H. W. Bush and Quayle. Ironically, Bentsen had beaten Bush in the 1970 senatorial race. Following the 1988 campaign, Bentsen returned to the Senate and resumed his legislative career.
"You're No Jack Kennedy"
Excerpt from the Bentsen-Quayle vice presidential debate, held on October 5, 1988:
Dan Quayle: I have far more experience than many others that sought the office of vice president of this country. I have as much experience in the Congress as Jack Kennedy did when he sought the presidency. I will be prepared to deal with the people in the Bush administration, if that unfortunate event would ever occur.
Judy Woodruff (moderator): Senator Bentsen.
Lloyd Bentsen: Senator, I served with Jack Kennedy. I knew Jack Kennedy. Jack Kennedy was a friend of mine. Senator, you're no Jack Kennedy.
Dan Quayle: That was really uncalled for, Senator.
Impact
Despite his twenty-two years in the Senate, Bentsen made no truly significant impact on national legislation. In 1990, however, he helped negotiate a budget deal between Congress and President Bush, resulting in a tax increase.
Further Reading
Goldman, Peter, and Tom Mathews. The Quest for the Presidency, 1988. New York: Simon & Schuster, 1989.
Polsby, Nelson W., and Aaron Wildavsky. Presidential Elections: Contemporary Strategies of American Electoral Politics. New York: Free Press, 1991.
Rhonda L. Smith
See also Congress, U.S.; Dukakis, Michael; Elections in the United States, 1988; Quayle, Dan.
■ Berg, Alan
Identification Jewish radio talk-show host in Denver assassinated by neo-Nazis in 1984
Born January, 1934; Chicago, Illinois
Died June 18, 1984; Denver, Colorado
The murder of Alan Berg alerted Americans to the threat of neo-Nazi terrorism within the country.
Alan Berg was born in Chicago in January, 1934. His father, a dentist, was a descendant of Russian Jews. Berg attended universities in Colorado and Florida before graduating from DePaul University School of Law in 1957. He then became a successful attorney in Chicago; however, after bouts with seizures and alcoholism, he moved to Denver and opened a clothing store. In the fall of 1971, Berg began a career in talk radio when he appeared as a guest on a talk show on KGMC hosted by Lawrence Gross. When Gross moved to San Diego, Berg replaced him as host of the program. In February, 1981, he became the host of another talk show on KOA in Denver. Berg's program could be heard in thirty states throughout the western part of the United States. A social and political liberal, Berg became controversial as he debated his callers.
In 1981, a small group of men led by Robert Jay Mathews broke off from the Aryan Nations and formed a militant group called the Order. The purpose of the Order was to follow the blueprint for a right-wing racial revolution put forward in a book called The Turner Diaries (1978). This novel, written by the neo-Nazi Dr. William Pierce under the pseudonym Andrew Macdonald, tells the story of a group plotting to overthrow the Zionist Occupied Government (ZOG) and to create an Aryan nation. Members of Mathews's group counterfeited money and robbed banks and armored cars to raise funds to finance their implementation of such a revolution. They also compiled a list of individuals to be assassinated. One person on the list was Alan Berg, with whom Order members had talked on KOA.
On June 18, 1984, Berg was shot and killed when he stepped from his car in Denver. Law-enforcement officials soon became aware of the Order and the link between the Berg murder and the other unlawful activities of the group. Mathews was killed in a standoff with police on Whidbey Island, Washington, on December 12, 1984. Twenty-four other members of the Order were ultimately arrested on a variety of charges. Thirteen pleaded guilty, and another ten were convicted of various crimes. While no members of the Order were actually charged with the murder of Alan Berg, three were tried and convicted of the federal crime of violating his civil rights.
Impact
While the Order was ultimately eliminated by law-enforcement officials, Alan Berg's assassination alerted Americans to the threat of domestic neo-Nazi terrorism in the 1980's. Berg's life and death were documented in a book, Talked to Death (1987), by Stephen Singular. The book was the basis for a stage play and Oliver Stone's later movie Talk Radio (1988).
Further Reading
Ridgeway, James. Blood in the Face: The Ku Klux Klan, Aryan Nations, Nazi Skinheads, and the Rise of a New White Culture. 2d ed., rev. and updated. New York: Thunder's Mouth Press, 1995.
Singular, Stephen. Talked to Death: The Life and Murder of Alan Berg. New York: Beech Tree Books, 1987.
William V. Moore
See also Crime; Domestic violence; Jewish Americans; Nation of Yahweh; Skinheads and neo-Nazis; Terrorism.
■ Berlin Wall
Definition Physical barrier between East and West Berlin
Date August 13, 1961, to November 9, 1989
The Berlin Wall, created at the height of Cold War tensions, remained a symbol of those tensions, as well as an actual barrier between the West and the Soviet Bloc, until it was opened on the night of November 9, 1989.
In the wake of Germany's 1945 defeat in World War II, Germany and its capital Berlin were divided into British, French, American, and Soviet occupation zones. Berlin was located deep within the Soviet zone, but it too was divided into four quadrants. Soviet leader Joseph Stalin sought to force the British, French, and Americans to withdraw from Berlin and denied them ground access to the city in the Berlin Blockade of 1948-1949. After Stalin lifted the blockade in 1949, the British, French, and American occupation zones in Germany were merged to create the Federal Republic of Germany (West Germany) on May 23, 1949, and the Soviet zone became the German Democratic Republic (East Germany) on October 7, 1949. Many citizens of East Berlin and East Germany migrated to West Germany in search of greater freedom and economic opportunity.
The loss of hundreds of thousands of skilled workers to the West during the 1950's and early 1960's caused the Soviet Union and East Germany to seal off the border between East and West Berlin on August 12, 1961, and workers began stringing barbed wire along the border on the East German side. As East Berliners realized what was happening, many escaped through or over the wire, including East German border guards. In the following days, concrete blocks and barriers began to replace the barbed wire. When completed, the Berlin Wall ran through streets and along canals and even apartment buildings, comprising sixty-six miles of twelve-foot-high concrete wall and forty-one miles of wire fencing. On June 26, 1963, 1 million West Berliners listened as President John F. Kennedy made his famous "Ich bin ein Berliner" (intended to mean "I am a citizen of Berlin") speech.
Reagan's Visits
During the Berlin Wall's existence from August 13, 1961, to November 9, 1989, it is estimated that about two hundred people were killed attempting to cross over or under the wall, with another two hundred injured, while about five thousand successfully escaped. By the early 1980's, two political conservatives, President Ronald Reagan and Chancellor Helmut Kohl, were in power in the United States and West Germany, respectively. In June, 1982, Reagan visited Germany and received a tour of the Berlin Wall, which he pronounced to be "ugly." In 1985, Mikhail Gorbachev became leader of the Soviet Union. His reform policies of glasnost (openness) and perestroika (restructuring) initiated a series of changes in the Soviet Union and throughout Eastern Europe.
President Ronald Reagan speaks at the Brandenburg Gate in June, 1987, demanding that the Berlin Wall be torn down. (Ronald Reagan Presidential Library)
On June 12, 1987, Reagan, accompanied by his wife Nancy, Chancellor Kohl, and other dignitaries, made a speech at the Brandenburg Gate, part of the Berlin Wall. Angered that his speech would not be heard by East Berliners because East German authorities had moved people out of earshot of loudspeakers set up to broadcast the speech toward East Berlin, Reagan, his voice rising with emotion, demanded: "Mr. Gorbachev, tear down this wall." Several months later, on November 4, 1987, in a televised speech, Reagan speculated how wonderful it would be if he and Gorbachev could take down the "first bricks" of the wall.
The Wall Comes Down
In January, 1989, East German leader Erich Honecker stated that the Berlin Wall might exist for another fifty or one hundred years; in less than a year, his prophecy was proved wrong. As reform movements emerged in Eastern Europe in 1989, Honecker attempted to remain steadfast, but events overtook him.
Massive demonstrations in East German cities in September and October, 1989, swelled beyond the government's ability to squelch them. On October 17, 1989, Honecker was replaced by Egon Krenz, who met with Gorbachev on November 1, 1989, at which time Gorbachev urged Krenz to allow East Germans to travel freely. In the wake of a demonstration of 500,000 East Berliners on November 4, 1989, the East German government decided to end restrictions on travel to the West.
At a press conference in East Berlin on November 9, 1989, that began at 6:00 p.m. Berlin time, East German official Günter Schabowski began reading a lengthy announcement about the end of travel restrictions. About 7:00 p.m., in response to reporters' questions as to when this would take place, Schabowski replied "immediately." Actually, the changes were to take place on the next day, November 10, 1989, but he had not been given this information. Journalists ran to report the news, which quickly spread in both East and West Berlin. East Berliners gathered at the seven checkpoints seeking to enter West Berlin, but the East German border guards had not been informed of the lifting of travel restrictions. Repeated calls placed by the guards did not provide clarification, while more and more East Berliners crowded the checkpoints. West Berliners gathered on the other side, chanting encouragement to the East Berliners. Between 9:30 and 10:00 p.m., East German border guards began to open the gates, allowing the East Berliners to enter West Berlin, where they were greeted by cheering West Berliners. People climbed on portions of the wall to celebrate, while others chipped off pieces to keep as souvenirs or to sell; such actions might have gotten them shot earlier in the day. After 11:00 p.m., the East German government officially ordered the crossings open. Within days, large sections of the Wall were opened to create more crossing points.
Impact
The breach of the Berlin Wall set in motion an unstoppable demand for the reunification of Germany. Egon Krenz resigned in December, 1989, and his government was replaced by a non-Communist government. Gorbachev indicated that the Soviets would not oppose reunification, although other European countries, such as Great Britain and France, were concerned about how a united Germany would affect the balance of power in Europe.
U.S. president George H. W. Bush supported Chancellor Kohl's proposals for reunification. In June, 1990, the East German government began removing the Berlin Wall altogether, and on July 1, 1990, the exchange of East German currency for West German currency heralded an important step toward reunification, which formally occurred on October 3, 1990. The changes in Berlin and Germany were part of widespread change throughout Eastern Europe as communist regimes collapsed in 1989, and component parts of the Soviet Union proclaimed their independence, resulting in the dissolution of the Soviet Union and the end of the Cold War in 1991.
Had the Berlin Wall not been breached so soon after his speech, Reagan's demand for it to be torn down might have been forgotten. As it was, however, the phrase "Mr. Gorbachev, tear down this wall" became one of the most famous to be uttered during the 1980's, and it would later be treated as emblematic of Reagan's presidency.
Further Reading
Cannon, Lou. President Reagan: The Role of a Lifetime. New York: Public Affairs, 2000. This study of Reagan's presidency provides detailed material about Reagan's trips to Berlin.
Hilton, Christopher. The Wall: The People's Story. Stroud, Gloucestershire, England: Sutton, 2001. A journalistic account of the impact of the Wall on the citizens of Berlin.
Wyden, Peter. Wall: The Inside Story of a Divided Berlin. New York: Simon & Schuster, 1989. A comprehensive treatment of the building of the Wall, its impact on Berliners, and its place in the Cold War.
Mark C. Herman
See also Foreign policy of the United States; Reagan, Ronald; Reagan Doctrine; Strategic Defense Initiative (SDI).
■ Big Chill, The
Identification Nostalgic bittersweet comedy film
Director Lawrence Kasdan (b. 1949)
Date Released September, 1983
Lawrence Kasdan’s The Big Chill charted the course of post-World War II baby boomers as they confronted adulthood and their own mortality.
Few film critics or movie pundits could have predicted the impact that Lawrence Kasdan's 1983 film The Big Chill would have on the generation that came of age during the 1960's. Kasdan and Barbara Benedek wrote the movie, which tells the story of a group of thirty-something former University of Michigan college friends who gather to attend the funeral of one of their own, dead from suicide. The opening title sequence features a corpse (famously played by Kevin Costner) being dressed for a funeral while the Rolling Stones' "You Can't Always Get What You Want" plays in the background. Indeed, the words to this song prepare viewers for the rest of the movie: None of the members of this group of friends has found exactly what he or she wants in life; coming together again after twenty years, however, gives many of them what they need. For Karen (JoBeth Williams), this means coming to terms with her life as a homemaker and mother rather than as the writer she thought she would be. Nick (William Hurt) must quiet the ghosts of the Vietnam War and find love with the younger Chloe (Meg Tilly). Meg
(Mary Kay Place), who has spent her life since college pursuing a high-powered career as a lawyer, needs one of her male friends to impregnate her before her "biological clock" stops ticking. In all, the group who gather at the home of Sarah (Glenn Close) and Harold (Kevin Kline) need one another to reaffirm who they were when they were young, so they can at last become fully realized adults.
The large ensemble cast featured actors destined to be among the most important of their generation. Nearly every member of the cast later received Academy Award nominations for other endeavors (including Costner, whose scenes in The Big Chill wound up on the cutting-room floor). In addition, the soundtrack of the movie became an immediate best seller, and it strongly influenced the trend toward creating highly marketable, evocative soundtracks for motion pictures that was to continue throughout the decade and beyond. The Big Chill was nominated for three Academy Awards, as well as for a Golden Globe award and a host of other honors.
Impact
It is difficult to overestimate the importance of this film to an audience composed of those who attended college from roughly 1963 through 1974 and who were in their early thirties at the time of the film's release. Many members of this group struggled with issues of meaning and mortality during the early years of the 1980's, as they attempted to make the transition from their college years to adulthood. Kasdan's film poignantly and incisively targeted many of their greatest fears and desires, including the loss of idealism, the need for love, the fear of growing old and dying, and the desire to leave something lasting for the next generation.
Further Reading
Carey, Melissa, and Michael Hannan. "Case Study 2: The Big Chill." In Popular Music and Film, edited by Ian Inglis. New York: Wallflower, 2003.
McGilligan, Patrick. Backstory 4: Interviews with Screenwriters of the 1970's and 1980's. Berkeley: University of California Press, 2006.
Troy, Gil. Morning in America: How Ronald Reagan Invented the 1980's. Princeton, N.J.: Princeton University Press, 2005.
Diane Andrews Henningfeld
Clockwise from front left: Director Lawrence Kasdan and actors Tom Berenger, Jeff Goldblum, Kevin Kline, and William Hurt on location during the filming of The Big Chill in 1983. (Hulton Archive/Getty Images)
See also
Academy Awards; Beattie, Ann; Close, Glenn; Costner, Kevin; Film in the United States; Hurt, William; thirtysomething; Yuppies.
■ Bioengineering
Definition The modification of organisms by means of biological techniques, especially by the direct manipulation of genetic material
The 1980's saw rapid advances in the basic techniques of genetic manipulation and their application to the creation of new and modified organisms. These advances promised a revolution in various fields of technology, ranging from agriculture to health care.
Two significant events early in the decade transformed the future potential of bioengineering. One was the development of a technique for causing strands of deoxyribonucleic acid (DNA) to multiply rapidly by means of a polymerase chain reaction (PCR); the other was the 1980 decision of the U.S. Supreme Court to allow the patenting of an organism—an "oil-eating" bacterium—produced by genetic modification. DNA multiplication made it possible to develop fundamental techniques that facilitated DNA analysis, which would lead in turn to the use of DNA fingerprinting in forensic science and would enable researchers to sequence the human genome. The application of patent law to genetic engineering, meanwhile, made available lavish financial backing to biotechnology companies and generated a stock-market boom as spectacular as the one from which information technology companies benefited.
Animals and Genetic Engineering
Prior to the 1980's, successful experiments in genetic engineering had been virtually restricted to the transformation of bacteria. In 1981, however, Ohio University conducted experiments on mice, producing the first transgenic animals (that is, animals incorporating genes from another species). In 1984, Harvard University successfully applied for a patent on an "oncomouse" that had been genetically modified to be susceptible to a kind of human cancer. The oncomouse became the first of a rapidly expanding population of "mouse models" afflicted with significant human genetic deficiency diseases. The technique allowed each disease's pathology to be carefully analyzed, tracked, and tested against potential treatments.
The genetic modification of animals was dependent on methods of in vitro fertilization (IVF) that were also applied to the treatment of human infertility. The first test-tube baby had been born in 1978;
tens of thousands more followed in the 1980's, when the technique was applied as an accepted method of assisting couples who had difficulty conceiving. The decade also saw a rapid growth in surrogate motherhood, in which embryos produced by IVF were implanted in the womb of another woman when the biological mother was unable to carry and bear her child. The freezing of embryos for storage became commonplace, resulting in fervent ethical debates regarding the fate of "spare" embryos made redundant by the success of earlier IVF treatments. The multiplication of embryos by splitting—a kind of cloning—also became practicable; it was almost exclusively applied in the 1980's to animal embryos, but ethical discussions leapt ahead in anticipation of human splitting.
Patenting Life Forms
In 1983, the first patents were granted on genetically modified plants, initially for plants equipped with "marker genes," which allowed successfully transformed individuals to be isolated by virtue of their immunity to an antibiotic. By 1985, however, the first U.S. field trials were licensed for crop plants—tomatoes and tobacco—that had been genetically modified to resist insect pests and herbicides. The year 1989 saw the development of the first transgenic plants with genes coding for antibodies against human diseases ("plantigens"), suggesting the possibility of a new kind of pharmaceutical farming ("pharming"). The production of transgenic oilseed rape plants whose storage protein contained an endorphin—a natural painkiller—was also reported in 1989.
The genetic modification of food plants, aggressively promoted by Monsanto—a chemical company that had invested heavily in the genetic modification of such plants as corn, soya, and potatoes—caused considerable alarm within the environmental movement, some of whose key organizations began campaigning vigorously against further field trials. Calgene's Flavr Savr tomato—modified to resist decay—would excite considerable resistance to so-called Frankenstein foods when it eventually reached supermarket shelves in the 1990's. Similar opposition developed to bioengineering experiments in animal husbandry, such as the treatment of dairy cattle with genetically modified growth hormones, which were developed in 1982 and forcefully marketed by Monsanto in the United States and Canada in 1985-1986. Soon, such products were met with a growing resistance to the whole idea of genetically modified organisms (GMOs).
Discussion of the possibility of mounting a project to sequence the human genome began in 1986, and in 1988 James D. Watson was appointed head of the U.S. National Institutes of Health's human genome initiative; the project got under way in the following year. The same technical advances that made possible the Human Genome Project assisted the spread of "genetic screening" of embryos for various hereditary conditions. Although it was not completed until the early twenty-first century, the launch of this project was a key symbol of the expectations attached to bioengineering in the 1980's.
Impact
The rapid development of genetically modified plants had less impact than expected, partly because of environmentalist opposition and partly because progress slowed somewhat after initial successes. The advancement of animal transgenics also slowed dramatically because of unforeseen technical difficulties, so the promise of the 1980's was only partly fulfilled in the next two decades. Still, by decade's end, many genetic techniques had moved out of the realm of science fiction to become probable technologies of the near future.
Further Reading
Boylan, Michael, and Kevin E. Brown. Genetic Engineering: Science and Ethics on the New Frontier. Upper Saddle River, N.J.: Prentice Hall, 2001. A useful summary of the development of genetic engineering and the ethical issues raised by its applications.
Bud, Robert. The Uses of Life: A History of Biotechnology. Cambridge, England: Cambridge University Press, 1993. A succinct history, culminating with the breakthroughs and advances made in the 1980's.
Fowler, Cary. Unnatural Selection: Technology, Politics, and Plant Evolution. Amsterdam, the Netherlands: Gordon and Breach, 1994. An argumentative analysis of the social implications of 1980's advances in the bioengineering of plants.
Kneen, Brewster. Farmageddon: Food and the Culture of Biotechnology. Gabriola Island, B.C.: New Society, 1999. An alarmist account of the development of the environmental movement's opposition to GMOs in the 1980's and 1990's and the associated changes in regulation.
Krimsky, Sheldon. Biotechnics and Society: The Rise of Industrial Genetics. New York: Praeger, 1991. An account of the rapid growth of biotechnology companies in the 1980's and the range of their enterprise.
Silver, Lee M. Remaking Eden: How Genetic Engineering and Cloning Will Transform the American Family. New York: Avon Books, 1998. A painstaking account of the application of biotechnology to problems of human infertility in the 1980's and 1990's.
Brian Stableford
See also Agriculture in the United States; Biopesticides; Cancer research; DNA fingerprinting; Environmental movement; Fetal medicine; Food trends; Genetics research; Health care in Canada; Health care in the United States; Inventions; Medicine; Science and technology.
■ Biological clock
Definition Slang term referring to a purported desire (primarily female) to have children while still of child-bearing age
During the 1980's, popular culture embraced this term to emphasize the pressures felt by professional or single women who wanted to become mothers but believed they were running out of time to do so.
Although often meant to caricature the situation, the term "biological clock"—with its image of a ticking timepiece counting down the period in which reproduction remained a possibility—pointed to a genuine double-bind felt by some women forced to choose between different kinds of personal fulfillment. Beginning in this decade, women's studies texts and the media treated the social expectation for women to have children, as well as the importance of age to that process, in contrasting ways. Media treatments hyped the desirability of marriage and motherhood for women and the unsuitability of careers that prevented them. Women's studies texts treated the topic under the heading of "infertility" and as a matter of reproductive choice, and they were critical of what they saw as scare tactics. While the media told women they could not "have it all," the women's studies books emphasized that reproduction is not the only measure of success for a woman. Both, however, agreed that women's fertility peaks in the twenties, remains strong through thirty-five,
and thereafter declines sharply until forty, after which it becomes problematic, although women's health sources emphasize great variation in decline dates and rates.
Impact
The concept of a biological clock exerting pressure on women as they approached their late thirties was the invention of a society in which some women prioritized their careers or other sources of personal fulfillment above marriage and child rearing. It was therefore emblematic of the ambivalent attitude in the 1980's toward feminism: The term would have made little sense earlier in the century, when the average woman had little choice but to "settle down" and have children, but it would have made just as little sense were there not a residual sense during the decade that women were "meant" to be mothers and that those species of feminism that denied priority to motherhood were somehow unnatural (opposed to biology).
Further Reading
Birrittieri, Cara. What Every Woman Should Know About Fertility and Her Biological Clock. Franklin Lake, N.J.: New Page Books, 2004.
Boston Women's Health Book Collective. Our Bodies, Ourselves. New York: Touchstone/Simon & Schuster, 2005.
Hewlett, Sylvia Ann. Creating a Life: Professional Women and the Quest for Children. New York: Hyperion, 2002.
Payne, Roselyn. AMWA Guide to Fertility and Reproductive Health. New York: Dell/Random House, 1996.
Sandelowski, Margarete. Women, Health, and Choice. Englewood Cliffs, N.J.: Prentice Hall, 1981.
Erika E. Pilver
See also
Age discrimination; Bioengineering; Feminism; Mommy track; thirtysomething; Women in the workforce; Women’s rights.
■ Biopesticides
Definition Pesticides derived either from living, nonchemical matter or from nontoxic, naturally occurring chemicals
The movement to replace synthetic chemicals with organic and natural pesticides grew during the 1980’s. Scientific advances in the use and modification of living organisms
to combat pests proceeded alongside a growing fear of the health consequences of using chemical pesticides, which were in any case proving less effective as pests developed resistances to them. By the end of the decade, biopesticides were preferred over chemical pesticides whenever their use was feasible.
In 1969, the Federal Pesticide Commission urged restricting pesticide use. Government agencies had found traces of pesticides in the fatty tissues of nearly every American tested, and it was believed that even these trace amounts could have deleterious effects. In 1970, President Richard M. Nixon created the Environmental Protection Agency (EPA). Congress, which had authorized the EPA's establishment, followed suit with a flurry of environmental legislation: Dichloro-diphenyl-trichloroethane (DDT), a pesticide that had been found in the fat and liver of even Arctic polar bears, was banned in 1972. In 1976, the Toxic Substances Control Act mandated governmental analysis of all chemical risks to health. In 1978, the Love Canal neighborhood of Niagara Falls, New York, was found to be built on a toxic-waste dump; 1,004 families were evacuated. A national superfund to clean up twelve hundred such sites was created. Polychlorinated biphenyls (PCBs), previously used as coolants and lubricants, were banned in 1979, but they stayed resident in soil and animals for years.
By the 1980's, public concern about chemicals in food and water was quite strong. Chemical pesticides and herbicides were blamed for pushing California sand crabs, Florida snooks, black-footed ferrets, brown pelicans, bald eagles, peregrine falcons, and others toward extinction. The EPA found pesticides and heavy metal residues in much of the nation's groundwater.
Chemical-Resistant Pests
As chemical-based pesticides were used more broadly, they became less effective, requiring farmers to use more of a given substance to achieve the same result. The use of farm chemicals doubled between 1960 and 1985 as a direct result of this problem. Department of Agriculture figures noted that crop losses from weeds and insects, which were 32 percent in 1945, were 37 percent by 1984. The term "superbug" was coined, denoting organisms immune to chemicals and antibiotics. In 1983, the National Academy of Sciences estimated that 447 insects had developed resistances to chemical insecticides, half of these crop pests, with the Colorado potato beetle resistant to all major
categories of insecticide. In 1983, the Food and Drug Administration (FDA) found DDT in 334 out of 386 domestic fish tested, even though DDT had been banned since 1972. In 1984, the cancer-causing fungicide ethylene dibromide (EDB) was banned, after it was found in prepared cake mixes and on citrus fruit.
Birth of Biopest Control in the 1980's
Biological methods of pest control had been used in the United States since the nineteenth century, when California imported Australian ladybugs to eat pests killing eucalyptus trees. From 1959 to 1980, stingless wasps from Europe and Asia were used very successfully against alfalfa weevils in eleven states. By the first half of the 1980's, biopesticides had become a viable alternative to chemicals, spurring a national trend. Start-up companies produced successful products in California and on the East Coast. The public, frightened by media warnings and distrustful of government safety standards, was eager to find such alternatives to potentially toxic chemicals.
Biopesticides use four approaches: The most common is flooding infestations with laboratory-grown natural predators that specialize in eating target insects, mites, or weeds. A second is spraying harmless substances that are nevertheless deadly to pests—insecticidal soap or Bacillus thuringiensis (BT), for example. A third approach is using passive traps that pests cannot escape from, usually incorporating pheromones to coax them inside. The most complex approach is genetically engineering plants to resist or kill pests, such as implanting a gene in corn that causes it to kill corn borers.
Purists advocated using nothing but natural controls. Organic farms appeared, their operators weeding by hand, using bio-controls to combat pests, and working mounds of compost into their soil instead of chemical fertilizers. These farms stayed small because of the labor required to maintain them. "Organic" fruit and vegetables appeared in supermarkets, at prices much higher than were charged for "conventional" produce. Major food marketers, eager to capitalize on the trend toward spending more for more "natural" foods, began boxing "natural" food products, particularly breakfast cereals. Eventually, the government would regulate the use of labels such as "natural" and "organic."
Impact
By the early 1980's, biopesticides were the control method of choice countrywide.
BT, a naturally occurring bacterium lethal to caterpillars, was used to control gypsy moths and spruce budworms. The U.S. Forest Service switched to BT under public pressure after one of its trucks accidentally dumped insecticide into the Columbia River, killing seventy thousand trout. A variant, BTI, was used to kill mosquitoes and blackfly larvae. Predatory nematodes (microscopic soil worms) were used to control Colorado potato beetles. Ladybugs were sold by the millions to eat aphids. Predatory wasps were imported to kill mole crickets and mealy bugs. Florida imported weevils to eat water hyacinths. In 1987, the government spent more than $800 million on biological pest-control research projects. Use of farm chemicals was gradually reduced, and large companies entered the biotech field. Biopesticides became widely available not only for farmers but also for home gardeners, who could purchase them at nurseries and in popular gardening and lawn-care catalogs.
Large-scale farming cannot profitably sustain complete biological controls. However, the rising cost of chemicals and the laws limiting their use drove farmers in the 1980's to adopt integrated pest management (IPM) strategies. IPM requires crop rotation, surface tillage that does not disturb deep-lying weed seeds, cleaning up debris that houses overwintering pests, exploiting natural predators and nontoxic substances to kill weeds and insects, and planting crop varieties resistant or repellent to pests. Chemicals are used only if infestations get out of hand. The natural predator portion of IPM is initially more expensive than chemicals, but it proves less expensive over time, as introduced predators stay in farmers' fields and need not be reintroduced every year.
Further Reading
Altieri, Miguel Angel, and Clara Ines Nicholls. Biodiversity and Pest Management in Agroecosystems. 2d ed. New York: Food Products Press, 2004. A handbook about large-scale farming using biopesticidal techniques.
Brown, Michael H. A Toxic Cloud. New York: HarperCollins, 1987. Journalistic account of cases of poisoning by dioxin and other chemicals.
Carson, Rachel. Silent Spring. 40th anniversary ed. Boston: Houghton Mifflin, 2002. The 1962 book that ignited the antipesticide movement, written by a woman who has become the icon of the antichemical movement.
Ellis, Barbara W., and Fern M. Bradley, eds. The Organic Gardener's Handbook of Natural Insect and Disease Control. Emmaus, Pa.: Rodale Press, 1997. The bible of nonchemical pest control; perennially in print.
James Pauff
See also
Bioengineering; Farm crisis; Genetics research; Malathion spraying; Superfund program; Water pollution.
■ Bird, Larry
Identification Hall of Fame professional basketball player
Born December 7, 1956; West Baden, Indiana

As the leader of the Boston Celtics basketball dynasty, Bird became a living legend and is recognized as one of the greatest players ever.

After leading Indiana State University to the National Collegiate Athletic Association (NCAA) basketball national championship game in 1979, Larry Bird became a member of the Boston Celtics. During the 1979-1980 season, Bird led the Celtics to a 61-21 record, the best in the National Basketball Association (NBA). In his first NBA season, Bird was voted a member of the Eastern Conference All-Star team and was named NBA Rookie of the Year, edging out his collegiate rival Earvin "Magic" Johnson, who played for the Los Angeles Lakers.

In the 1980-1981 season, Bird was joined by Robert Parish and Kevin McHale to form one of the best front courts in the NBA. The Celtics won the NBA championship in six games, defeating the Houston Rockets. In the 1983-1984 season, Bird won the NBA's Most Valuable Player (MVP) award and led the Celtics to another NBA championship, defeating the Los Angeles Lakers in a seven-game series. Bird was named the MVP of the NBA finals. He captured the NBA MVP award again in the 1984-1985 season. During the 1985-1986 season, Bird once again led the Celtics to the NBA championship with a six-game victory over the Houston Rockets. He was named the NBA MVP for the third consecutive year and the MVP in the NBA finals for a second time.

Larry Bird passes the ball after being trapped by two Philadelphia 76ers during a Boston Celtics home game in May, 1982. (AP/Wide World Photos)

During his illustrious NBA career from 1979 to 1992, Bird averaged 24.3 points per game, recorded a .496 field-goal percentage and a .886 free-throw percentage, amassed 5,695 assists, and shot .376 from beyond the three-point line. He was named an NBA All-Star twelve times, a member of the All-NBA team nine times, and voted onto the NBA All-Defensive second team three years in a row from 1982 to 1984.

Impact Along with Magic Johnson, Larry Bird helped rejuvenate the NBA by bringing fan interest back to a high level. The Bird-Johnson rivalry, which carried over from the 1979 national collegiate basketball championship game, fueled new excitement in the NBA and helped build a franchise rivalry between the Celtics and the Lakers. Bird was recognized for his innate ability to anticipate and react to the moves of his opponents. He was a fierce competitor whose leadership and team play brought out the best in his teammates. Known as one of the most
complete basketball players ever to play in the NBA, Bird recorded an amazing sixty-nine “triple doubles” (games in which three major statistics, such as points, rebounds, and assists, reach double digits) in his career—fifty-nine in the regular season and ten in postseason play. Subsequent Events
In 1992, Bird won a gold medal in basketball as a member of the U.S. Olympic “Dream Team.” He was elected to the Naismith Memorial Basketball Hall of Fame in 1998.
Further Reading
Bird, Larry, with Jackie MacMullan. Bird Watching: On Playing and Coaching the Game I Love. New York: Warner Books, 1999.
Kramer, Sydelle A. Basketball's Greatest Players. New York: Random House, 1997.
Shaw, Mark. Larry Legend. Lincolnwood, Ill.: Masters Press, 1998.
Alvin K. Benson
See also
Basketball; Johnson, Magic; Sports.
■ Black Monday stock market crash
The Event Sudden decline in the value of most major publicly traded stocks
Date October 19, 1987

Hundreds of traders on Wall Street responded to fears about inflation and rising interest rates by using newly installed computerized trading programs to sell stocks, thereby causing the Dow Jones Industrial Average to suffer the largest one-day point loss and the second largest one-day percentage loss in its history to that date.

On October 19, 1987, the New York Stock Exchange (NYSE) experienced a dramatic sell-off, in which most of the stocks listed on the exchange lost a great deal of their value. The Dow Jones Industrial Average, which tracked the value of thirty blue-chip stocks listed on the NYSE, plunged 508 points to close at 1,738.74. The drop equaled 22.6 percent of the average, which had stood at 2,246.74 at the beginning of the day's trading. The overall net loss in market capitalization of all stocks affected by the crash has been estimated at roughly half a trillion dollars. That is, in one day, around $500 billion in stock value simply ceased to exist.
Immediate Effects of the Crash
Black Monday had global repercussions, as stock prices around the world reeled. News of the stock market crash dominated television, as the three major networks preempted their regular programming to provide special coverage of the crash. Cable news networks, particularly CNN, offered continuous coverage. David Ruder, head of the Securities and Exchange Commission (SEC), threatened to close the markets in order to stop the slide. President Ronald Reagan announced that he was puzzled by the financial events, as nothing was wrong with the economy. He urged Americans not to panic. Since Black Monday conjured images of the start of the Great Depression, many Americans found it difficult to remain calm. As experts described it, the calamity on Wall Street would set in motion an inexorable chain reaction. Fearful consumers, their net worth crippled by the deflation of stock prices, would put off purchases, forcing industry to slow production and lay off workers. The ripples of the economic slowdown would reach every corner of the nation—families, schools, retailers, pension funds, and charities—as the boom that had driven the 1980’s came to a crashing end. In the first days following the crash, no one knew whether Black Monday was simply a stock market correction or the harbinger of something far more serious.
Causes of the Crash
Two varieties of program trading, known as portfolio insurance and index arbitrage, were viewed as the main culprits in the 1987 crash, as well as in a subsequent October, 1989, minicrash. Program trading involves bundles of trades comprising fifteen or more securities and worth more than $1 million. Pension funds, mutual funds, and hedge funds all rely on computers to buy and sell such large collections of investments. Program trading reduces costs and allows the savings to be passed on to small investors. It also permits traders to match their holdings to a particular stock index. Index arbitrage occurs when an investor buys a bundle of stocks and simultaneously sells futures contracts (that is, the contracts obliging the buyer to purchase a given stock on a particular date in the future at a predetermined price) for the index that those stocks represent. Meanwhile, large investors, particularly investment banks and brokerage houses trading on their own accounts, relied on portfolio
insurance. Such insurance was supposed to use futures, as well as options (similar to futures, but granting the purchaser a right to make a future purchase rather than creating an obligation to do so), to protect, or hedge, against steep declines. The rapid fall of stock prices and the market indexes triggered automatic sell orders in many computer programs that worsened the drop. Thus, hedging techniques exaggerated the crash, rather than protecting investors from market volatility. Complicating the situation was the fact that financial market analysts could not agree on the underlying causes of Black Monday. While program trading triggered the drop, it was not apparent whether other factors were also involved. Democrats blamed Reagan for causing the disaster by allowing budget and trade deficits to balloon. Treasury Secretary James Baker blamed Democrats for raising taxes. Alan Greenspan, head of the Federal Reserve, had no comment. Other market observers were frightened by the tendency of program traders to buy and sell stocks without much regard for the quality or achievement of individual companies. Instead, they relied upon elaborate computerized procedures known as algorithms to compare the prices of various investments and then buy or sell particular stocks or sectors if they appeared under- or overvalued compared with historical norms. As a result, the programs tended to be most active in markets that were already moving. They could therefore accelerate or even exaggerate steep advances or declines. Impact The recession or depression that many observers feared would occur in the wake of Black Monday did not materialize. No great number of businesses failed, and unemployment rates did not jump, although Wall Street financial firms did lay off about fifteen thousand workers. While stock market officials expected that individual investors would avoid the market for years, such fears ultimately proved unfounded. Some investors, particularly inexperienced ones, did avoid the market, but only for a few years. Only about 20 percent of household financial assets in 1987 were tied up in stock, and most of that was indirectly owned through pension plans or mutual funds. The stock market did not prove to be a leading economic indicator. However, Black Monday did dramatically reduce both the number of companies planning to go public and the amount of cash available for other firms
to raise in the equity market. On Black Monday, 229 businesses had filed papers with the SEC declaring their intention to issue public stock for the first time; about 45 percent abandoned those plans within nine months after the crash. This number of canceled initial public offerings (IPOs) was unprecedented. Stock market analysts estimated that each company that did not cancel its planned IPO raised an average of $7.2 million less than it would have done if the IPO had occurred before Black Monday.

The SEC had no clear sense of how to respond to Black Monday. The NYSE swiftly adopted new rules to control program trading. Capital requirements for specialists, who make a market in a given stock on the exchange floor, were increased, while both over-the-counter and standard trade processing were improved through computer programs. Many subsequent computer programs had built-in stopping points or "circuit breakers," designed to limit huge losses. While Congress considered banning program trading, no such legislation was passed.

Newspaper headlines across the United States announced the Black Monday stock market crash in October, 1987. (AP/Wide World Photos)

Further Reading
Kamphuis, Robert W., et al. Black Monday and the Future of Financial Markets. Homewood, Ill.: Dow Jones-Irwin, 1989. A late-1980's attempt to assess the events of October, 1987, and evaluate their long-term repercussions for investors.
Mahar, Maggie. Bull! A History of the Boom and Bust, 1982-2004. New York: Harper Business, 2004. Overview of twenty-two years of market history, detailing the events leading up to and following the 1987 market crash.
Metz, Tim. Black Monday: The Catastrophe of October 19, 1987, and Beyond. New York: William Morrow, 1988. Tightly focused account of the 1987 crash, concentrating on its near-term and likely long-term effects.
Caryn E. Neumann
See also Business and the economy in Canada; Business and the economy in the United States; Reagan, Ronald; Reaganomics; Television.
■ Blade Runner
Identification Science-fiction film
Director Ridley Scott (b. 1937)
Date Released June 25, 1982

Blade Runner's groundbreaking design blended film noir and punk sensibilities, striving to portray a realist vision of the architecture, fashion, and technology of the future. Although its initial theatrical release was unsuccessful, the film garnered growing popular approval and critical reappraisal through videotape rentals. Through its eventual cult popularity and original design, it came to influence the look of countless science-fiction films that followed.

Fans of Harrison Ford were expecting him to act in Blade Runner like the wise-cracking action hero of Star Wars (1977), Han Solo; they were surprised and disappointed to see him play a downbeat, film-noir-inspired character. Rick Deckard is a former police detective living in a bleak, rain-soaked, shadow-filled, overcrowded, postmodern Los Angeles. His
job was to hunt down and kill renegade replicants (biological androids) who had illegally come back to Earth in 2019.

The production had gone over schedule and over budget, reaching approximately $28 million, and director Ridley Scott had been forced to borrow footage from Stanley Kubrick's The Shining (1980) to complete the original theatrical ending. As a result, he had lost control of the film to Warner Bros. studios, which decided to add an expository voice-over and other explanatory elements to the dense film, as well as tacking on a romantic happy ending.

Despite the film's slow, standard plot and plodding pace, its strengths lie in its visual design, including its cinematography, art direction, production design, and special effects. Art director David Snyder was assisted by a talented crew that included visual futurist Syd Mead, who also worked on Tron (1982), 2010 (1984), and Aliens (1986), among other films. The goal of the design was to create a coherent, dense environment characterized by dystopian bleakness, alienation, and "terrible wonder" or "strange sublimeness." Scott said he liked to give the eye so much to see in a film that it was like a "seventy-layer cake." Blade Runner's design changed the look of science-fiction cinema as drastically as had Kubrick's 2001: A Space Odyssey (1968) and George Lucas's Star Wars before it.

The film also featured a thematically complex plot. It blurred the boundaries between hero and villain (Deckard sees himself as little better than a murderer, and the replicants are by turns inhuman and sympathetic), as well as between hunter and hunted and between artificial and human life. It also incorporated allegories of class and slavery and envisioned a bleak future meant to explore the excesses of 1980's international conglomerates and the globalization of capitalism, while soberly pondering what it means to be human in the context of mechanized commodity culture. Like Star Wars before it, the film portrayed a future in which technology could be shabby rather than shiny, but it put a decidedly cyberpunk spin on this portrayal, influencing much of the near-future fiction and film that followed.

Impact In retrospect, Blade Runner can be seen as a distinctively postmodern film, in that it incorporates a pastiche of many different elements to assemble a vision of the future. As much film noir as science fiction, the film surmounted its component subgenres
to achieve something new that would influence many other filmmakers once public opinion caught up with it. It was also the first major adaptation of a story by Philip K. Dick, who died just before the film was released. Dick’s stories, once “discovered” by Hollywood, would become the basis for many television and film adaptations, including Total Recall (1990), Screamers (1995), Impostor (2002), Minority Report (2002), Paycheck (2003), and A Scanner Darkly (2006). Subsequent Events
By 1992, Scott had enough clout in Hollywood to revisit Blade Runner, eliding the film’s voice-over narration and restoring some deleted footage to bring the film closer to his original vision. This “director’s cut” of the film was released in theaters and later on VHS videotape and digital video disc (DVD), and it represented one of the first “director’s cuts” of any major studio film.
Further Reading
Brooker, Will, ed. The "Blade Runner" Experience: The Legacy of a Science Fiction Classic. New York: Wallflower, 2005.
Bukatman, Scott. "Blade Runner." London: British Film Institute, 1997.
Kerman, Judith, ed. Retrofitting "Blade Runner." Bowling Green, Ohio: Bowling Green State University Popular Press, 1991.
Sammon, Paul. Future Noir: The Making of "Blade Runner." New York: HarperCollins, 1996.
Joseph Francavilla
See also Cyberpunk literature; Film in the United States; Ford, Harrison; Hannah, Daryl; Home video rentals; Science-fiction films; Special effects; Vangelis.
■ Blondie
Identification American New Wave musical group
Date Initially active 1975-1982; reunited 1997

Blondie's groundbreaking eclectic style increased the group's worldwide popularity. Several hit singles achieved legendary status, and singer Debbie Harry's glamorous image influenced the musical scene during the 1980's and beyond.

Blondie lead singer Debbie Harry in 1980. (Hulton Archive/Getty Images)

Blondie released two original albums during the 1980's, Autoamerican (1980), the group's third platinum hit, and The Hunter (1982). The New Wave
group also released two compilations, The Best of Blondie (1981) and Once More into the Bleach (1988). Originally formed in 1975, Blondie was famous for its successful mix of glam rock, power pop, punk, and disco. Blondie’s lineup in the new decade included vocalist Debbie Harry, guitarists Chris Stein and Frank Infante, keyboardist Jimmy Destri, drummer Clem Burke, and bassist Nigel Harrison. The music of the group’s last 1970’s album, Eat to the Beat (1979), had seemed uneven to many listeners, although the album was still certified platinum, selling more than one million copies by 1980. It included the remarkable single “Dreaming,” as well as Blondie’s third number-one hit in the United Kingdom, “Atomic.” In 1980, the group also released its greatest hit and the number-one single of the year, “Call Me,” originally written as the theme for the film American Gigolo (1980). A collaboration pairing Harry’s lyrics with music by Italian songwriter and producer Giorgio Moroder, “Call Me” was an early
example of Europop. The single remained at the top of the Billboard Hot 100 chart for six weeks. The diversity of Autoamerican added to Blondie’s reputation as a trendsetter, and two of the album’s songs proved to be chart toppers in the United States. The reggae strains of “The Tide Is High” and the rap coda to “Rapture” brought Blondie credit for introducing new sounds to mainstream audiences. Although Autoamerican sold well, internal disagreements and individual interests in outside projects damaged the group’s cohesiveness. The release of Debbie Harry’s solo album Koo Koo (1981) may have compounded the problem. An earlier advertising campaign claiming “Blondie is a group” had failed to persuade the public that the band was more than a backup for the blond vocalist, and the confusion worsened as the vocalist’s reputation grew. Further complications emerged, as Clem Burke worked as a producer for another group and Jimmy Destri prepared to record his own solo album. With some reluctance on the part of Harry, the group recorded The Hunter in 1982, a musical failure that ended the 1980’s career of Blondie as a group. The sextet disbanded when Chris Stein became ill with a genetic skin disease, and Harry, who had long been romantically involved with Stein, suspended her career to nurse him back to health. Impact Although Blondie’s importance in the music world faded as the decade progressed, several of the group’s hit singles became classics. A music video of “Rapture” appeared on MTV soon after the network began broadcasting in 1981, and Debbie Harry’s edgy, platinum-blond sexuality influenced many other female lead singers of the times. Further Reading
Harry, Debbie, Chris Stein, and Victor Bockris. "Koo Koo." In Making Tracks: The Rise of Blondie. New York: Da Capo Press, 1998.
Rock, Mick. Picture This. London: Sanctuary, 2004.
Margaret A. Koger
See also Hip-hop and rap; Madonna; MTV; Music; Music videos; New Wave music; Pop music; Synthesizers; Women in rock music.
■ Bloom County
Identification Daily newspaper comic strip
Writer Berkeley Breathed (b. 1957)
Date Initially published between 1980 and 1989

Bloom County satirized all aspects of American society of the 1980's, particularly politics and popular culture.

Running from December of 1980 to August of 1989, the daily Bloom County comic strip, written and drawn by Berkeley Breathed, satirized American society of the 1980's. Many likened the comic to Walt Kelly's Pogo and Garry Trudeau's Doonesbury. Breathed's political slant was decidedly liberal, but his humor was also directed toward everyday things in society, such as advertising, dating, and gender stereotyping. The strip's cast featured a mixture of adult humans, precocious children, and talking animals. The mythical Bloom County was somewhere in the American Midwest.

The strip and its cast evolved and changed over the decade. Originally, young radical Milo Bloom lived with his more reactionary grandfather, the Major. Eventually, the grandfather vanished from the strip, and other characters such as Opus the Penguin rose to prominence. Originally the pet of another lead character, Michael Binkley, Opus became the most famous of the strip's characters. Thanks to merchandising, stuffed Opus dolls were easily found in gift shops throughout the 1980's. Bill the Cat, a deliberate parody of the comic-strip cat Garfield, was also a featured character in several major story lines. He dated U.S. ambassador to the United Nations Jeane Kirkpatrick, was lead singer for a heavy metal band, joined a cult, and had his brain replaced with Donald Trump's. Bill's experiences parodied many high-profile news events and personalities of the decade.

In 1984 and 1988, the strip convened the Meadow Party, which nominated Bill the Cat for president and Opus for vice president. Breathed satirized both the political posturing of elections and the nominating process of political conventions. Besides political figures, Bloom County ran story lines on several controversial issues of the decade. In one story line, Opus learned his mother was being held in a cosmetics lab. Opus encountered several animals used for testing, and the cartoons were fairly graphic in detail. However, Opus found himself stuck between the radical animal-rights rescuers and the Mary Kay Commandos, who wielded pink Uzis.
Breathed also took many pop culture figures and events to task. Prince Charles and Princess Diana, Madonna and Sean Penn, Michael Jackson, and a host of other popular celebrities were lampooned in the strip. In 1987, Breathed received a Pulitzer Prize for editorial cartooning for the strip. In 1989, Breathed decided to end the strip. He stopped producing the daily Bloom County and began a Sunday-only strip titled Outland. Impact
Bloom County provided a mixture of political satire, cultural commentary, slice-of-life humor, and surrealism at a time when those four elements were rarely combined in a single mainstream comic strip. In addition to entertaining and edifying his readers, then, Breathed was one of a few cartoonists to demonstrate that a strip could eschew predictability and adherence to a few set themes and still be successful. He thereby helped expand the possibilities of the syndicated daily comic.
Further Reading
Breathed, Berke. Classics of Western Literature: Bloom County 1986-1989. Boston: Little, Brown, 1990.
_______. One Last Little Peek, 1980-1995: The Final Strips, the Special Hits, the Inside Tips. Boston: Little, Brown, 1995.
Jarnow, Jesse. "The Penguin Is Mightier than the Sword." Salon.com. Nov. 20, 2003. http://dir.salon.com/story/ent/feature/2003/11/20/breathed/index.html.
P. Andrew Miller
See also
Comic strips; Kirkpatrick, Jeane.
■ Blue Velvet
Identification American art-house crime film
Director David Lynch (b. 1946)
Date Released September 19, 1986

Blue Velvet shocked audiences with its violence and sexuality, establishing David Lynch as one of the most controversial directors of his generation.

Lynch was known to several different audiences for his early works, including the unsettling experimental film Eraserhead (1976). However, few filmgoers were prepared for Blue Velvet, which simultaneously employs and subverts a host of familiar settings,
images, and characters. The film is set in an imaginary American lumber town, and while it apparently takes place in the present, its opening scenes create a bucolic atmosphere evocative of the faraway 1950's. Lynch's main character is the seemingly innocent Jeffrey Beaumont (played by Kyle MacLachlan), who finds himself drawn all too willingly into a frightening situation that might have been lifted from one of the film noir mysteries of the same period. The film's audience learns by degrees that the husband and son of sultry nightclub singer Dorothy Vallens (Isabella Rossellini) are being held captive by drug-crazed thug Frank Booth (played with manic intensity by Dennis Hopper). Dorothy submits sexually to Frank but also lures Jeffrey, who has discovered the situation, into a sadomasochistic affair. At the same time, Jeffrey is falling in love with girl-next-door Sandy Williams (Laura Dern), who is the daughter of a police detective and is surreptitiously helping the young man investigate the mystery.

The film includes a number of disquieting and incongruous episodes. In an early scene that sets the tone for the film, Jeffrey discovers a severed human ear lying in a field and crawling with ants. Sometime later, Frank, who has been presented as a foulmouthed monster, sits enraptured as Dorothy sings the 1963 Bobby Vinton hit "Blue Velvet" (written by Bernie Wayne and Lee Morris) at her club. Later still, the kidnapped Jeffrey watches brothel owner Ben (former child actor Dean Stockwell) lip-synch another famous 1963 song, Roy Orbison's "In Dreams," into a trouble light that grotesquely distorts his features. The inclusion of Orbison on the film's soundtrack proved significant, because it helped the singer regain a popularity he had lost in the 1970's. Before he died in 1988, Orbison enjoyed a few years of regained success.

Whether they were alive in the 1950's or not, Americans of the 1980's tended to think nostalgically of the earlier decade as a period of innocence and tranquillity. Lynch contrasts this perception with a dark and menacing vision of human relationships, but nothing in the film suggests that either version is more "correct" than the other. Thus, the film's nominally happy ending merely reinforces the mood of unease that has prevailed since its opening scenes.

Impact The violence, frank sexuality, and dark vision of Blue Velvet drove many filmgoers from the theater during early showings, but critics recognized
it as one of the most innovative motion pictures of the decade. Lynch was nominated for an Academy Award for Best Director, and he, along with Hopper, Rossellini, cinematographer Frederick Elmes, and the film itself, won several other important awards. Beyond critical praise, the film's central motif—the use of surreal images and experimental formal elements to suggest a dark underbelly to idealized middle America—became extremely influential. Lynch himself employed it again in his television series Twin Peaks (premiered 1990), and it became a staple of independent film and occult television in the 1990's and the early twenty-first century.

Further Reading
Atkinson, Michael. Blue Velvet. London: British Film Institute, 1997.
Chion, Michel. David Lynch. London: British Film Institute, 1995.
Sheen, Erica, and Annette Davison, eds. The Cinema of David Lynch: American Dreams, Nightmare Visions. New York: Wallflower, 2004.
Woods, Paul A. Weirdsville USA: The Obsessive Universe of David Lynch. London: Plexus, 1997.
Grove Koger
See also Academy Awards; Film in the United States; Music.
■ Boat people
Definition Large number of asylum-seeking Indo-Chinese, Cuban, and Haitian refugees who fled their homelands, often in rickety boats
In both the South China Sea and the Caribbean Sea, a flood of desperate refugees sought illegally to enter the United States, producing a needed redefinition of the nation’s immigration policy. The term “boat people” was first used to describe the massive number of Vietnamese and Cambodian refugees who fled in small boats in the aftermath of the Vietnam War in 1975. From the start of the Chinese invasion of Vietnam in 1979 until the mid-1980’s, nearly 2 million Vietnamese fled to neighboring countries in Southeast Asia, where they were placed in overcrowded refugee camps. There, they were joined by nearly a million Cambodians fleeing the murderous regime of Pol Pot and by Laotian Hill
People (Hmong), who had worked closely with U.S. forces before Laos fell to the communist Pathet Lao. From these countries of first asylum, most refugees hoped to resettle permanently in the United States.

The U.S. government first reacted to this refugee crisis by ordering the Seventh Fleet to aid overcrowded and dilapidated refugee-laden boats in distress. Nevertheless, thousands of refugees are thought to have perished in storms and pirate attacks. In response to this human calamity, the United States established, through the United Nations, the Orderly Departure Program, in which the United States, Canada, France, the United Kingdom, and Australia agreed to be nations of resettlement. In return, Vietnam agreed to stop illegal departures and permit orderly emigration of people accepted by the resettlement nations. The program drew a distinction, however, between political refugees and economic refugees. To qualify for resettlement, refugees were required to undergo a lengthy screening process to determine their motives for resettlement. Only those fleeing political persecution, rather than economic hardship, would be accepted.

From 1981 to 1990, nearly 281,000 Vietnamese refugees were resettled in the United States. Indo-Chinese boat people amounted to the largest single group of refugees ever accepted by Canada: Between 1975 and 1985, about 111,000 of them came to Canada. The peak year was 1980, when 35,000 were settled in Canada.

Cuban and Haitian Boat People
The term “boat people” was also applied in the early 1980’s to Cuban and Haitian asylum seekers, who tried to escape by sea from oppression and poverty at home. The Mariel boatlift began on April 15, 1980, and ended on October 31, 1980. During this period, in which Fidel Castro permitted the exodus of any Cuban wishing to migrate, over 125,000 Cubans arrived in southern Florida from Port Mariel, Cuba. Often, boats were filled far beyond capacity, and there were many instances of distress. As a result of active monitoring of the exodus by the U.S. Coast Guard, however, there were only twenty-seven recorded instances of drowning. Of the Cuban expatriates arriving in Florida, nearly three thousand were criminals sent from Cuban prisons. Hundreds more were mentally ill patients released from Cuban institutions. Upon reaching the United States, the Cuban boat people were detained in processing centers in south
Florida; however, the flood of refugees proved so great that centers were created in Pennsylvania, Wisconsin, and Arkansas to handle the overflow. Frustration with conditions in these centers and the slow rate of processing resulted in occasional riots. Following Castro's ending of Cuba's open immigration policy in November, 1980, the flood was reduced to a trickle.

Simultaneous with the Cuban exodus, thousands of Haitians boarded aged, rickety boats to escape abuses they were suffering under the regime of Jean-Claude Duvalier. In 1981, about twelve thousand Haitian boat people made it to the Bahamas on the first part of their journey to south Florida. However, on the second phase of the journey, most Haitian vessels were intercepted at sea. Passengers were placed in detention centers and then sent back to Haiti, since most Haitians were viewed as economic refugees. An unknown number of Haitian boat people died at sea. In the worst known incident, the bodies of thirty Haitians washed up on the shore of Hillsborough Beach, Florida.

Within the United States, the differing treatment between Cuban and Haitian migrants produced charges of racism and hypocrisy. To reduce bad publicity and the human drama playing in the news, President Ronald Reagan issued Executive Order 12324 on September 29, 1981, empowering the Coast
Guard to intercept vessels outside U.S. territorial waters that were suspected of carrying undocumented immigrants and to escort those ships back to their countries of origin. Charges of racism continued throughout the 1980's, however. The election of Haitian populist Jean-Bertrand Aristide in 1990 reduced for a time Haitians' desire to leave their country. The flood of refugees would resume in late 1991, however, after Aristide was deposed in a military coup. Of the thirty-six thousand Haitians stopped at sea, only nine thousand were granted the right to seek asylum. Tens of thousands of others were able to land on U.S. shores undetected, immigrating illegally to the United States.
Impact For the United States, the 1980's witnessed a larger influx of asylum seekers than did any previous decade. Granting immigrant status to the large number of Indo-Chinese refugees helped relieve Americans' sense of guilt over their rapid departure from that region and showed that loyalty would be rewarded. Granting immigrant status to most of the Cuban boat people served to embarrass the Castro regime; however, denial of equal status to Haitians raised serious issues of discrimination. It also set the stage for policies later in the decade, when asylum seekers who had been tortured by Central American right-wing regimes supported by the United States were classified as economic refugees, while those from left-wing nations economically devastated by U.S. sanctions were classified as political refugees.

A group of Vietnamese children, including one with an American father (center), poses in a refugee camp in southern Thailand in 1980. (AP/Wide World Photos)

Unlike the first wave of immigrants from Cuba and Indo-China, which had been composed largely of middle- and upper-class individuals, the influx of boat people during the 1980's sprang largely from their nations' lower classes. In this, the Laotian Hmong represent an extreme example of jungle mountain dwellers descending from their thatched huts into modern U.S. apartments in places such as Central California. For them, assimilation into American society would entail the most difficulties. It would be less difficult for the Cambodians who formed a large ethnic enclave in Long Beach, California. The Vietnamese, who established large communities in California and the Texas Gulf Coast, founded lucrative businesses in auto repair, nail care, commercial fishing, and food services. Cuban boat people who were granted immigrant status were rapidly absorbed into the already large south Florida Cuban community and became a revitalizing force for cities such as Miami.

Further Reading
García, María Cristina. Havana, USA: Cuban Exiles and Cuban Americans in South Florida, 1959-1994. Berkeley: University of California Press, 1996. Analysis of Cuban immigration. Chapter 2 is devoted to the Mariel boatlift of 1980. Endnotes, index, and select bibliography.
Reimers, David M. Still the Golden Door: The Third World Comes to America. New York: Columbia University Press, 1992. A standard text for the study of contemporary U.S. immigration, both legal and illegal. Footnotes, index, and bibliography.
Vo, Nghia M. Vietnamese Boat People, 1954 and 1975-1992. Jefferson, N.C.: McFarland, 2005. Analysis of the flight from Vietnam and subsequent resettlement; filled with many human-interest accounts. Footnotes, index, and bibliography.
Irwin Halfond
See also Asian Americans; Immigration to Canada; Immigration to the United States; Latin America.
■ Boitano, Brian
Identification American figure skater
Born October 22, 1963; Mountain View, California

Boitano was a world champion in 1986 and 1988 and an Olympic gold medalist in 1988. His resulting influence on the field of professional figure skating led him to become a voice for the sport and for professional skaters.

Brian Boitano earned his initial reputation in the skating world with his unique presence on the ice. Speed and power defined his style, an incomparable ability to defy gravity and fly characterized his jumping, and his natural, flowing movement more than made up for the lack of dance instruction in his training. Fans and rivals alike admired his powerful
stroking, quickness, and a finesse and precision in timing jumps that belied his height, as well as his inventiveness, sportsmanship, and musicality.

During the 1980's, Boitano was greatly influenced by two people: Linda Leaver, the coach with whom he trained for the larger part of the decade, and Sandra Bezic, his choreographer. His work with Bezic began in 1987, and it changed the course of his career. Boitano engaged Bezic after coming in second at the 1987 World Championships, losing his title to Canadian Brian Orser. Boitano had relied heavily on his overwhelming technical skill, but he realized that to remain recognized as the world's best figure skater he would have to improve the quality of his artistic presentation. The programs Bezic choreographed for him focused on masculine and heroic characters, emphasizing strong lines and grand moves that won him audiences' acceptance and judges' approval.

Boitano's first major competition showcasing Bezic's choreography was the 1987 Skate Canada event in Calgary, Alberta. The event took place in the same venue as the upcoming 1988 Winter Olympics, and Leaver insisted that Boitano become familiar with every inch of the rink, hoping to give him a head start for the Olympic competition. His skating impressed the judges, but Boitano still finished second behind Orser, setting up a confrontation at the Olympics that was publicized as the "Battle of the Brians." Boitano bested Orser in the Olympics, narrowly winning the portion of the event for which artistry is most important, the free skating competition, and he went on to reclaim his world champion title later that year.

After the Olympics, Boitano became artistic director of a series of skating tours, two of which were produced during the last two years of the decade. His collaboration with other world champions resulted in skating shows that broke the tradition of ice shows featuring a series of skaters performing unrelated programs. Instead, it began a tradition of shows centered around a theme that gave them coherence and elegance, raising their artistic caliber. Boitano's portrayal at skating competitions of heroic characters such as pioneers and soldiers culminated with his performance as Don José opposite Katarina Witt in Carmen on Ice (1989), for which he won an Emmy Award.

Impact Boitano was an important role model for a generation of young skaters. He also helped expand
the sport by performing feats that few earlier skaters had thought to be technically feasible. Beyond these accomplishments, Boitano was a leader and a voice for amateur skaters, who lived in constant fear of losing their Olympic eligibility should they earn money by skating. For them, he won the right to remuneration and in so doing gained respect for them as professionals.

Brian Boitano leaps into the air during the free skating competition at the 1988 Winter Olympics. (AP/Wide World Photos)

Further Reading
Boitano, Brian, and Suzanne Harper. Boitano's Edge: Inside the Real World of Figure Skating. New York: Simon & Schuster, 1997.
Boo, Michael. "1980's Personalities." In The History of Figure Skating. New York: William Morrow, 1989.
Sylvia P. Baeza
See also Olympic Games of 1988; Sports.

■ Bon Jovi
Identification American pop-metal band

Bon Jovi combined the rhythms and guitar distortion of hard rock, the power and rebellion of heavy metal, and the melodic interest and romantic lyrics of pop to become one of America's leading mainstream rock bands by the end of the decade.

During the late 1970's, while a high school student in Sayreville, New Jersey, singer John Bongiovi performed with several bands in the region, the best of which was Atlantic City Expressway, a cover band that opened for numerous artists, notably Bruce Springsteen. Later, while working as a janitor at New York's Power Station recording studio, Bongiovi made demo tapes, one of which, "Runaway," became a hit on local radio and was chosen for a compilation recording of new artists. In 1983, Bongiovi (who had taken the professional name Jon Bon Jovi) signed with Mercury Records and formed the heavy metal band Bon Jovi with his boyhood friend David Bryan (keyboards), as well as Richie Sambora (lead guitar), Alec John Such (bass guitar), and Tico Torres (drums).

The band's first album, Bon Jovi (1984), went gold in the United States (meaning it sold more than 500,000 units), and "Runaway" became a top-forty hit. The following year, 7800° Fahrenheit (named for the temperature of an erupting volcano) appeared and became Bon Jovi's first Top 40 album.

Although Bon Jovi's reputation was based on the group's heavy metal sound and image, the band decided to change direction in 1986. Collaborating with songwriter Desmond Child, Bon Jovi abandoned the tough intensity of heavy metal in favor of softer, more melodic ballads, and the group began to sport jeans instead of leather. "You Give Love a Bad Name" and "Livin' on a Prayer" became hits in the United States when they appeared on the group's next album, Slippery When Wet (1986), and "Livin' on a Prayer" won the award for Best Stage Performance at the fourth annual MTV Video Music Awards the following year. The album itself held the number-one spot on the Billboard 200 chart for eight weeks and sold more than 12 million copies worldwide.

Bon Jovi's next album, New Jersey (1988), continued the style of "pop-metal" and included two more hits, "Bad Medicine" and "Born to Be My Baby," as well as the ballad "Blood on Blood," a memoir of Jon Bon Jovi's adolescence. Both Slippery When Wet and New Jersey were huge sellers, establishing Bon
Jovi as one of the premier rock bands of the day. The group won the American Music Award for Favorite Pop/Rock Band in 1988. It continued to tour extensively, and in 1989 it appeared at the Moscow Music Peace Festival in Lenin Stadium.

Impact Bon Jovi achieved enormous success during the 1980's by mixing heavy metal with the softer, melodic interest of pop. The group introduced metal to a wider audience, including women, making it one of the most popular musical subgenres of the decade.

Further Reading
Raymond, John. "Bon Jovi at Memorial Coliseum, Portland, Oregon, May 8, 1989." In The Show I'll Never Forget: Fifty Writers Relive Their Most Memorable Concertgoing Experience, edited by Sean Manning. Cambridge, Mass.: Da Capo Press, 2007.
Walser, Robert. Running with the Devil: Power, Gender and Madness in Heavy Metal Music. Hanover, N.H.: Wesleyan University Press, 1993.
Weinstein, Deena. Heavy Metal: A Cultural Sociology. New York: Lexington Books, 1991.
Mary A. Wischusen
See also Heavy metal; MTV; Music; Music videos; Pop music; Springsteen, Bruce.

■ Bonfire of the Vanities, The
Identification Best-selling novel
Author Tom Wolfe (b. 1931)
Date Serialized 1984-1985; novel published 1987

The first novel by controversial journalist Wolfe, The Bonfire of the Vanities demonstrated its author's reportorial skill, irreverent social insight, and flamboyant style as it presented a panorama of New York in the 1980's. The novel portrayed an ethnically divided New York where love of status triumphs over decency, the rich and the poor seldom meet, a demagogue manipulates the news media, and most politicians care more about votes than about justice.

Tom Wolfe. (Courtesy, Author)

Tom Wolfe became famous before the 1980's for his innovative nonfiction, but during the early 1980's he began work on a book he had long hoped to write. This novel about New York City would bring together characters from diverse social levels. To achieve such juxtapositions convincingly, he decided that he would need to conduct serious research, emulating realist novelists such as Émile Zola and Sinclair Lewis. To combat writer's block, Wolfe arranged to write in installments, each of which would be published in Rolling Stone. His daring plan worked, although he became dissatisfied with aspects of the serialization as it appeared from 1984 to 1985. When revising the novel for publication as a book in 1987, he made significant changes, most notably in transforming the protagonist, Sherman McCoy, from a writer to a Wall Street bond trader. The novel's McCoy became an expensively dressed thirty-eight-year-old with a big income but bigger debts, a questioning daughter, a spendthrift wife, and a voluptuous mistress.

In The Bonfire of the Vanities, McCoy's lifestyle and sense of power begin their decline when he and his mistress find themselves lost at night in the Bronx. They encounter two young, seemingly threatening black men, and as a result of the encounter McCoy becomes the defendant in a hit-and-run trial. McCoy, a rich, white suspect in a crime with a black victim, becomes a pawn for many other characters: District Attorney Abe Weiss, for example, wants to convict him to ensure his reelection, while Assistant
District Attorney Larry Kramer sees a conviction as a means to increase his income, as well as his status with his boss and with the woman he wants to make his mistress. Meanwhile, the shrewd Reverend Bacon of Harlem stirs up journalists, including tabloid reporter Peter Fallow, in order to further his own political agenda. As a result, an initially obscure accident captures the public's attention, as it is made by journalists and politicians to stand for racial and class divisions in the city. As the novel ends, the tough, impartial Judge Myron Kovitsky, a Jewish American like Weiss and Kramer, has lost his reelection campaign, and McCoy has lost the status he once thought essential. In its place, he has gained the shrewdness and ferocity of a beast fighting for survival in the jungle of the United States' biggest city.

Impact The title of Wolfe's novel alluded to Girolamo Savonarola's famous bonfire in Florence, Italy, on February 7, 1497, at which thousands of objects were burned as occasions of sin. As he suggested with this title, Wolfe used the novel figuratively to burn away the vanity of status in 1980's American culture. While the book offended one group after another, it detailed the importance of sexual, monetary, and political power to that culture, as well as the consequences of both the obsession with obtaining power and the fear of losing it.

Further Reading
Bloom, Harold, ed. Tom Wolfe. Broomall, Pa.: Chelsea House, 2001.
Ragen, Brian Abel. Tom Wolfe: A Critical Companion. Westport, Conn.: Greenwood Press, 2002.
Shomette, Doug, ed. The Critical Response to Tom Wolfe. Westport, Conn.: Greenwood Press, 1992.
Victor Lindsey
See also African Americans; Book publishing; Do the Right Thing; Jewish Americans; Journalism; Literature in the United States; Poverty; Racial discrimination; Wall Street.
■ Bonin, William
Identification Serial killer and rapist known as the Freeway Killer
Born January 8, 1947; Downey, California
Died February 23, 1996; San Quentin, California

William Bonin's crime spree terrorized residents of Southern California during the early 1980's. Parents feared for the safety of their children, and schools warned students of the dangers of hitchhiking.

William Bonin was a notorious sex offender and serial killer in the early 1980's. His earliest reported offenses occurred during his early teens, leading him to be sentenced to several terms in correctional facilities. His crimes escalated to the brutal rape and murder of boys and young men, aged twelve to nineteen. With one of his four accomplices (George Matthew Miley, James Michael Munro, William Ray Pugh, and Vernon Robert Butts), Bonin cruised the freeways of Southern California looking for male victims, most of whom were hitchhikers or prostitutes. Victims were raped and killed, usually by stabbing or strangulation. Bonin was dubbed the Freeway Killer because of his preference for finding victims and dumping their bodies along the freeways of Los Angeles and Orange Counties.

Bonin's crime spree ended on June 11, 1980, when he was arrested while in the process of sodomizing a teenage boy in the back of his van. One of his former accomplices, Pugh, had identified him to the police in order to avoid being charged with another crime. Psychiatrists who examined Bonin concluded that his tortured past, including his own probable sexual abuse by older boys, contributed to his crimes. Bonin eventually confessed to the rape and murder of more than twenty-one victims between May 28, 1979, and June 2, 1980. Because of a lack of evidence, he was charged with only fourteen murders, ten in Los Angeles County and four in Orange County. On March 13, 1982, a remorseless Bonin was sentenced to death. Bonin unsuccessfully appealed the court's decision on multiple occasions.

Impact Bonin's murders of more than twenty-one young men over a one-year period instilled fear in the public, causing parents and schools to take steps aimed at protecting vulnerable children. The murders and the profile of Bonin that emerged during his trial also contributed to a growing fascination
with serial killers, who seemed to surface with more regularity as the decade progressed. Subsequent Events When he was executed at the San Quentin penitentiary on February 23, 1996, Bonin became the first person in California to be executed using lethal injection. The method was touted at the time as more humane than the gas chamber—California’s previous method of execution—but it would later become controversial as unnecessarily painful for those being executed. Further Reading
Hickey, Eric W. Serial Murderers and Their Victims. 4th ed. Belmont, Calif.: Thomson Higher Education, 2006.
Levin, Jack, and James Alan Fox, eds. Mass Murder: America's Growing Menace. New York: Plenum Press, 1985.
Leyton, Elliot. "Towards an Historical Sociology of Multiple Murder." In Serial Murder: Modern Scientific Perspectives, edited by Elliot Leyton. Burlington, Vt.: Ashgate, 2000.
Jocelyn M. Brineman and Richard D. McAnulty
See also
Atlanta child murders; Central Park jogger case; Homelessness; Night Stalker case; Supreme Court decisions.
■ Book publishing
Definition Professional publication, marketing, and distribution of fiction and nonfiction books
The 1980's represented a period of change in the book publishing industry. Corporate acquisitions, new computer systems, and the proliferation of bookseller chains transformed the processing and distribution practices of the publishing companies.

Book publishing in the United States experienced its golden age after World War II (1939-1945). The U.S. economy boomed in the postwar years, and American culture experienced intellectual growth as well, thanks in part to the G.I. Bill, which helped fund education for veterans. A better-educated populace with more money to spend created new markets for authors and literary enterprises. The situation began to change, however, in the 1970's and 1980's: The industry began suffering from bureaucratization and from a blockbuster mentality similar
to the one developing in Hollywood at the same time. Rather than make a modest profit on each of many titles, publishers began to focus the bulk of their marketing efforts on a few best sellers each season, leaving many books to fall through the cracks, unable to find their audience.

Business Mergers, the Best Seller, and Bookstore Chains
Mergers and acquisitions in the industry
continued from the 1970's. Penguin, a subsidiary of the Pearson conglomerate, expanded its publishing markets, whereas Simon & Schuster expanded into textbooks and software. In 1986, Harcourt Brace Jovanovich bought the educational and publishing rights owned by the Columbia Broadcasting System (CBS), which already included Holt, Rinehart, and Winston.

In order to create a broad appeal for the general public, publishing houses began to rely on fiction or nonfiction titles that were massive best sellers, in such genres as true crime, celebrity scandals, romance, Westerns, and science fiction. The result was that work of lesser quality (but that was easier to categorize and market to a mass audience) took precedence over more original works that were difficult to pigeonhole. These mass-market books produced the profits that editors used to underwrite riskier works, especially those produced by new authors.

Paperbacks, popularized in the 1950's, began to dominate sales in the industry, because they were cheaper to produce. However, by the mid-1980's, both hard- and soft-cover books doubled in price, making consumers more selective in their title purchases. Hardcover books' share of sales declined from 54 percent in 1978 to 43 percent in 1983. This saturation of titles caused a glut in the market, and the excess overhead, in the form of unsold books, caused the industry to strengthen its reliance on backlists. Booksellers had always been able to return unsold copies for full credit, and the percentage rates of returns kept increasing. Moreover, retail bookstore chains like Barnes and Noble and Borders Books had the power to limit pricing on titles, because they sold such a high percentage of the industry's products. Their influence on pricing helped consumers, but it further reduced publishers' profits.

The industry had regularly given extravagant cash advances to its most prominent authors—who provided the blockbuster best sellers upon which publishers were increasingly dependent. These authors
also commanded premium royalty payments. Not every work by a famous author was successful, however, so publishers' investments in big names did not always pay off.

Academic publishers also faced numerous challenges in the 1980's. University presses confronted increasing budget constraints and tighter management, which curbed the production of scholarly monographs. Some university presses formed consortiums or sought external funding, but academic sales declined despite aggressive marketing tactics.

Impact The changing economic landscape of the publishing industry during the 1980's forced publishers to adopt new methods. New technologies and desktop-publishing systems allowed typesetting operations and editing to become more cost-efficient. The availability of electronic formats revolutionized production, distribution, and promotion and became part of a vast reorganization throughout the industry. These changes could not shield publishers completely, however, from the deleterious economic effects of the decade. Corporate restructuring led to the formation of numerous subdivisions within the large publishing houses that were often left without clear directions or goals. Decreasing salaries also deterred recent graduates from entering the profession; instead, they opted for more lucrative positions in other communications fields.

Subsequent Events The advent of the Internet dramatically affected the publishing profession. Beginning in 1995, customers could order books online from Amazon, which shipped many titles directly from distributors' warehouses. The used-book trade also exploded, as search engines and centralized Web sites made it much easier to find a used copy of a specific title. Companies began as well to experiment with alternative formats such as e-books and CD-ROMs in order to boost sales. However, issues from the previous decade continued to plague the industry. Publishing mergers were still common, and the expansion of bookstore chains throughout the country took away business from smaller retailers in cities and malls. Thus, while it became possible for almost anyone to self-publish online, the competition for shelf space in brick-and-mortar stores—or for a prominent position on Amazon's virtual shelves—became more intense than ever.
Further Reading
Allen, Walter C., ed. Library Trends 33 (Fall, 1984). The journal devotes its entire issue, "The Quality of Trade Book Publishing in the 1980's," to the trends in the book publishing industry. The authors of the special issue's eleven articles address the role of the library, authors, editors, agents, marketing, and promotion in the publishing world.
Coser, Lewis A., Charles Kadushin, and Walter W. Powell. Books: The Culture and Commerce of Publishing. New York: Basic Books, 1982. The authors apply sociological analysis and organizational theory to the publishing industry.
Curtis, Richard. Beyond the Bestseller: A Literary Agent Takes You Inside the Book Business. New York: New American Library, 1989. A New York City literary agent offers insights into the trends and economic aspects of the publishing profession.
Davis, Kenneth C. Two-Bit Culture: The Paperbacking of America. Boston: Houghton Mifflin, 1984. Davis, a journalist, analyzes the social, economic, educational, and literary impact of the mass marketing of paperbacks in the United States.
Geiser, Elizabeth A., Arnold Dolin, and Gladys Topkis, eds. The Business of Book Publishing: Papers by Practitioners. Boulder, Colo.: Westview Press, 1984. Provides a description of the publishing process for individuals who want to enter the profession.
Potter, Clarkson N. Who Does What and Why in Book Publishing: Writers, Editors, and Money Men. New York: Birch Lane Press, 1990. Describes the book-publishing business for aspiring authors from a book's initial conception to the time it reaches a consumer.
Tebbel, John. Between Covers: The Rise and Transformation of Book Publishing in America. New York: Oxford University Press, 1987. An abridgement of Tebbel's four-volume History of Book Publishing in America (1972-1981). Tebbel divides the work into five chronological sections, delving into the aspects and trends in American publishing from 1700 to 1985.
Gayla Koerting
See also
Advertising; Apple Computer; Business and the economy in the United States; Children’s literature; Consumerism.
■ Bork, Robert H. Identification
Federal appellate judge and unsuccessful nominee for associate justice of the Supreme Court of the United States Born March 1, 1927; Pittsburgh, Pennsylvania Robert Bork's failed nomination to the U.S. Supreme Court in 1987 resulted in increased awareness among the American people of the judicial selection and decision-making processes. It also had lasting effects on the strategies employed by future presidents and nominees to increase the likelihood that their nominations would succeed. On July 1, 1987, following the retirement of Associate Justice Lewis F. Powell, President Ronald Reagan nominated Robert H. Bork to be the next associate justice of the Supreme Court of the United States. Bork's confirmation hearings proved unusually lengthy and contentious.
In evaluating his fitness for the position, the American Bar Association (ABA) split on his nomination to the Supreme Court, with four members of the fifteen-person committee declaring him "not qualified" because of his "judicial temperament." Additionally, 1,925 law professors signed letters opposing the nomination, and numerous influential interest groups—including the American Civil Liberties Union (ACLU); the American Federation of Labor-Congress of Industrial Organizations (AFL-CIO); the Leadership Conference on Civil Rights; People for the American Way; Planned Parenthood; the National Abortion Rights Action League (NARAL); and the National Association for the Advancement of Colored People (NAACP)—mobilized grassroots opposition. In September, 1987, the Senate Committee on the Judiciary rejected Bork's nomination by a vote of nine to five.
Judge Robert H. Bork, center, listens as former president Gerald Ford, left, introduces him at his U.S. Senate Judiciary Committee confirmation hearings on September 15, 1987. (AP/Wide World Photos)
Although acknowledging Bork's legal credentials, the committee was especially concerned with his perceived hostility toward the civil rights of blacks, women, and other minorities; his rejection of a general constitutional right to privacy, especially in light of its implications for reproductive and homosexual rights; and his limited interpretation of the constitutional protection of the freedom of speech. On October 23, 1987, the full Senate defeated Bork's nomination by a vote of fifty-eight to forty-two. In addition to sustained opposition from civil rights advocates and his conservative ideology, a number of other factors coalesced in Bork's defeat: Bork had been the solicitor general of the United States during the Watergate scandal. In October, 1973, President Richard M. Nixon ordered Attorney General Elliot Richardson to fire Archibald Cox, the special prosecutor who was investigating Nixon's White House. Richardson resigned in protest rather than carry out the order. Deputy Attorney General William Ruckelshaus became acting attorney general, and Ruckelshaus also resigned rather than fire Cox. Bork then became acting attorney general, and, unlike Richardson and Ruckelshaus, he followed the president's order and fired Cox. These resignations and Cox's firing became known as the "Saturday Night Massacre," and Nixon's opponents still blamed Bork for his role in the event. In addition, President Reagan was in the last two years of his second term and had been politically weakened by the Iran-Contra scandal, reducing his ability to persuade senators to support his nominee. The Democrats were in control of the Senate, having become the majority party in January, 1987, and they were particularly worried by Bork's insistence that originalism—the philosophy that judges should interpret the Constitution according to the original intentions of its framers—was the only legitimate approach. The Democrats and the interest groups that opposed Bork believed that the Supreme Court was already trending conservative, a trend that would have been magnified had the conservative Bork replaced the "swing" voter Powell. Impact On February 8, 1988, Bork resigned his appellate judgeship. The seat to which he had been nominated on the Supreme Court was occupied by Anthony Kennedy, who would eventually become an even more important "swing" voter than Powell had been.
Bork’s confirmation hearings resulted in criticism of the perceived politicization of the judicial selection process. They revealed that, within the context of polarizing politics, any substantive statement by a nominee is potentially controversial enough to become fodder to be used against the nominee. This situation had a lasting effect on the nomination process, as many subsequent nominees simply refused to answer the Senate’s questions about their judicial views, stating that it would be inappropriate to discuss in advance views that would affect their future decisions on the bench. As a result of the politicization of the process, numerous bipartisan committees and task forces offered specific recommendations to limit the partisan politics in the confirmation process. Further Reading
Bork, Robert H. "Original Intent and the Constitution." Humanities 7 (February, 1986): 22-26.
_______. The Tempting of America: The Political Seduction of the Law. New York: The Free Press, 1990.
Bronner, Ethan. Battle for Justice: How the Bork Nomination Shook America. New York: W. W. Norton, 1989.
Jordan, Barbara. Barbara Jordan: Speaking the Truth with Eloquent Thunder. Edited by Max Sherman. Austin: University of Texas Press, 2007.
Shaffer, Ralph E., ed. The Bork Hearings: Highlights from the Most Controversial Judicial Confirmation Battle in U.S. History. Princeton, N.J.: Markus Wiener, 2005.
Richard A. Glenn
See also
Abortion; Homosexuality and gay rights; Iran-Contra affair; Meese, Edwin, III; Pornography; Reagan, Ronald; Supreme Court decisions.
■ Bourassa, Robert Identification Premier of Quebec, 1985-1994 Born July 14, 1933; Montreal, Quebec Died October 2, 1996; Montreal, Quebec
During Bourassa’s second term as premier of Quebec, he led the Canadian province out of the turbulence of the Lévesque era and into a period of economic prosperity, only to have it founder amid a rising constitutional controversy. At the beginning of the 1980’s, few expected that Robert Bourassa would ever again become premier of Quebec. Not only had his first term ended in 1976
amid corruption scandals, but also the Parti Québécois, led by René Lévesque, seemed poised to lead the province to semi-independence. Bourassa was thought to be politically finished, but the Parti Québécois lost the referendum on sovereignty in May, 1980, Lévesque retired, and Bourassa, after biding his time in academic positions abroad, returned to leadership of the Liberal Party in 1983. In 1985, the Liberals won a majority in the National Assembly, defeating the Parti Québécois by a margin of more than 15 percent of the popular vote, capturing 99 of the assembly's 122 seats, and returning Bourassa to the premiership. Bourassa's second term was a boom time for Quebec economically, with retail sales and natural resources doing particularly well. His particular project, selling hydroelectric power from James Bay in the north of the province to American consumers frustrated by high energy costs, had mixed success in the era of seemingly renewed energy availability. The James Bay project and its environmental effects angered the indigenous Cree people who lived in the area. Bourassa helped Quebec join the global economy, achieving what commentators François Benoit and Philippe Chauveau called "l'acceptation globale," or global acceptance. The cerebral, austere Bourassa, known for his "passionless politics," proved surprisingly popular, despite lacking the charisma of some of his predecessors. He fit the mood of the pragmatic, business-minded Quebec of the 1980's. Bourassa's party easily won reelection in 1989 by an even more impressive margin than they had enjoyed four years previously. Problems arose when renewed constitutional negotiations commenced in the late 1980's with the goal of normalizing the still-unresolved relationship between Quebec and the rest of Canada. Bourassa, although a staunch anti-secessionist, was enough of a Québécois patriot to insist on recognition of the idea of Quebec as a "distinct society." Anglophone Canadians looked upon even this recognition as an infringement of the Canadian constitution's Charter of Rights and Freedoms (passed as part of the Canada Act, 1982). This feeling was intensified by Bourassa's moderate but nonetheless decided efforts to make the French language the exclusive vehicle of communication in Quebec. Impact Bourassa's premiership ended in a series of largely unsuccessful negotiations with his fellow
provincial premiers over the proposed Meech Lake constitutional accord. Despite personal goodwill on the part of Bourassa, he refused to compromise on what he saw as good for his province simply to achieve a resolution. The battle over recognition for Quebec cost Bourassa and the Liberals their popularity, and the Parti Québécois regained control of the National Assembly less than a year after Bourassa's resignation as premier.
Nicholas Birns
Further Reading
Bourassa, Robert. Power from the North. Scarborough, Ont.: Prentice-Hall Canada, 1985.
Lisée, Jean-François. The Trickster: Robert Bourassa and Quebecers. Toronto: Lorimer, 1994.
MacDonald, L. Ian. From Bourassa to Bourassa: Wilderness to Restoration. Montreal: McGill-Queen's University Press, 2002.
See also
Lévesque, René; Meech Lake Accord; Quebec English sign ban; Quebec referendum of 1980.
■ Bowers v. Hardwick Identification Supreme Court decision Date Decided on June 30, 1986
The constitutionality of a Georgia law outlawing sodomy was upheld, slowing the progress of gay rights by allowing states to prosecute gay male sexual behavior as a felony. The decision was one of many that sought to define the limits of an implicit constitutional right to privacy. Michael Hardwick was a gay man in Atlanta, Georgia, who was targeted by a police officer for harassment. In 1982, an unknowing houseguest let the officer into Hardwick’s home. The officer went to the bedroom, where Hardwick was engaged in oral sex with his partner. The men were arrested on the charge of sodomy. Charges were later dropped, but Hardwick brought the case forward with the purpose of having the sodomy law declared unconstitutional. Hardwick claimed that Georgia’s sodomy law violated a constitutional right to privacy that was implicit in the Ninth Amendment and the due process clause of the Fourteenth Amendment to the U.S. Constitution. Previous decisions, notably Griswold v.
Connecticut (1965) and Roe v. Wade (1973), had derived a right to privacy from the due process clause. The U.S. Court of Appeals for the Eleventh Circuit had agreed and had invalidated the Georgia law. The Supreme Court overturned that decision by a vote of five to four. The majority opinion, written by Justice Byron White, placed definite limits upon such a right. White pointedly and repeatedly referred to a “right [of] homosexuals to engage in sodomy,” in order to differentiate the specifics of the case from any general right to privacy. He insisted that the lower court, in upholding Hardwick’s right, had extended the right to privacy beyond any reasonable constitutional interpretation. He wrote: There should be . . . great resistance to expand the substantive reach of [the due process] Clauses, particularly if it requires redefining the category of rights deemed to be fundamental. Otherwise, the Judiciary necessarily takes to itself further authority to govern the country without express constitutional authority. The claimed right pressed on us today falls far short of overcoming this resistance. Impact The political impact of Bowers v. Hardwick was enormous. The case was decided during a time of conservative backlash against the sexual revolution and the women’s rights and lesbian, gay, bisexual, and transgendered (LGBT) rights movements. Anti-abortion groups were sharply criticizing the practical outcome reached in Roe v. Wade, another case decided on the basis of a constitutional right to privacy. This right to privacy, which the Court had determined to exist implicitly in the Bill of Rights and the Fourteenth Amendment, was also being questioned by conservative scholars, who were opposed to what they saw as legislation from the judicial bench. The Bowers Court slowed acceptance of privacy as a constitutional right. Meanwhile, the AIDS pandemic caused a great deal of public panic and, as a result, had fueled a great deal of homophobia, as well as sex phobia. As a result of Bowers, the brakes were slammed on the progressive sex-positive policies begun during the 1960’s. The effect of Bowers, ironically, had little to do with the issue of criminal sodomy. Sodomy laws were seldom enforced against private consensual conduct in the years before and after the Bowers decision. However, by allowing states to criminalize gay sexual behavior, the decision served the foes of gay and lesbian anti-discrimination laws, hate-crimes laws, and
later same-sex marriage laws. It was argued that there ought not to be equal protection for individuals to engage in criminal sexual behavior. Subsequent Events
After Bowers was decided, many states repealed or overturned their own sodomy laws. The Georgia Supreme Court ruled that the state’s sodomy law violated the state constitution in Powell v. State (1998). At the national level, Bowers was effectively overturned by the Supreme Court in 2003 in Lawrence v. Texas.
Further Reading
Harvard Law Review Association. Sexual Orientation and the Law. Cambridge, Mass.: Harvard University Press, 1989.
Leonard, Arthur S. "Equal Protection and Lesbian and Gay Rights." In A Queer World: The Center for Lesbian and Gay Studies Reader, edited by Martin Duberman. New York: New York University Press, 1997.
Mohr, Richard D. Gay Ideas: Outing and Other Controversies. Boston: Beacon Press, 1992.
Ringer, Jeffrey R., ed. Queer Words, Queer Images. New York: New York University Press, 1994.
Thomas, Kendall. "Corpus Juris (Hetero)Sexualis: Doctrine, Discourse, and Desire in Bowers v. Hardwick." In A Queer World: The Center for Lesbian and Gay Studies Reader, edited by Martin Duberman. New York: New York University Press, 1997.
Daniel-Raymond Nadon
See also
ACT UP; AIDS epidemic; AIDS Memorial Quilt; Homosexuality and gay rights; Kiss of the Spider Woman; Military ban on homosexuals; Supreme Court decisions; Toronto bathhouse raids of 1981.
■ Boxing Definition
Professional prizefighting
The 1980’s produced an unusually large number of great fighters and saw many memorable fights. The period was also marked by a continued proliferation of ring-governing bodies, increased competition among boxing promoters, and several high-profile ring deaths that led to rule changes aimed at protecting fighters. Among the top boxers of the 1980’s were heavyweight Larry Holmes, who dominated the heavyweight class during the first half of the decade, and
Mike Tyson, who dominated it during the second. Top fighters in the lower weight classes included Sugar Ray Leonard, Wilfred Benitez, Roberto Duran, Thomas Hearns, Marvin Hagler, Alexis Arguello, and Salvador Sánchez. Many of these lighter-weight fighters moved up in weight class during the decade in order to fight the other top fighters of the period, creating an exceptionally high number of exciting bouts. The Great Fighters and the Great Fights
Two heavyweight legends of the previous era in boxing, Muhammad Ali and Joe Frazier, both fought their last fights in December of 1981, as the torch was passed to the new generation. The immediate recipient of that torch was Larry Holmes, who opened the decade as the dominant force in the heavyweight division. After winning the World Heavyweight Championship in 1978, Holmes defended the title sixteen times between 1980 and 1985, when he finally lost it by decision to Michael Spinks. After a brief period of
multiple champions, the youthful Mike Tyson exploded onto the scene, unifying the title in a series of bouts in 1986 and 1987 and establishing himself as the top heavyweight of the second half of the decade. Sugar Ray Leonard, who won his first professional title as a welterweight in 1979, fought many of the top fighters of the lower weight classes during the 1980’s. After winning the World Boxing Council (WBC) welterweight title from Wilfred Benitez in 1979, he lost the title to former lightweight champion Roberto Duran in June of the following year but regained it in a rematch in November. In 1981, Leonard fought undefeated knockout artist Thomas Hearns, winning by an exciting fourteenth round technical knockout (TKO) and adding the World Boxing Association (WBA) welterweight title to his earlier title. Although sidetracked by eye problems during the middle years of the decade, Leonard returned to the ring in April of 1987, defeating middleweight champion Hagler in a controversial decision.
Sugar Ray Leonard, right, delivers a blow to Roberto Duran during their first welterweight title fight in June, 1980. Duran won the fight by decision, but Leonard won their rematch in November. (AP/Wide World Photos)
Benitez, Duran, and Hearns went on to win additional titles and to fight in other top bouts of the decade. Benitez won the WBC junior middleweight title in 1981 and defended it successfully against Duran the following year, before losing it later in the same year to Hearns. Duran won the WBA version of the same title two years later, as well as the WBC middleweight title in 1989, and fought memorable losing efforts against both Hagler (1983) and Hearns (1984). Hearns also held titles in several weight classes during the decade, in addition to engaging in a historic losing effort against Hagler for the middleweight title in 1985, the first round of which is considered to be one of the most exciting rounds in the history of the sport. Only Hagler among the fighters mentioned here fought solely in his primary weight class during the decade. The great former featherweight and super featherweight champion of the 1970's, Alexis Arguello, also moved up in class during the 1980's, capturing the WBC lightweight title in 1981. He then moved up yet another weight class to challenge WBA junior welterweight title-holder Aaron Pryor, losing twice by knockout in epic struggles in 1982 and 1983. One of the most promising young fighters of the era, featherweight Salvador Sánchez, exploded onto the boxing scene in 1980, winning the WBC featherweight title from Danny Lopez by a fourteenth-round knockout. He then defended the title twice, knocking out two of the top fighters in the division, Wilfredo Gomez and Azumah Nelson, before dying in a car crash in August of 1982. Other popular fighters of the period included flashy boxer and ring personality Hector Camacho, who held titles in three weight classes—junior lightweight, lightweight, and junior welterweight—during the decade; lightweight champions Ray Mancini and Edwin Rosario; and Irish boxer Barry McGuigan, who held the WBA featherweight title from 1985 to 1986. Organizational Dimensions On the organizational side of the sport, the 1980's saw the arrival of two new governing bodies, the International Boxing Federation (IBF) in 1983 and the World Boxing Organization (WBO) in 1989. These served, along with the already existing World Boxing Council (WBC) and the World Boxing Association (WBA), to bring the total number of such organizations to four and to
further increase (and fragment) the titles of the various weight divisions. On a more positive note, new fight promoters—among them Bob Arum, Butch Lewis, and Murad Muhammad—arose to challenge, to some degree, the supremacy of Don King. Finally, the deaths early in the decade of bantamweight Johnny Owen (1980) and lightweight Duk Koo Kim (1982) from injuries suffered in the ring led to new rules protecting the safety of fighters, most notably the shift from fifteen to twelve rounds as the length for championship fights. Impact The continuing proliferation of governing bodies in professional boxing served to create numerous titles at each weight class, diluting in the process the significance of each title. However, the fighters who held these titles were generally of high quality, and they fought each other quite often during the decade, resulting in many exciting matches to entertain existing fans and win new ones. Thus, the 1980's was a rich era in boxing history, particularly in the lower weight divisions.
Further Reading
Giudice, Christian. Hands of Stone: The Life and Legend of Roberto Duran. Wrea Green, Lancashire, England: Milo Books, 2006. Biography of one of the great fighters of the era, who fought the best in five weight classes, including Leonard, Benitez, Hearns, and Hagler.
Heller, Peter. Bad Intentions: The Mike Tyson Story. New York: New American Library, 1989. Early biography of Tyson, chronicling his early life and rise to dominance in the heavyweight division in the 1980's.
McIlvanney, Hugh. The Hardest Game: McIlvanney on Boxing. Updated ed. New York: Contemporary Books, 2001. Contains articles by boxing writer McIlvanney on many of the great fights and fighters of the decade.
Myler, Patrick. A Century of Boxing Greats: Inside the Ring with the Hundred Best Boxers. London: Robson Books, 1999. Contains short biographies and ring records of most of the boxers mentioned.
Scott Wright
See also Holmes, Larry; Leonard, Sugar Ray; Raging Bull; Sports; Tyson, Mike.
■ Boy George and Culture Club Identification
British pop band
For a brief period in the 1980's, Culture Club was one of the most popular bands on the international pop scene, and its controversial lead singer, Boy George, was among the era's most recognizable faces. The band Culture Club made a big splash on the pop music charts and in music videos during the middle years of the 1980's. Band members included bassist Mikey Craig, guitarist and keyboard player Roy Hay, drummer Jon Moss, and singer Boy George (born George Alan O'Dowd). Between 1983 and 1986, Culture Club had a string of top-ten hits in the United States, including "Do You Really Want to Hurt Me?" (1983), "I'll Tumble 4 Ya" (1983), "Karma Chameleon" (1984)—which reached number one in both the United Kingdom and the United States—and "Move Away" (1986). The band had a charming, catchy, pop sound sometimes characterized as "blue-eyed soul," referring to music written and performed by white musicians but influenced by such black musical styles as soul, rhythm and blues, and reggae. Perhaps even more than their music, though, the band was known for the flamboyantly androgynous look and gender-bending antics of its charismatic front man, Boy George, whose stage persona wore outrageous costumes, heavy pancake makeup, lipstick, and dramatic eyeliner. George was not the first pop musician to play with gender expectations in this way. His look and attitude were influenced by such singers of the 1970's as "glam rock" star David Bowie and the theatrical lead singer of Queen, Freddie Mercury. Boy George, however, took the look further and
made it more visible, as the new television channel MTV provided an outlet for Culture Club's distinctive music videos. During the band's heyday, there was a good deal of public speculation about Boy George's sexuality, though the singer himself generally remained coy about the issue. In later years, he discussed his homosexuality more openly.
Culture Club in 1984. From left: Jon Moss, Roy Hay, Boy George, and Mikey Craig. (Hulton Archive/Getty Images)
By 1985, Boy George's heavy drug use had begun to affect the band's ability to function. He became addicted, at various times, to cocaine, heroin, and prescription narcotics, and in 1986 he was arrested for possession of cannabis. In 1987, he released his first solo album, Sold, which included several songs that were hits in Britain. Without the other members of Culture Club, however, Boy George failed to achieve major popularity with U.S. audiences. Impact Few critics would claim that Boy George and Culture Club had a major influence on later pop music. Individual band members continued to work after the 1980's, and the band even got together for reunion tours and an album in the late 1990's. Their moment as a significant cultural force, though, was largely confined to the middle years of the 1980's, a fact that guaranteed their later association with nostalgia for the decade.
Further Reading
Boy George and Spencer Bright. Take It Like a Man: The Autobiography of Boy George. New York: HarperCollins, 1995.
Rimmer, David. Like Punk Never Happened: Culture Club and the New Pop. London: Faber and Faber, 1986.
Robbins, Wayne. Culture Club. New York: Ballantine, 1984.
Janet E. Gardner
See also Androgyny; Homosexuality and gay rights; MTV; Music; Music videos; Pop music.
■ Boyle, T. Coraghessan Identification
American novelist and short-story writer Born December 2, 1948; Peekskill, New York Boyle published three novels and two collections of short stories during the 1980’s, establishing himself as one of the most distinctive voices in American fiction.
As a teenager, Thomas John Boyle adopted his mother’s maiden name, changing his name to T. Coraghessan Boyle. The graduate of the State University of New York at Potsdam entered the Writers’ Workshop at the University of Iowa in 1972. A collection of stories, published as Descent of Man (1979), served as his dissertation in 1977. Boyle then began teaching at the University of Southern California. Boyle’s first novel was Water Music (1981), loosely based on the experiences of Scottish explorer Mungo Park (1771-1806). The novel features a narrative that alternates between Park and Ned Rise, a fictional London criminal who joins Park’s African expedition. The deliberately anachronistic, postmodern novel established Boyle’s concern with the disparity between the haves and have-nots, presenting Park’s cultured Britain in ironic juxtaposition to Rise’s poverty. The novel’s innumerable coincidences, moreover, indicated Boyle’s debt to Charles Dickens. In Budding Prospects: A Pastoral (1984), Felix Nasmyth, having failed at everything else, tries raising marijuana in Northern California. Felix and his friends want to get rich quick, and Boyle uses them to satirize American greed and the perversion of the free-enterprise system. Greasy Lake, and Other Stories (1985) was Boyle’s first short-story collection after his dissertation. It dealt with such topics as survivalist paranoia, an Elvis Presley imitator, and an affair between Dwight D. Eisenhower and the wife of Soviet premier Nikita S. Khrushchev. “The Hector Quesadilla Story,” in which an aging baseball player redeems himself during the longest game in history, demonstrated Boyle’s concerns with myth, redemption, and popular culture as a metaphor for American life. World’s End (1987) represented a shift to less comic fiction, as Boyle presented conflicts among Dutch and English settlers and Native Americans in New York’s Hudson River Valley in the seventeenth century and the consequences of those conflicts for the inhabitants’ twentieth century descendants. In the novel, the wealthy Van Warts exploit the poorer Van Brunts, while the Kitchawanks are consumed by a desire for revenge. Boyle uses these characters to explore myths about America and to dramatize the nation’s self-destructive impulse. The stories in If the River Was Whiskey (1989) depict such characters as a Hollywood public relations specialist who tries to transform the image of the
Ayatollah Ruhollah Khomeini and a man forced to wear a full-body condom by his health-obsessed girlfriend. The collection also includes a parody of Lawrence Kasdan's film The Big Chill (1983). "Me Cago en la Leche (Robert Jordan in Nicaragua)," on the other hand, is less a parody of Ernest Hemingway's For Whom the Bell Tolls (1940) than a questioning of Americans' loss of idealism. The unhappy marriages in "Sinking House" comment on the emotional failures of the American middle class. Impact Boyle refused to be tied down to a single subject, genre, or style, demonstrating a restless need to encompass all of American experience in his fiction. His true subject was the contradictions at play in the American soul. His work was therefore of particular importance to the 1980's, a decade in which the contradictions between humanitarianism and greed, between altruism and nationalism, were more apparent than ever.
Further Reading
Boyle, T. Coraghessan. "An Interview with T. Coraghessan Boyle." Interview by Elizabeth Adams. Chicago Review 37, nos. 2/3 (Autumn, 1991): 51-63.
Kammen, Michael. "T. Coraghessan Boyle and World's End." In Novel History: Historians and Novelists Confront America's Past (and Each Other), edited by Mark C. Carnes. New York: Simon & Schuster, 2001.
Law, Danielle. "Caught in the Current: Plotting History in Water Music." In-Between: Essays and Studies in Literary Criticism 5, no. 1 (March, 1995): 41-50.
Michael Adams
See also
Big Chill, The; Literature in the United States.
■ Brat Pack in acting Definition
Group of young American actors
The so-called Brat Pack figured prominently in entertainment news and gossip columns of the 1980's. The young, attractive actors appeared in several hit films together and frequently socialized with one another off screen as well, helping drive tabloid sales during the decade. As early as the 1940's, Hollywood studios produced some films that focused on and appealed to
Americans in their teens and twenties. During the 1980's, however, that demographic assumed a much more important role in Hollywood's production and marketing decisions. As a result, the teen comedy subgenre—characterized by earthy, raucous accounts of sex, school, and family told from the perspective of high school and college students and inspired in great part by two hits of the 1970's, Animal House (1978) and American Graffiti (1973)—came to occupy a sizable proportion of the nation's movie screens for the first time. Some of the most successful teen comedies starred members of a group of young performers who came to be known in the media as the "Brat Pack," after a cover story in New York magazine so labeled them. The name was modeled after the Rat Pack, the nickname given to a group of singers, actors, and comedians in the 1960's, centered on Frank Sinatra. As with Sinatra's group of friends, the roster of Brat Pack members was unofficial and fluid, but seven names figured most prominently in publicity using the term: Molly Ringwald, Anthony Michael Hall, Rob Lowe, Demi Moore, Ally Sheedy, Judd Nelson, and Emilio Estevez. Andrew McCarthy and Matthew Broderick were also often considered Brat Packers, as was John Hughes, the director of some of their most popular films, though he was somewhat older than his stars. Likewise, the question of what qualifies as a Brat Pack film is subject to controversy, although four are most often considered examples: Sixteen Candles (1984), The Breakfast Club (1985), Pretty in Pink (1986), and St. Elmo's Fire (1985). The first three of these—all comedies directed by Hughes—are sometimes seen as a Brat Pack trilogy, linked by shared character types and themes rather than by shared characters or continuing story lines. St. Elmo's Fire was the Brat Pack's most noteworthy attempt at straightforward drama. Others sometimes seen as Brat Pack films include Oxford Blues (1984) and Class (1983). Icons and Archetypes The actors who made up the Brat Pack came to be seen as a distinct group, simply because they often worked—and played—together in public. However, their significance for the 1980's lies not in their being branded with a memorable label by the press but in the iconic status they achieved in their films. The best evidence of this is found in Hughes's The Breakfast Club, in which five of the Brat Pack's members embody high school character stereotypes: the athlete (Estevez), the teen princess
(Ringwald), the juvenile delinquent (Nelson), the geek (Hall), and the weirdo (Sheedy). These portrayals provided contemporary revisionings of these stereotypes, making them more specific to the 1980's. Sheedy, for example, depicted not some generic weirdo but a specifically 1980's embodiment—black-clad, vaguely Goth—and Ringwald provided a 1980's take on the "girl next door": waifish but aggressive, clothes-conscious but obviously influenced somewhat by punk styles popular at the time.
Pioneer film theorist Parker Tyler suggested in his work that each new generation of film stars includes actors and actresses who come to embody archetypes common to human consciousness and culture. During the 1980's, the Brat Pack arguably fulfilled a similar function, embodying in their films some of the stereotypes through which teens of the 1980's made sense of their world. By repeatedly playing similar roles in similar films throughout the decade, these actors became icons in their own right. For moviegoers of the era, it was difficult to think of the "high school girl" or the "science nerd" without immediately imagining Ringwald or Hall, respectively.
Brat Packers Rob Lowe, left, and Emilio Estevez, right, flank fellow youth star Tom Cruise at a Beverly Hills premiere party in 1982. (Hulton Archive/Getty Images)
Issues and Unease If the Brat Pack embodied culturally resonant icons, however, their work did little to explore the meaning or significance of those icons. Thus, most of their films, though entertaining, were ultimately insubstantial, more adept at illustrating issues than at exploring the causes and solutions of those issues. The Hughes films seemed fatalistic, even despairing, in their outcomes. At the end of The Breakfast Club, for instance, the five protagonists go home, their forced togetherness during a Saturday spent in detention ended. For one day, they have bonded, and they plan to continue meeting as the club of the film's title, but little in their parting scenes indicates that they will succeed in their newfound desire to break down the walls dividing the high school cliques to which they belong. Similarly, many viewers, both then and since, have complained about the ending of Pretty in Pink, in which Ringwald's "teen queen" chooses the handsome high school "hunk" over the nerdy boy who seems more suited to her. Again, Hughes seems to suggest that young people must inevitably follow certain rules and strictures: The pretty, popular girl must choose the handsome boy and not experiment with other possibilities or follow her heart. The ensemble's most prominent attempt at drama, St. Elmo's Fire, with its multiple story lines following a group of graduates through their first year after college, failed to impress critics, who found the dialogue trite and the plot unimaginative.
Impact The Brat Pack was very much a phenomenon of its decade. By the early 1990's, many of the careers of the group had begun to fade, even that of Molly Ringwald, who had been the most admired of the core members. Most continued to act in independent films or on television, notably Ally Sheedy, and Emilio Estevez tried his hand at directing; however, only Demi Moore enjoyed a long and lucrative career in mainstream films. Nevertheless, the Brat Packers provided the decade that was their heyday with iconic representations of American youth of the time, and their successful collaborations with
John Hughes ensured the persistence of the teen comedy as a cinematic genre for decades afterward. Also, despite its flaws, St. Elmo's Fire can readily be seen as the prototype for an immensely popular television genre that developed in its wake: prime-time teen soap operas such as Beverly Hills, 90210, and The O.C., which featured multilayered narratives and large casts of characters almost exclusively in their teens and twenties.
Further Reading
Blum, David. "The Brat Pack." New York, June 10, 1985, 40-47. The article that gave the group its name. Good representation of the sort of publicity that Brat Pack members received at the time.
Davies, Steven Paul, and Andrew Pulver. Brat Pack Confidential. London: Batsford, 2003. Brief but comprehensive and insightful study.
Tyler, Parker. Magic and Myth of the Movies. New York: Simon & Schuster, 1970. Seminal cinematic text explaining how actors become archetypes for their era, as the Brat Pack did.
Thomas Du Bose
See also Brat Pack in literature; Breakfast Club, The; Broderick, Matthew; Film in the United States; Hughes, John; Teen films.
■ Brat Pack in literature Definition
American novelists Tama Janowitz, Jay McInerney, and Bret Easton Ellis
In the 1980’s, a high-profile group of young people made fortunes in such industries as investment banking and computer start-up companies. The popular portrayal of this general trend influenced the reception of a group of young fiction writers who were dubbed the “Brat Pack” by the media. They were highly publicized as authentic voices of a new, hip generation, and they embraced that role and the publicity it brought them. The three definitive members of the Brat Pack rose to fame during three consecutive years in the middle of the 1980’s: Jay McInerney in 1984 for Bright Lights, Big City; Bret Easton Ellis in 1985 for Less than Zero; and Tama Janowitz in 1986 for Slaves of New York. Although Janowitz had published a first book, American Dad, in 1981, it had received almost no attention. It was the publication and notable success of
McInerney's first novel that led publishers to look for other works about young, urban people, written by young authors who were themselves immersed in urban settings. The rapid publication and acclaim awarded these books led the press to invent the term "Brat Pack," modeled on "Rat Pack," the name given in the 1960's to a group of entertainers centered on Frank Sinatra, Sammy Davis, Jr., and Dean Martin. McInerney and Ellis Jay McInerney was born on the East Coast, attended Williams College in Massachusetts, and in 1977 received a Princeton Fellowship to travel to Japan, where he studied and taught English at Kyoto University. Returning to the United States in 1979, he worked first at New York magazine and later as a reader of unsolicited manuscripts at Random House. He became familiar with the haunts and habits of 1980's youth culture in New York, which he used as background for Bright Lights, Big City. The novel portrayed its characters as drug-addled, angst-ridden, superficial young men and women. The novel was fast-paced and written in the second person, instead of the more conventional first or third person. It was considered by critics to express the zeitgeist of the 1980's, and McInerney was often taken to be the model for the novel's protagonist, a charge he denied. Bright Lights, Big City was adapted for a 1988 film of the same name, for which McInerney wrote the screenplay. Despite the wide popularity of the book, the movie adaptation was not a success. McInerney published two more novels in the 1980's: Ransom (1985) and Story of My Life (1988), but neither fulfilled the promise of his first work. Bret Easton Ellis's literary debut occurred when he was only twenty-one. His novel, Less than Zero, portrayed the rich, drug-soaked Los Angeles party scene of the times, full of disaffected and vacuous youth. He was considered part of that scene, having grown up in Sherman Oaks, in the San Fernando Valley, but he moved east to attend Bennington College in Vermont. Like McInerney's, Ellis's novel seemed to capture the mood of what was then called the twentysomething generation, and it sold well. In 1987, Ellis moved to New York, and his second novel, The Rules of Attraction, appeared. His novels fit into the postmodern movement, with its attendant techniques of self-reference, the presence of real characters, and a flat style of narration. Less than Zero and The Rules of Attraction were both made into
films, as was Ellis's notoriously violent American Psycho (1991), which achieved a cult following. Janowitz Tama Janowitz was brought up in a highly educated, literate household: Her father, Julian, was a Freudian psychiatrist, and her mother, Phyllis, was a poet and assistant professor at Cornell University. They divorced when Janowitz was ten, and she and her brother were brought up primarily in Massachusetts. She earned a B.A. from Barnard College in 1977 and the following year received an M.A. from Hollins College, where she wrote her first novel. In the early 1980's, Janowitz enrolled in the Yale School of Drama and then spent two years at the Fine Arts Work Center in Provincetown, Massachusetts. She also earned an M.F.A. from Columbia University a year before Slaves of New York was published. A couple of the short stories included in that collection, her best-known work, were published in magazines, including the title story in The New Yorker. Adept at promoting herself, Janowitz became a friend of Andy Warhol, made the rounds of the New York art parties, and set herself up on the gossip circuit before the book's publication. When it came out in 1986, it was a great success, appearing on the New York Times best seller list, as well as all the other major lists of best-selling fiction. The collection made Janowitz an instant celebrity, and it was made into a film in 1989 directed by James Ivory and starring Bernadette Peters. Although Janowitz wrote the screenplay and worked closely with producer Ismail Merchant and director Ivory, the film was not a success. In 1987, Janowitz published A Cannibal in Manhattan, a reworking of a previous manuscript. The protagonist, Mgungu Yabba Mgungu, is a young man brought to New York from a remote island by a wealthy socialite. He finds city life both barbarous and incomprehensible, and his viewpoint acts as commentary on a New York society of consumerism and capitalism. The critics were not as impressed with A Cannibal in Manhattan as they had been with the earlier work. Impact The media's breathless reporting of every public appearance the Brat Pack made, as well as of their personal peccadilloes and habits, fostered the idea that writers' lives were equal in importance to their work. Their celebrity never equaled that of music or film stars, but the Brat Pack demonstrated that the world of literature was capable of producing tabloid sensations of its own. As they were used by the
media, the three writers learned to use the media in turn for publicity purposes. The later careers of Janowitz, Ellis, and McInerney, however, failed to fulfill their early, and exaggerated, promise. Further Reading
Calcutt, Andrew, and Richard Shephard. Cult Fiction: A Reader's Guide. Lincolnwood, Ill.: Contemporary Books, 1999. All three members of the Brat Pack are included in this comprehensive study of dozens of fiction writers who achieved a cult following.
St. John, Warren. "His Morning After." The New York Times, February 5, 2006. Describes the changes McInerney went through in the years after Bright Lights, Big City.
Spy Editors. Spy Notes on McInerney's "Bright Lights, Big City," Janowitz's "Slaves of New York," Ellis' "Less than Zero," and All Those Other Hip Urban Novels of the 1980's. New York: Dolphin/Doubleday, 1989. Provides literary criticism and analyses of the Brat Pack's most famous books, as well as other 1980's urban fiction.
Sheila Golburgh Johnson
See also Book publishing; Brat Pack in acting; Clancy, Tom; Journalism; Literature in the United States; Minimalist literature; thirtysomething; Yuppies.
■ Brawley, Tawana Identification
Accuser in a prominent 1987 New York rape case Born September 3, 1972; Dutchess County, New York Against a backdrop of several prominent racial incidents in New York, Tawana Brawley's accusations of rape against six New York police officers and public officials created a storm of controversy and raised wide-ranging questions about issues of race, gender, politics, and media in the United States. Tawana Brawley was a fifteen-year-old African American high school student who leaped into the national spotlight after claiming she had been raped and abused by six white men, including several police officers. On November 28, 1987, she had been found behind her home lying in a fetal position, smeared with feces and wrapped in a garbage bag.
Protesters led by C. Vernon Mason (second from left), Al Sharpton (center), and Alton Maddox (right) march on New York mayor Ed Koch’s home to protest Tawana Brawley’s treatment. (AP/Wide World Photos)
Her clothes were torn and burnt and her hair was cut short and matted. When her clothes were removed at the hospital, racial slurs were found written on her body, prompting the Dutchess County Sheriff's Department to call in the Federal Bureau of Investigation (FBI) to investigate possible civil rights violations. Almost immediately, three members of the district attorney's office began examining the case, and within two weeks the New York State Police had joined the investigation. The suicide days later of part-time police officer Harry Crist, Jr., offered investigators a potential lead in the case, although the connection between Brawley's abduction and Crist's suicide proved to be tenuous, partly because of an alibi presented to the grand jury by prosecutor Steven Pagones. Soon, advisers to the Brawley family—attorneys Alton Maddox, Jr., and C. Vernon Mason and the Reverend Al Sharpton—refused to allow Brawley or her family to cooperate with the investigation, insisting that justice was impossible for African Americans in a white-
dominated legal system. Eventually Sharpton, Maddox, and Mason asserted that Pagones was one of the rapists and accused him in frequent news conferences and speeches of being complicit in a plot to cover up the crime and protect its perpetrators. During a seven-month-long hearing, the grand jury identified numerous inconsistencies between the evidence and Brawley’s account of the crime and heard from witnesses whose testimony cast doubt on Brawley’s motives and the veracity of her story. Eventually, they concluded that no crime had actually occurred and that no officials had been involved in any effort to conceal a crime. Impact The Brawley case raised many of the concerns about race and justice that had become particularly prominent as mainstream issues in the 1980’s. The perceived ability of Brawley and her advisers to manipulate those concerns—and through them, the media—shaped public attitudes throughout the remainder of the decade and beyond. Brawley’s
name was frequently invoked in similar high-profile cases to cast doubts upon a victim's veracity. Brawley herself quickly disappeared after the grand jury verdict, moving to Virginia with her family and eventually converting to Islam and adopting a Muslim name. Her impact on those who were drawn into her story proved to be profound and long-lasting. Pagones sued Sharpton, Maddox, and Mason for defamation of character and won a $345,000 judgment. Sharpton moved into the political arena, eventually seeking the Democratic nomination for president, but his involvement in the Brawley affair continued to tarnish his reputation.
Further Reading
McFadden, Robert D. Outrage: The Story Behind the Tawana Brawley Hoax. New York: Bantam, 1990.
Taibbi, Mike, and Anna Sims-Phillips. Unholy Alliances: Working the Tawana Brawley Story. New York: Harcourt, 1989.
Devon Boan
See also
African Americans; Bonfire of the Vanities, The; Central Park jogger case; Goetz, Bernhard; Howard Beach incident; Racial discrimination; Rape; Scandals.
■ Break dancing Definition
Hip-hop street dance style
Thought of as a constructive alternative to violent urban street gangs, break dancing managed to divert some violence, and it spread as a source of inspiration by word of mouth rather than through formal dance instruction. Break dancing is characterized by multiple body contortions, wriggling, electric waves, popping body parts, and touching the ground with one's hands, back, or head—all performed with mechanical precision. It possesses an unstructured and improvisational format, so different elements can be inserted at will into a dance. The most important elements remain coordination, flexibility, style, rhythm, and transitions. The dance incorporates other moves with specific purposes: "Uprock," for example, mimics combat through foot shuffles, spins, and turns. "Downrock" is performed with both hands and feet on the floor, as a preliminary chain of movements progresses to more physically demanding power moves that display upper-body strength. Dance routines include "freezes," during which the body is suspended off the ground, and they usually end with a "suicide," a falling move in which it appears that the dancer has lost control and crashes.
moves that display upper-body strength. Dance routines include “freezes,” during which the body is suspended off the ground, and they usually end with a “suicide,” a falling move in which it appears that the dancer has lost control and crashes. Break Dancing’s Beginnings
Influenced by martial arts, Brazilian capoeira, gymnastics, and acrobatics, break dancing originally appealed to a generation of youth striving against the demands of society and city life. The movement originated in the South Bronx, New York, and during the 1980’s, it prevailed among rival ghetto gangs as an alternative means of resolving territorial disputes. Soon, it grew from a ritual of gang warfare into a pop culture phenomenon that captured the attention of the media. Break dancing got its name from the music to which it was performed, which followed a “break” structure, made of multiple samples of songs of different genres, including jazz, soul, funk, disco, rhythm and blues, and electro-funk. These samples were compiled and chained together by a disc jockey, or deejay. Tempo, beat, and rhythm cued dancers in the performance of their moves, and dancers would often have specific moves at which they excelled, called “perfections.” Break-dance battles resembled chess games, as the participants sought to catch their opponents off-guard by challenging them with unexpected moves. From its onset, break dancing represented a positive diversion from the threatening demands of city life; however, it was not entirely successful as an alternative to fighting. Nevertheless, break dancing did provide a stage where many youth experienced a feeling of belonging, helping them define themselves and open themselves to socialization. The jargon they created, the Adidas shoes and hooded nylon jackets they established as fashion, and their boom boxes and portable dance floors were all part of their attempt to flood the streets with their presence and their purpose: bringing dance to every corner of the inner city.
From the Streets to the Screen to the Olympics
Afrika Bambaataa can be said to have designed, carried forward, and nurtured the street life of break dancing through his work as a record producer. (His “Looking for the Perfect Beat” was a number-four single in 1983.) In addition, his leadership in the Zulu Nation, the spiritual force behind break dancing, became the engine that ran the break-dancing
machine. He encouraged young dancers to believe in themselves and to persevere in their endeavors. Following the push that Bambaataa gave to the street dance form, the mass media introduced break dancing to the general public in April, 1981. This introduction proved to be a double-edged sword. It made the movement known nationwide, but it also marked the decline of its more ritualized, competitive aspects. With the rising demand for break-dancers from MTV and the movie industry, potential financial success replaced the original street rivalries as break-dancers' primary motivation. They began to rehearse and hone their skills to be discovered, rather than to defeat rivals. Break dancing in the 1980's was also advanced by such films as Wild Style (1983), which inaugurated a new style of break dancing that included acrobatics in the form of head spins, "handglides," and "windmills." In addition, the invention of the beat box in 1981 allowed for the sophisticated programming of beats and rhythms, greatly expanding deejays' creative powers. Finally, in 1982 the New York roller-skating rink the Roxy became a hip-hop center sponsoring break-dance concerts. This high-profile venue helped disseminate the dance form throughout and far beyond New York.
A Manhattan break-dancer performs during an October street fair in 1989. (AP/Wide World Photos)
In 1983 and 1984, break dancing became a dance craze, spreading to major cities, dance contests, and music videos. It began to influence other dance forms. In 1983, many of New York's top break-dancers performed for President Ronald Reagan and many other cultural leaders during a tribute to anthropological choreographer Katherine Dunham as part of the annual Kennedy Center Honors. The event was broadcast nationwide. In 1984, break dancing was included as part of the opening ceremony for the Olympic Games in Los Angeles.
Gender Issues Break dancing began as a male-dominated form of expression, and it remained associated with masculinity for the length of the decade. Masculine solidarity and competition provided the context of the dance, which often expressed such stereotypically male traits as aggression. The nearly exclusive association of the dance form with young African American male subcultures entailed the alienation of break-dancers not only from women but also from mainstream American culture and from the members of previous generations. Women break-dancers did exist, but they were relegated to the background of the movement, leaving the foreground to the male dancers. The dance form's origins in mock combat made men reluctant to engage in dance "battles" with women, while the need for significant upper-body strength to perform break-dancing moves favored typical male physiques over typical female physiques. Despite the masculine culture of break dancing, however, some women managed to step to the foreground. For example, Puerto Rican Daisy Castro, also known as Baby Love, achieved significant recognition for her dancing—more than many male break-dancers were comfortable with.
Impact Break dancing's popularity lasted throughout the decade, thanks to its exposure in commercials, movies, and other media. While it enjoyed this spotlight, break dancing influenced music composition, music technology, fashion, and other dance forms, as well as helping to define hip-hop culture generally. Despite the media explosion, though, many believed that break dancing belonged in the streets, and after the craze faded, the form returned to the street corners where it had been born.
Further Reading
Mr. Fresh, with the Supreme Rockers. Breakdancing. New York: Avon Books, 1984. Technically a how-to book, but the opening chapters cover the history of break dancing. Includes a complete glossary of terms and step descriptions. Perkins, William E., ed. Droppin' Science: Critical Essays on Rap Music and Hip Hop Culture. Philadelphia: Temple University Press, 1996. Particularly the chapters "Women Writin' Rappin' Breakin'," "Hip Hop," and "Dance in Hip Hop Culture" provide the most enlightening information on the impact of break dancing upon hip-hop culture. Toop, David. Rap Attack 2: African Rap to Global Hip Hop. London: Pluto Press, 1991. Helpful for understanding the hip-hop culture that gave birth to break dancing. Sylvia P. Baeza
See also
Dance, popular; Jazz; MTV; Music; Olympic Games of 1984; Performance art.
■ Breakfast Club, The
Identification Teen movie
Writer/Director John Hughes (1950-    )
Date Released February 15, 1985
The Breakfast Club achieved a cult following among teenagers, helping define teen culture of the 1980's and cementing John Hughes's reputation as a master of teen films.

In 1985, John Hughes released The Breakfast Club, his second directorial effort after 1984's Sixteen Candles. The 1985 film takes place at a high school in Shermer, Illinois, on a Saturday when five teenagers are serving a day of detention, each for his or her own transgressions. Aside from being forced to spend the day together in the same room, the characters in the film appear at first to have nothing in common. However, as the teens interact, they get to know one another, and they realize that they share much more than a simple Saturday detention.

Molly Ringwald played the prom queen in The Breakfast Club. (AP/Wide World Photos)

Fellow Brat Packers Emilio Estevez, Molly Ringwald, Judd Nelson, Anthony Michael Hall, and Ally Sheedy made the roles of the five teens their own. Each character in the film represents a stereotypical teen personality: the jock, the prom queen, the juvenile delinquent, the geek, and the weirdo. Critics of the film have often dismissed these characters as one-dimensional, but others have argued that the film's humorous and emotionally effective dialogue reveals hidden facets of each character, while engaging issues all teenagers face, such as insecurity and the pressure to live up to society's standards. The Breakfast Club was a reasonable hit, striking a chord with teenagers all over the United States and taking in more than $45 million. The sound track to the film was also extremely popular, especially the single "Don't You Forget About Me" by the New Wave band Simple Minds. Hughes's decision to accompany an extended sequence in the film with nearly the entire song helped popularize the strategy of using teen films to push hit singles, which became an important marketing device for the industry. Moreover, because the film used costume so effectively to help differentiate the five types represented by its protagonists, it became a useful document of the range of popular fashion in the mid-1980's.

Impact The Breakfast Club was frankly marketed to only one age group, teenagers. As a result, it was unable to achieve blockbuster status, but it was incredibly popular among the teens who formed its target audience, helping define the teen culture of the 1980's. Many high school kids could quote entire scenes from the film verbatim. The film's success, following on the heels of Sixteen Candles, confirmed Hughes as a major, bankable talent in the lucrative teen market. He directed two more teen films, Weird Science (1985) and Ferris Bueller's Day Off (1986), before transitioning to comedies with older protagonists, including Planes, Trains, and Automobiles (1987) and She's Having a Baby (1988). Despite his relatively limited output as a director (after 1991, Hughes contented himself with writing and producing films for other directors), John Hughes is considered by many to have defined the genre of the teen movie and to have connected it irrevocably with 1980's American culture.

Further Reading
Clarke, Jaime, ed. Don't You Forget About Me: Contemporary Writers on the Films of John Hughes. Foreword by Ally Sheedy. New York: Simon Spotlight Entertainment, 2007. Deziel, Shanda. "The Man Who Understood Teenagers." Maclean's 119, no. 45 (November, 2006): 7. Prince, Stephen. A New Pot of Gold: Hollywood Under the Electronic Rainbow, 1980-1989. Vol. 10 in History of the American Cinema. Berkeley: University of California Press, 2002. Schneider, Steven Jay. 1001 Movies You Must See Before You Die. London: Quintet, 2003. Jennifer L. Titanski

See also
Brat Pack in acting; Fashions and clothing; Fast Times at Ridgemont High; Film in the United States; Flashdance; Hughes, John; MTV; New Wave music; PG-13 rating; Preppies; Slang and slogans; Teen films.

■ Brett, George
Identification American professional baseball player
Born May 15, 1953; Glen Dale, West Virginia

George Brett was one of the most feared hitters in the American League and a fan favorite who helped lead the Kansas City Royals to a world championship in 1985.

During the 1980 Major League Baseball season, fans were increasingly drawn to the hitting exploits of Kansas City Royals third baseman George Brett. For the first time since 1941, it seemed possible that a player's batting average would exceed .400. Despite intense media attention, which Brett compared to the scrutiny that Roger Maris received when he challenged Babe Ruth's home run record, Brett handled the pressures both on and off the field with humor and professionalism. In mid-September, his average finally slipped below .400, and Brett finished the season at .390. Nevertheless, that average remains among the very highest since 1941. Brett received the American League Most Valuable Player award in 1980 for his feat.

Kansas City Royal George Brett prepares to run after hitting a triple against the Philadelphia Phillies in game 4 of the 1980 World Series. (AP/Wide World Photos)

Throughout the 1980's, the left-handed-hitting Brett remained remarkably consistent and a dangerous clutch hitter. Slowed by serious shoulder and knee injuries in 1986 and 1987—which resulted in a shift from third to first base—Brett was still able to maintain a high average. He would win his third batting championship in 1990. He frequently hit with power, averaging more than thirty doubles and twenty home runs in each season of the 1980's (excluding the strike-abbreviated 1981 season). Brett demolished the Toronto Blue Jays in the 1985 American League playoffs. In the 1980 and 1985 World Series, he batted .375 and .370, respectively. Kansas City won its first World Series title in 1985, thanks to excellent pitching and Brett's leadership. Brett's consistency was further reflected in his style of play and loyalty toward the fans. A true team leader, Brett inspired others with his constant hustle and determination. Fans still remember the incident on July 24, 1983, when Brett exploded out of the dugout after an umpire negated his potential game-winning home run against the New York Yankees because of excess pine tar on his bat. The umpire's decision was subsequently overturned on appeal and the rule changed. At a time when free agency and rising salaries increased player movement and drained talent from small-market clubs, Brett remained with the Kansas City Royals for his entire major-league career. Through the 1980's, Brett was a valued member of the Kansas City community, unpretentious around the fans, generous with the media, and devoted to the promotion of baseball.

Impact George Brett guaranteed his election to the Hall of Fame in 1999 with his batting prowess in the 1980's. He was a model ballplayer, representing traditional baseball values for a small-market franchise at a time when baseball was wracked by labor strife and increasing free agent movement.

Further Reading
Cameron, Steve. George Brett: Last of a Breed. Dallas: Taylor, 1993. Matthews, Denny, and Matt Fulks. Tales from the Royals Dugout. Champaign, Ill.: Sports, 2004. Zeligman, Mark. George Brett: A Royal Hero. Champaign, Ill.: Sports, 1999. M. Philip Lucas

See also
Baseball; Baseball strike of 1981; Jackson, Bo; Sports.
■ Bridges, Jeff Identification American film actor Born December 4, 1949; Los Angeles, California
Refusing to be typecast as a golden boy romantic lead, Jeff Bridges crafted many memorable film performances across genres and styles of filmmaking. Jeff Bridges entered Hollywood easily: The son of actors Lloyd Bridges and Dorothy Dean Bridges, he infrequently appeared in his famous father’s television series, Sea Hunt. After attending a military academy and serving a stint in the Coast Guard, Jeff Bridges began acting in films, notably portraying a cocky Texas teenager in The Last Picture Show (1971)—a performance for which he received the first of his Academy Award nominations—and a naïve young boxer in Fat City (1972). In the 1980’s, Bridges appeared in sixteen theatrical films. Four superb performances demonstrated his range: He played a noncommitted, womanizing opportunist forced to make a moral choice in Cutter’s Way (1981); a gentle visitor to Earth in Starman (1984); an aggressive, optimistic inventor in the biopic Tucker: The Man and His Dreams (1988); and a world-weary entertainer performing in a lounge act with a less talented and far more upbeat older brother (played by his actual brother, Beau Bridges) in The Fabulous Baker Boys (1989). Bridges received his greatest critical response of the decade—and the only Academy Award nomination ever granted in the Best Actor category for a non-human characterization—for his performance in Starman, in which Bridges abandoned his natural grace for the mechanized movements and halting speech of a newly embodied alien who learns to “act human.”
The Eighties in America Impact Bridges’s work in the 1980’s continued to be both prolific and well-received, demonstrating his rare ability to excel in many films made in rapid succession. Off-screen, Bridges advocated for social justice; in 1983, he founded the End Hunger Network. He modeled the possibility of maintaining a distinguished and prolific acting career, a solid marriage and family life, commitment to humanitarian causes, and involvement in artistic pursuits (including photography, sketch art, songwriting, and musical performance) within the chaotic shadow of the film industry. Further Reading
Bridges, Jeff. Pictures. Brooklyn, N.Y.: Powerhouse Books, 2003. Palmer, William J. The Films of the Eighties: A Social History. Carbondale: Southern Illinois University Press, 1993. Carolyn Anderson See also
Academy Awards; Film in the United States; Heaven’s Gate; Science-fiction films; Tron.
■ Broadway musicals Definition
Musical theater productions opening on Broadway
The 1980’s saw a shift on Broadway away from bookdriven American musicals and toward the rising British mega-musicals. The early 1980’s welcomed strong book musicals on Broadway. One of the most popular was 42nd Street (pr. 1980), based on the 1930’s Busby Berkeley film of the same name. Opening at the Winter Garden Theatre, it delighted audiences with its tap-driven spectacle, including an opening number in which the curtain rose only high enough to expose more than forty pairs of tapping feet. Produced by David Merrick and directed by Gower Champion, the show garnered international press when, after multiple ovations on its opening night, Merrick came forward to reveal to the audience and cast that Champion had died of cancer hours before the performance. The production ran 3,486 performances over its eight-year original run. Other productions of note in the early 1980’s from American composers included Cy Coleman,
Broadway musicals
■
147
Michael Stewart, and Mark Bramble’s Barnum (pr. 1980), Maury Yeston and Arthur Kopit’s Nine (pr. 1982), Henry Krieger and Tom Eyen’s Dreamgirls (pr. 1981), and My One and Only (pr. 1983), which featured the music of George and Ira Gershwin and a book by Peter Stone and Timothy S. Mayer and followed its two-year Broadway run with an even more successful road tour. The British Invasion of the Early 1980’s
Perhaps the defining moment of the change that was about to take place on Broadway occurred in 1982, when Andrew Lloyd Webber and director Trevor Nunn bumped 42nd Street from the Winter Garden Theatre to mount Cats (pr. 1982). Produced by Cameron Mackintosh and Lloyd Webber and based on a book of poems by T. S. Eliot, Cats focused almost exclusively on spectacle and music instead of plot and theme. The result was a production that could be enjoyed by entire families and by tourists who spoke little English. Even more influential than the style of the show, however, was Mackintosh's aggressive marketing campaign. Featuring two yellow cat eyes with dancers' silhouettes for pupils, the Cats logo became synonymous with the production and was featured on everything from T-shirts to Christmas ornaments. Mackintosh and Lloyd Webber also started a trend by releasing tracks from the musical before it opened, so audiences would already be familiar with the music. Barbra Streisand recorded "Memory" from the show, and it became a pop hit well before the Broadway production opened. Cats lasted eighteen years on Broadway, closing in 2000 after 7,485 performances.
Mega-musicals of the Mid-1980’s The mid-1980’s saw few American musicals enjoying long runs. La Cage Aux Folles (pr. 1983), by Jerry Herman with a book by Harvey Fierstein and Arthur Laurents; Sunday in the Park with George (pr. 1984), by Stephen Sondheim and James Lapine; and Roger Miller and William Hauptman’s Big River (pr. 1985) were the exceptions. In fact, Big River would be followed by years of no American musical reaching the 1,000performance mark. Even the rarity of Sunday in the Park with George winning a Pulitzer Prize in drama, one of very few musicals to do so, was not enough to keep the production open for more than 604 performances. The mid-1980’s also brought about the proliferation of the British mega-musical, so-called for its fo-
148
■
The Eighties in America
Broadway musicals
At the 1988 Tony Awards ceremony in New York City, the best actor and actress winners display their awards. Joanna Gleason, center left, won Best Actress in a Musical for her performance in Into the Woods, while Michael Crawford, center right, won Best Actor in a Musical for his work in The Phantom of the Opera. (AP/Wide World Photos)
cus on spectacle and lavish set and costume designs. Following Me and My Girl (pr. 1986), an exception to this formula, Broadway welcomed Lloyd Webber’s Starlight Express (pr. 1987) and The Phantom of the Opera (pr. 1988). Even France’s Claude-Michel Schönberg and Alain Boublil got in on the act, although their Les Miserables had a relatively strong book (based on the 1862 Victor Hugo novel). The large casts, melodramatic story lines, and sung-through style of these productions ushered in a new era of romanticism on Broadway, as well as one of dramatically increased ticket prices: Those prices nearly doubled in just five years. The British versus American musical controversy seemed to reach its apex in 1988, when Stephen Sondheim and James Lapine’s Into the Woods (pr. 1987) went head-to-head against The Phantom of the Opera at the Tony Awards. Although numerous crit-
ics favored Sondheim’s work, especially for its complex musicality, they could not overlook the incredible popularity of Lloyd Webber. While Into the Woods won Tony Awards for its book and music, The Phantom of the Opera took the top prize for Best Musical. Into the Woods would eventually close after 769 performances, while The Phantom of the Opera would become the longest-running musical in Broadway history. The End of the Decade
The year 1988 also saw one of the largest flops in Broadway history: Carrie. Based on Stephen King’s best-selling novel, the production, which also originated in London, lasted only five performances on Broadway. Incorporating a pop sound and high-energy choreography, the production ultimately died under the weight of its weak book and overtly desperate special effects. A book
The Eighties in America
entitled Not Since Carrie: Forty Years of Broadway Musical Flops (1991) commemorates the failure of this production. The final years of the 1980’s saw a slight decline in British imports, with Lloyd Webber’s Aspects of Love (pr. 1990) being eclipsed by a return to Americana in the form of Tony winners Jerome Robbins’ Broadway (pr. 1989) and City of Angels (pr. 1989). This trend would continue into the 1990’s, with American musicals receiving critical acclaim and British imports seeing huge box-office numbers. Impact The Broadway musical had been, until the 1980’s, thought to be an art form exclusively mastered by Americans. The 1980’s proved this theory wrong. British and French musicals, particularly those created by Andrew Lloyd Webber and produced by Cameron Macintosh, brought pop elements into the traditional musical score that allowed them to transcend what younger audiences saw as out-of-date show tunes. Additionally, the use of aggressive marketing campaigns by Macintosh forced American producers to find other ways in which to sell their productions to audiences. Television commercials and branded items in addition to the traditional posters, T-shirts, and programs became the new norm. To accommodate the lavishness of the productions and the increased marketing costs, average ticket prices rose tremendously, from $25 in 1980 to well over $50 in 1988. The rise in ticket prices was also a result of declining productions and audiences. There were sixty-seven productions on Broadway in the 1980-1981 season, while the last season of the decade saw only twenty-nine productions.
Further Reading
Block, Geoffrey. Enchanted Evenings: The Broadway Musical from "Show Boat" to Sondheim. New York: Oxford University Press, 1997. An exploration of music as a dramatic tool in musicals of the twentieth century. An important resource for discerning stylistic differences between composers in the 1980's. Flinn, Denny Martin. Musical! A Grand Tour. New York: Schirmer Books, 1997. A look at the history of the American musical on both stage and screen. Useful for understanding the form's development over time. Jones, John Bush. Our Musicals, Ourselves: A Social History of the American Musical Theatre. Lebanon, N.H.: Brandeis University Press, 2003. Literate examination of how musicals promote social change. Includes important insights into the 1980's political world and Broadway's response to it. Larkin, Colin. The Virgin Encyclopedia of Stage and Film Musicals. London: Virgin Books, 1999. An excellent resource covering both American and British musicals. Mandelbaum, Ken. Not Since Carrie: Forty Years of Broadway Musical Flops. New York: St. Martin's Press, 1991. A look at musicals that have flopped, including 1988's Carrie. Mordden, Ethan. The Happiest Corpse I've Ever Seen: The Last Twenty-Five Years of the Broadway Musical. New York: Palgrave for St. Martin's Press, 2004. A compelling and detailed account of the Broadway musical and the effect the 1980's had on its development. Singer, Barry. Ever After: The Last Years of Musical Theatre and Beyond. New York: Applause, 2004. Covering musicals from 1975 to the early twenty-first century, Singer's analysis concerning the 1980's is particularly compelling. Suskin, Steven. Show Tunes, 1905-1991: The Songs, Shows, and Careers of Broadway's Major Composers. New York: Limelight Editions, 1992. A phenomenal resource for exploring Broadway composers throughout the twentieth century. Wilmeth, Don B., and Christopher Bigsby, eds. The Cambridge History of American Theatre. New York: Cambridge University Press, 2000. Covering theater from post-World War II to the 1990's, this text's analysis of theater in the 1980's is unsurpassed. Tom Smith
See also
Cats; Phantom of the Opera, The; Theater.
■ Broderick, Matthew Identification American actor Born March 21, 1962; New York, New York
Broderick, a highly charismatic actor, starred in serious and comedic productions on both stage and screen in the 1980's. His disarming manner combined with a forceful stage presence, enabling him to realize several of the most memorable characters of the decade.

Matthew Broderick, a major young actor of the 1980's, was born into an artistic family. His father, James Broderick, was a popular New York stage actor, and his mother, Patricia Biow Broderick, was a screenplay writer, actress, and painter. Acting alongside his father in a 1981 workshop production of Horton Foote's Valentine's Day led to a part in the Off-Broadway production of Harvey Fierstein's Torch Song Trilogy (pr. 1981). The excellent reviews he received led to his being cast as the lead, Eugene Morris Jerome, in the Broadway production of Neil Simon's autobiographical Brighton Beach Memoirs (pr. 1983), for which he won the 1983 Tony Award. Broderick reprised the role of Eugene Morris Jerome on Broadway in Simon's Biloxi Blues (pr. 1985). This led to Broderick's first movie role, in Simon's Max Dugan Returns (1983). The same year saw the release of War Games (1983), Broderick's first big-screen success.

Matthew Broderick. (Hulton Archive/Getty Images)

In War Games, Broderick played a teenager who accidentally hacks into a military computer and almost starts World War III. The film was a response to two actual NORAD computer malfunctions and to the rise of computer hacking. The film contained many key components of early 1980's American culture: personal computers, coin-operated video games, hacking, fears of computer malfunctions, and Cold War anxieties about nuclear holocaust. It enjoyed worldwide success and made Broderick a bankable star. One of Broderick's most important 1980's film roles was the title character in Ferris Bueller's Day Off (1986), a comedy written and directed by John Hughes. The film was an excellent showcase for Broderick's particular charisma, as his character continually broke down the "fourth wall" to address the audience directly. In 1988, Broderick starred in the film versions of both Torch Song Trilogy and Biloxi Blues. His boyish good looks kept him typecast in teen comedies until 1989, when he starred in Glory—partly written by his mother and directed by thirtysomething cocreator Edward Zwick—playing Civil War hero Robert Gould Shaw. He continued his success in film and on stage as an adult.
Impact Broderick’s greatest impact on the 1980’s came as a result of portraying Ferris Bueller. Ferris Bueller’s Day Off, an iconic 1980’s film, follows three teenagers as they ditch school for an adventure in greater Chicago. The movie proved so popular that it spawned two 1990’s television situation comedies and the name of a 1990’s ska band. It has also earned a place on many lists of top film comedies and became one of the most widely quoted films of the 1980’s. Further Reading
Clarke, Jaime, ed. Don’t You Forget About Me: Contemporary Writers on the Films of John Hughes. New York: Simon Spotlight Entertainment, 2007. Strasser, Todd. Ferris Bueller’s Day Off. New York: Signet, 1986. Leslie Neilan See also
Cold War; Computers; Film in the United States; Hughes, John; Teen films; Theater; thirtysomething; Torch Song Trilogy; Video games and arcades.
■ Brokaw, Tom Identification Television news anchor Born February 6, 1940; Webster, South Dakota
In 1982, Brokaw became the youngest person ever to anchor a national network evening news program. When the 1980’s began, Tom Brokaw was the cohost with Jane Pauley of the popular National Broadcasting Company (NBC) morning program, the Today show. Brokaw’s background was in news coverage; he had covered civil rights violence in the South and was the NBC White House correspondent during the Watergate scandal. He was known for his ability to function with a sometimes exceptional workload, most conspicuously when, while on the Today show, he also covered presidential primary elections and campaigns. In 1982, he was offered a two-milliondollar contract to co-anchor the NBC Nightly News with Roger Mudd. Brokaw would report from New
Brokaw, Tom
■
151
York and Mudd from Washington. The previous anchor, John Chancellor, was to do occasional commentary. Chancellor and Mudd eventually dropped away, and Brokaw became the sole anchor on September 5, 1983. His style was easygoing and relaxed; he projected decency and a sense of humor, and he remained composed under stress. His charm was enhanced, rather than diminished, by a slight speech impediment. While his broadcast was rarely the toprated news program during the early part of the decade, his Nielsen ratings rose after 1984. Arguably, Brokaw’s most significant broadcast of the 1980’s was his 1989 coverage of the fall of the Berlin Wall. The Wall had separated East and West Berlin since it was erected by Communist leaders in 1961. Already on assignment in Berlin, Brokaw was present when travel restrictions were lifted, and he reported on location as crowds of East Berliners poured over the former border and as a woman attacked the Wall itself with hammer and chisel.
Today show host Tom Brokaw interviews the wife and mother of a hostage during the Iranian hostage crisis in January, 1980. (AP/ Wide World Photos)
152
■
The Eighties in America
Bush, George H. W.
In 1984, Brokaw prepared a documentary to correspond with the fortieth anniversary of the Allied invasion of Normandy, France, during World War II. (This interest would lead, in 1998, to the publication of his first book, The Greatest Generation, a celebration of those Americans who survived the Great Depression and fought in World War II.) In 1986, Jay Barbree revealed the cause of the Challenger spacecraft disaster on Brokaw’s evening news program. During 1987, Brokaw interviewed both Soviet leader Mikhail Gorbachev in the Kremlin and President Ronald Reagan in the White House. By the end of the decade, he had become the voice of NBC news and a trusted source of information for millions of people. Impact Of the three major network news anchors during the 1980’s (Dan Rather for CBS, Peter Jennings for ABC, and Brokaw), Brokaw was probably least affected by the increasing corporate ownership of the networks. General Electric acquired NBC and its parent company, RCA, in 1986. As a result, NBC news suffered cuts and, like the other networks, saw an increasing emphasis on profit, as the corporate culture of the 1980’s rejected the traditional model of networks’ news divisions as “loss leaders” (that is, divisions that would lose money but would build the prestige of their networks, contributing to their brand recognition and overall value). Brokaw benefited from this trend, however: He seemed less sophisticated and learned than Jennings, while he was cooler and more controlled than Rather. In the context of the drive to make news profitable, Brokaw’s youthful appearance and charm appealed to the kind of young, successful audience being sought by advertisers in the 1980’s, auguring the future of American network news broadcasting. Further Reading
Alan, Jeff, with James M. Lane. Anchoring America: The Changing Face of Network News. Chicago: Bonus Books, 2003. Fensch, Thomas, ed. Television News Anchors: An Anthology of Profiles of the Major Figures and Issues in United States Network Reporting. Woodlands, Tex.: New Century Books, 2001. Goldberg, Robert, and Gerald Jay Goldberg. Anchors: Brokaw, Jennings, Rather, and the Evening News. New York: Carol, 1990. Betty Richardson
See also
Berlin Wall; Challenger disaster; Jennings, Peter; Journalism; Network anchors; Pauley, Jane; Rather, Dan; Soviet Union and North America; Television.
■ Bush, George H. W. Identification
U.S. vice president, 1981-1989, and president, 1989-1993 Born June 12, 1924; Milton, Massachusetts Bush served as either vice president or president during such major 1980’s political events and changes as the end of the Cold War, the adoption of supply-side economics, the dominance of conservatism in American politics, the investigation of the Iran-Contra affair, and the invasion of Panama. George H. W. Bush was the second son of Prescott S. Bush (1895-1972) and Dorothy (Walker) Bush. Prescott S. Bush was a Wall Street banker and served as a Republican senator from Connecticut from 1953 to 1963. Senator Bush belonged to the northeastern, moderate, pro-civil rights wing of the Republican Party. George H. W. Bush served as a Navy combat pilot during World War II and graduated from Yale University. He married Barbara Pierce in 1945, and his first son, future president George W. Bush, was born in 1946. He moved his family to Texas in 1948 in order to enter the oil business. Political Career During the 1960’s and 1970’s
During the 1950’s, Bush was busy developing his oil and investment interests and raising a large family. By the 1960’s, he entered Republican politics in Texas as the state’s politics became more conservative and Republican candidates became more competitive. Bush ran unsuccessfully for the Senate in 1964 and 1970. In his 1964 campaign, he opposed the Civil Rights Act of 1964 in an attempt to attract conservative white voters and dispel his image as a moderate Yankee carpetbagger. Bush, however, was elected as a U.S. representative for a Houston-based district in 1966 and reelected in 1968. He compiled a mostly conservative voting record in Congress, especially by supporting large cuts in domestic spending and foreign aid and a more aggressive military strategy in the Vietnam War (1959-1975). After Bush lost the 1970 Senate race, President Richard M. Nixon appointed him as the United
The Eighties in America
States ambassador to the United Nations. During the Nixon and Ford administrations (1969-1977), Bush also served as a special envoy to China, chairman of the Republican National Committee (RNC), and director of central intelligence. In these appointed positions, Bush became well known among Republicans in Washington, D.C., for his prudent judgment, cautious management, loyalty to the moderate establishment of the Republican Party, and willingness to accept difficult assignments. He chaired the RNC after the Watergate scandal and Nixon’s forced resignation from the presidency substantially weakened the party, and he headed the Central Intelligence Agency (CIA) after its reputation and status were damaged by congressional investigations. Presidential Ambition and the Vice Presidency
In 1979, Bush announced his candidacy for the Republican presidential nomination of 1980. Bush campaigned as a moderate, pro-choice Republican and hoped that his centrism and extensive experience in several administrative positions would attract enough electoral support in the primaries and caucuses. Bush, however, competed against several other
Bush, George H. W.
153
Republican candidates who were better known and more skillful campaigners, especially former California governor Ronald Reagan (1911-2004). During the campaign, Bush criticized Reagan’s “trickle-down” economic theory, calling it “voodoo economics.” He performed poorly as a candidate and withdrew from the presidential race in May, 1980. During the Republican National Convention of 1980, Bush accepted Reagan’s offer to run for vice president. After Reagan won the 1980 presidential election, Bush loyally served Reagan as vice president and adapted to the more conservative Republican Party under Reagan’s leadership, especially by adopting Reagan’s conservative positions on such social issues as abortion, gun control, and school prayer. Bush chaired a White House task force on deregulation, one of Reagan’s top economic priorities. Bush’s influence with Reagan was further increased by his friendship and earlier political relationship with James Baker, who served as White House chief of staff and secretary of the treasury under Reagan. As Bush began to prepare for his 1988 presidential campaign, his close association with Reagan became less of a political asset. In the 1986 midterm elections, the Democrats increased their majority in the House of Representatives and won control of the Senate. Reagan’s popularity and credibility declined, as the Democratic-controlled Senate investigated the Iran-Contra affair, in which Bush was implicated, and rejected Reagan’s nomination of Robert H. Bork, an outspoken conservative, to the Supreme Court in 1987. Furthermore, the stock market crash of 1987 and rapidly increasing budget deficits during the 1980’s weakened support for Reagan’s economic policies. Press coverage of Bush’s presidential campaign often emphasized the need for him to define himself in his own terms, rather than simply as Reagan’s vice president. The 1988 Presidential Election
George H. W. Bush. (Library of Congress)
■
Nonetheless, Bush’s 1988 presidential campaign generally emphasized continuity with Reagan’s presidency. Bush competed against several Republicans for their party’s presidential nomination, most notably Senator Bob Dole of Kansas, who defeated Bush in the Iowa caucuses. Bush was supported, however, by New Hampshire governor John H. Sununu of New Hampshire, and his decisive, unexpected victory in the New Hampshire primary stimulated momentum behind
154
■
Bush, George H. W.
The Eighties in America
Bush’s campaign for future priA Thousand Points of Light mary victories, allowing Bush to secure his party’s presidential In George H. W. Bush’s inaugural address, delivered January 20, 1989, nomination. Choosing Senator he reiterated his notion that impoverished and disenfranchised Americans Dan Quayle of Indiana as his runshould primarily be aided, not by the federal government but rather by the ning mate, Bush promised more nation’s private charitable organizations, which he referred to as “a thouactive, innovative policies in envisand points of light.” ronmental protection and education and greater encouragement I have spoken of a thousand points of light, of all the commuof volunteerism, which he characnity organizations that are spread like stars throughout the Naterized as “a thousand points of tion, doing good. We will work hand in hand, encouraging, somelight.” In general, he expressed times leading, sometimes being led, rewarding. We will work on his desire to achieve “a kinder, this in the White House, in the cabinet agencies. I will go to the gentler America.” people and the programs that are the brighter points of light, and While such rhetoric implied a I will ask every member of my government to become involved. more moderate, conciliatory doThe old ideas are new again because they are not old, they are mestic policy agenda than Reatimeless: duty, sacrifice, commitment, and a patriotism that finds gan’s, Bush soon used more agits expression in taking part and pitching in. . . . gressive, conservative rhetoric to Some see leadership as high drama, and the sound of trumcriticize and eventually defeat the pets calling, and sometimes it is that. But I see history as a book Democratic presidential nominee, with many pages, and each day we fill a page with acts of hopefulGovernor Michael Dukakis of Masness and meaning. The new breeze blows, a page turns, the story sachusetts. Influenced by camunfolds. And so today a chapter begins, a small and stately story of paign consultant Lee Atwater, the unity, diversity, and generosity—shared, and written, together. Bush campaign portrayed Dukakis as a high-taxing, liberal elitist who was soft on crime and indifferent or hostile to traditional American cratic legislation on abortion, child care, gun convalues such as patriotism. Bush defeated Dukakis, cartrol, and family leave. rying forty states to win 54 percent of the popular vote In foreign policy, Bush demonstrated more selfand 426 electoral votes. The Democrats, however, confidence and initiative. He appointed James continued to control Congress. Baker as secretary of state and Dick Cheney as secretary of defense. With the end of the Cold War, signiBush’s Presidency (1989-1993) Unlike Reagan, Bush fied by the removal of the Berlin Wall in 1989 and was unable to articulate a cohesive, persuasive vision the formal dissolution of the Soviet Union in 1991, to unify the ideas and policy goals of his presidency. Bush announced the potential for a “New World OrIn domestic policy, Bush incrementally developed der” in international relations. The expectation of and signed more moderate, compromised versions steady reductions in defense spending and a permaof Democratic legislation, such as the Americans nently smaller U.S. military led to more discussion in with Disabilities Act of 1990, the Clean Air Act of Congress about how to apply this “peace dividend” 1990, and the Civil Rights Act of 1991. With public to deficit reduction and domestic policy needs. 
As concern about high budget deficits and a movement part of a more proactive, internationally oriented to adopt a balanced-budget amendment to the Conwar on drugs, Bush ordered a 1989 invasion of Panstitution, Bush reluctantly signed legislation to inama in order to remove its dictator, Manuel Noriega, crease taxes. This decision contradicted a campaign who was deeply involved in drug smuggling. promise he had made, when he famously said, “Read my lips: No new taxes.” The perceived betrayal of a promise proved to be as harmful to his standing as Impact During the 1980’s, George H. W. Bush’s political career was revived and advanced by the popuwas the fact that he raised taxes. Nonetheless, Bush larity and electoral success of President Reagan. If steadily increased his use of vetoes against Demo-
The Eighties in America
he had not served as Reagan’s two-term vice president, it is unlikely that Bush would have been nominated and elected president in 1988. Political commentators often claimed that many voters supported Bush in the 1988 presidential election because they perceived a Bush presidency as the equivalent of a third term for Reagan. During his one-term presidency, Bush suffered politically from the perception of social conservatives and the Religious Right that he was not sincerely committed to their issue positions and the perception of many Americans that he could not understand and effectively respond to their economic problems. Following as he did in the footsteps of Reagan, who had been nicknamed the Great Communicator, Bush suffered by comparison, because his speeches lacked the eloquence and drama of his predecessor. Subsequent Events When Iraqi dictator Saddam Hussein invaded Kuwait in August, 1990, Bush quickly assembled a multinational coalition that included Middle Eastern governments to oppose Hussein’s occupation of Kuwait and demand his withdrawal. With the support of the United Nations and a congressional resolution, Bush ordered air and ground military operations against Iraqi troops in Kuwait. Bush refrained from invading Iraq, however, and the Persian Gulf War ended victoriously and with few casualties for the United States and its allies by April, 1991. Bush’s public approval ratings then approximated 90 percent, higher than that of any previous president. Initially confident of being reelected president, Bush overestimated the influence of the Persian Gulf War and underestimated the influence of the 1990-1991 economic recession on the 1992 presidential election. In contrast to the Democratic presidential campaign of Bill Clinton, the governor of Arkansas, the Bush campaign seemed unfocused and listless. Further weakened by greater public concern with economic issues and the strong independent presidential candidacy of wealthy businessman Ross Perot, Bush suffered one of the worst electoral defeats of any incumbent president seeking reelection. He received only 38 percent of the popular vote and 168 electoral votes.
Further Reading
Barilleaux, Ryan J., and Mark J. Rozell. Power and Prudence: The Presidency of George H. W. Bush. College Station: Texas A & M University Press, 2004. A balanced analysis of the Bush presidency that emphasizes the influences of incrementalism, moderation, and caution on its policies and administration. Duffy, Michael, and Dan Goodgame. Marching in Place: The Status Quo Presidency of George Bush. New York: Simon & Schuster, 1992. A mostly unfavorable analysis of Bush’s presidency, including the 1988 presidential election. Kelley, Kitty. The Family: The Real Story of the Bush Dynasty. New York: Anchor Books, 2005. An extensive history of the Bush family that includes George H. W. Bush’s political career during the 1980’s. Kolb, Charles. White House Daze: The Unmaking of Domestic Policy in the Bush Years. New York: Free Press, 1994. Explores and provides explanations for the domestic policy failures of the Bush presidency. Morrison, Donald, ed. The Winning of the White House, 1988. New York: Time, 1988. Collection of articles on the 1988 presidential election; includes the influence of the Reagan presidency on the Bush campaign. Schaller, Michael. Right Turn: American Life in the Reagan-Bush Era: 1980-1992. New York: Oxford University Press, 2007. A broad survey of the social, economic, and political history of the Reagan-Bush era that highlights major events, issues, and trends during the 1980’s. Sean J. Savage See also Atwater, Lee; Business and the economy in the United States; Cold War; Conservatism in U.S. politics; Dukakis, Michael; Elections in the United States, midterm; Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988; Foreign policy of the United States; Iran-Contra affair; Panama invasion; Quayle, Dan; Reagan, Ronald; Reagan Revolution; Reaganomics; Recessions; Sununu, John H.
■
■ Business and the economy in Canada Definition
Structure and functioning of the Canadian economy, including the production and distribution of goods, services, and incomes and related public policies
As a full-fledged member of the developed world, Canada shared with the rest of that world the economic ups and downs that marked the 1980’s. In particular, the Canadian service economy expanded significantly, and the agricultural sector lagged behind most others in recovering from the recession that began the decade. During the 1980’s, Canada continued the process begun after World War II, when it started to diverge from the colonial economy that had characterized the country up to that time. The nation’s primary sector—agriculture, forestry, and fishing—shrank further as a proportion of the Canadian gross domestic product (GDP), as did the manufacturing sector. The mining and especially the service sectors grew. The primary sector fell from representing 4.3 percent of the GDP in 1980 to representing 2.5 percent in 1990. Mining and quarrying increased slightly from 5.2 percent to 5.6 percent, but manufacturing fell from 21 percent to 19 percent. Construction advanced slightly, from 6.4 percent to 7.8 percent, but the great change occurred in the non-business sector, which more than doubled, from 7.7 percent to 15.9 percent. “Other,” according to Statistics Canada, fell slightly, from 55.3 percent to 49 percent. During the 1980’s, the GDP doubled, from just over 3 billion Canadian dollars in 1980 to 6.5 billion Canadian dollars in 1989. Most of this growth occurred in the years after 1983, as in the early 1980’s Canada responded to the second oil shock of 1979 with a deep recession, one of the worst since the Great Depression of the 1930’s. Even the years after 1983 were a bumpy ride, as the economic recovery was accompanied by significant inflation. Recovery was most pronounced in central Canada, especially in Ontario, where much of Canada’s manufacturing was located. Recovery was slower in the far west, where much activity was resource-based, and in the east, whose Atlantic provinces never recovered to their prior level of prosperity. Investment
After dropping precipitously in the recession of 1981-1982, investment rebounded during
The Eighties in America
the latter part of the decade. A significant portion of that investment came from outside the country, as foreign investors accounted for almost 50 percent of the investment in Canada’s manufacturing industries. Canadian investors were more prominent in the oil and gas sectors, supplying close to 70 percent of the investment funds in those sectors and nearly as much in mining and smelting. In utilities, almost all of the investments were locally generated, because nearly all Canadian utilities were publicly owned at that time. The availability of investment capital did not guarantee success, and the Canadian economy experienced a significant number of business failures during the 1980’s. Two banks in western Canada, the Canadian Commercial Bank of Edmonton and the Northland Bank, collapsed. The government had to bail out a number of small, local banks. Dome Petroleum failed, Chrysler Canada required a government bail-out, and the large agricultural equipment maker Massey-Ferguson had to be reconstituted under a new name, Varity. Several large retail firms sold out, notably Hudson’s Bay Company. Foreign Trade Canada’s economy is heavily dependent on foreign trade, most particularly with the United States; some 30 percent of Canada’s income generally comes from foreign trade. By the end of the 1980’s, 80 percent of Canada’s foreign trade was with the United States, mostly in the form of automobiles and automotive parts, thanks to the Autopact of 1965, which made it possible to ship cars and parts across the Canadian-U.S. border without triggering import duties. These benefits were extended to other products in the Canada-United States Free Trade Agreement, signed on October 3, 1987. Under this agreement, duties on most manufactured products were progressively lowered on both sides of the border. The friendly terms on which most Canadian-U.S. trade took place did not, however, extend to the field of softwood lumber, the lumber used in most house construction in both Canada and the United States. Canada supplies about one-third of the softwood lumber sold in the United States in the form of two-by-fours and similar pieces. In 1982, the Coalition for Fair Lumber Imports, a group of U.S. lumber producers mostly on the West Coast, claimed that Canadian lumber was effectively subsidized, because the price Canadian loggers were paying land-
The Eighties in America
Business and the economy in the United States
■
157
owning provincial governments was less than the price U.S. loggers had to pay to the private landowners from whom they bought most of the trees they logged. The coalition demanded that the U.S. government impose a “countervailing duty” to compensate for this low initial cost paid by Canadian producers. This demand led to a special agreement in 1986 in which the United States “temporarily” imposed a “countervailing duty” on Canadian softwood lumber imports into the United States, but the issue continued to roil the Canadian-U.S. lumber trade in succeeding years, coming several times before the World Trade Organization (WTO) for adjudication. In agriculture, prices can be extremely volatile, and it has long been the practice of both the Canadian government and other governments in the developed world to subsidize their agriculture. During the 1980’s, prices that had been sufficient to support agricultural producers in the preceding years sank, and many Canadian farmers complained. Canada’s wheat, its most important agricultural product, continued to be marketed through the Canadian Wheat Board, a government body. During the 1980’s, much Canadian wheat went to Asia rather than to Europe, its destination in prior decades.
Environmental Policy: Political Economy and Public Policy. Vancouver: University of British Columbia Press, 2005. Provides an exhaustive account (to date) of the softwood lumber issue. Organization for Economic Cooperation and Development. Economic Surveys: Canada. Paris: Author, 1989-1990. Survey of Canada’s economy at the end of the decade. One of the annual surveys conducted by the OECD of the economies of all its member nations. Statistics Canada. Canadian Economic Observer: Historical Statistical Supplement, 1989-90. Ottawa, Ont.: Ministry of Supply & Services, 1990. Contains many useful statistics on the 1980’s. Nancy M. Gordon
Impact Canada began the 1980’s in a recession but enjoyed in large part the economic recovery experienced by many developed nations as the decade progressed. As with those other nations, however, the recovery was not evenly distributed, and some sectors such as agriculture remained difficult, even as the Canadian service economy mushroomed.
Definition
Subsequent Events
The conclusion of the North American Free Trade Agreement (NAFTA) in 1993 tied the Canadian economy even more closely to that of the United States. At the same time, Canada’s more generous provision of social services (especially health care) created tensions in Canadian-U.S. relations.
Further Reading
Bothwell, Robert, Ian Drummond, and John English. Canada Since 1945. Rev. ed. Toronto: University of Toronto Press, 1989. The last chapter, added in the revised edition, provides a comprehensive survey of the Canadian economy in the 1980’s. Hessling, Melody, Michael Hewlett, and Tracy Sommerville, eds. Canadian Natural Resource and
See also
Agriculture in Canada; Canada and the British Commonwealth; Canada and the United States; Canada-United States Free Trade Agreement; Foreign policy of Canada; Mulroney, Brian; National Energy Program (NEP).
■ Business and the economy in the United States Structure and functioning of the U.S. economy, including the production and distribution of goods, services, and incomes and related public policies
After a brief but painful recession (which helped lower inflation and interest rates), the U.S. economy embarked on a steady upswing of growth that brought prosperity to many Americans. Economic conditions were a major topic of controversy during the U.S. presidential election campaign of 1980. Ronald Reagan criticized President Jimmy Carter for failing to deal effectively with inflation, high interest rates, and unemployment. An economic “discomfort index,” calculated by adding the inflation rate to the unemployment rate, was widely quoted. The index had risen from 13.4 in 1977 to 20.5 in 1980. Reagan’s decisive electoral victory was a measure of the public’s discontent with the economy. During his campaign, Reagan had promised that the rate of monetary creation would be slowed and that this would help reduce inflation and interest rates. President Carter had already appointed Paul
158
■
The Eighties in America
Business and the economy in the United States
U.S. Production Output and Employment in Major Sectors, 1980 and 1989

Sector                          Output (in billions of 1992 dollars)(1)    Number Employed (in thousands)(2)
                                1980     1989     Change                   1980      1989      Change
Agriculture                       58       89         31                  3,364     3,199       −165
Mining                            82       93         11                  1,027       692       −335
Construction                     215      252         37                  4,346     5,171       +825
Manufacturing                    823    1,106        283                 20,285    19,391       −894
Transportation and utilities     385      475         90                  5,146     5,614       +468
Wholesale and retail trade       601      920        319                 20,310    25,662     +5,352
Finance                          863    1,102        239                  5,160     6,668     +1,508
Services                         811    1,150        339                 17,890    26,907     +9,017
Government                       749      848         99                 16,241    17,779     +1,538
Total(3)                       4,615    6,062      1,447                 93,770   111,083    +17,313

(1) Output is value added; data for all sectors add up to gross national product. (2) Wage and salary workers except agriculture. (3) Totals include some miscellaneous items not in listed sectors.
Source: Economic Report of the President, 1998.
Volcker to chair the Federal Reserve Board. The election results strengthened Volcker’s resolve to slow monetary growth. The short-term result of this slowdown was economic recession. Each of the quarterly estimates of real gross national product (GNP) for 1982 was lower than the corresponding estimate for the previous year. The unemployment rate, which had been around 6 percent in 1978-1979, rose to exceed 10 percent by late 1982. The recession passed quickly, however. The world price of petroleum was declining, and this decline reduced general costs to consumers and helped eliminate inflationary expectations. Recovery The recession and the waning of inflationary expectations helped bring about a rapid decline in interest rates. Home mortgage rates, for instance, which went above 16 percent in late 1981, were down to 12 percent by late 1983, helping stimulate housing expenditures. Federal fiscal policy provided strong antirecessionary stimulus. President Reagan persuaded Con-
gress to pass the Economic Recovery Tax Act of 1981. By 1984, a four-person family with median income owed about $1,100 less in income tax than it would have under previous rates. Between 1981 and 1988, the top federal income-tax rate was reduced from 70 percent to 28 percent. The 1981 law also provided for automatic adjustment for inflation of income-tax brackets. Greater opportunity was offered for households to contribute to tax-deferred Individual Retirement Accounts (IRAs). By 1984, about 15 million persons were saving for retirement through IRAs. This was also the period when 401(k) retirement accounts made their appearance. These often involved matching, tax-deferred contributions by employer and employee, often invested in common stocks. The decrease in taxes was not accompanied by a decrease in government expenditure, however. As a result, the federal budget deficit expanded, and the national debt increased from about $900 billion in 1980 to more than $2.8 trillion in 1989. Some economists feared this ballooning debt would drive up interest rates and harm the market for productive
private-capital investments. However, home mortgage rates continued to decline, although they were still around 10 percent in 1989. Private-capital expenditures, after adjustment for inflation, were relatively flat between 1984 and 1988. Growth and Stability
After its rough start, the economy performed increasingly well across the decade. The recession officially ended in November, 1982, and it was followed by an economic boom and expansion that continued over ninety-two months until July, 1990. The average annual unemployment rate reached a peak of 9.7 percent in 1982, then declined every year to a low of 5.3 percent in 1989. The Michigan index of consumer sentiment, which had fallen below 60 in 1981, shot up to well over 90 by 1984 and remained high until 1990. Improving economic conditions played a big part in President Reagan’s reelection in 1984. Improvement in the economy reflected the relatively balanced expansion of both aggregate demand (need or desire for goods and services combined with sufficient purchase power) and aggregate supply (growth of productive capacity). Both demand and supply were stimulated by the continued growth of the U.S. population, which rose from 228 million in 1980 to 247 million in 1989. Besides natural increase, the United States received about 600,000 immigrants each year. During the economic expansion, the economy created 17 million additional jobs. The number of employed women increased by 11 million, while employed men increased only 7 million. Labor productivity increased more than 10 percent over the decade. An important contributor to higher productivity was a steady rise in the average educational level. The proportion of the labor force with some college education rose from 40 percent in 1980 to more than 45 percent in 1989. The proportion with no high school degree declined by a similar amount. Higher productivity resulted in an improvement in real incomes and consumption. Real disposable income per person rose about 18 percent from 1979 to 1989, and real personal consumption rose about 20 percent, meaning that some of the increase in consumption was driven by a corresponding increase in debt. The increase in income was not driven by an increase in real wages, which actually declined slightly during the decade. Higher fringe benefits offset some of that downward trend, but household
incomes rose mostly because of lower unemployment rates and an increase in the proportion of households with more than one wage earner. Furthermore, household incomes from interest, dividends, and government transfer payments such as Social Security rose more rapidly than did labor income, contributing to the rise in income. In 1980, about one-eighth of the population was classified as living in poverty. That proportion rose slightly during the recession, then declined slightly again to end in 1989 very close to the 1980 level. This apparent stability masked a high rate of turnover: Many recent immigrants and young people just entering the labor force were in poverty initially but soon rose out of it to be replaced by other immigrants and young workers. The poverty rate was disproportionately high among persons with little education and among female-headed families. The latter category increased by one-fourth over the decade and accounted for much of the poverty among children. Sector Output and Employment All major sectors of the economy experienced increased output over the decade of the 1980’s. Agricultural output expanded by more than 50 percent, despite a decrease in employment. The increased output reflected the continued rise in labor productivity, helping to diminish world hunger but putting downward pressure on farm prices. One startling pattern was that, while manufacturing output continued to increase, employment in manufacturing declined. Higher productivity meant companies could produce the same amount with fewer workers. The impact of higher labor productivity was even more visible in mining, where employment fell by one-third despite rising output. Weakness in manufacturing and mining employment contributed to wage stagnation and to a decline in union membership. The decrease in manufacturing employment generated controversy. Critics blamed it on competition from imports and urged more restrictions against goods from abroad. Management experts insisted that Japanese firms were managed better than were American firms. Perhaps paradoxically, during the 1980’s, six major Japanese automakers opened manufacturing facilities in the United States. Located in areas of low population density where workers did not insist on union membership, they were
able to continue their inroads into the American market. The biggest increases in employment were in trade and services. Rapidly growing service sectors included medical care, recreation and entertainment, and education. Improved medical technology and accessibility helped extend life expectancy and raised the proportion of the population over age sixty-five from 11.3 percent in 1980 to 12.4 percent in 1989. Household entertainment resources were rapidly transformed. Cable television began to supplement the traditional broadcast networks (whose number was augmented when FOX entered the business in 1986). Ted Turner introduced the Cable News Network (CNN) in 1980; it had 39 million subscribers by 1985. MTV offered viewers a steady diet of music videos, to the mingled delight and dismay of the nation. Audio compact discs (CDs) came on the market in the early 1980's and grew to a $2 billion business by 1988, with a corresponding wane in long-playing record sales. By 1988, two-thirds of homes had a videocassette recorder (VCR). The decade saw the emergence of the desktop personal computer (PC). In 1980, most computers were large mainframes with many satellite keyboards and monitors. In 1981, there were about 2 million PCs in use, most of them in business firms. By 1988, the number had grown to 45 million, the majority in households. Apple Computer had pioneered the PC in the 1970's, but a major step came in 1981, when International Business Machines (IBM) introduced its PC, which ran Microsoft's MS-DOS operating system. Software became a major industry in itself, with Microsoft as the chief firm. Deregulation The effort to stimulate competition and innovation by deregulating sectors of the economy had begun under President Carter and was vigorously extended under President Reagan. In 1980, Congress removed the ceilings on interest rates that could be paid by banks and other deposit institutions to depositors. Rate ceilings had prevented deposit institutions from offering competitive interest rates when market rates went so high in the later 1970's. Savings institutions had experienced heavy withdrawals, and many investors had shifted their funds into the newly developed money-market mutual funds. Government regulation of pricing, entry of new firms, and other operations had been traditional in
transportation and public utilities. Many of these regulations were now reduced or removed. The Motor Carrier Act of June, 1980, gave firms in the trucking industry freedom to determine their routes, schedules, and rates. Similar deregulation of airlines had begun in 1978. One effect was the creation of People Express Airline, which began low-price, low-frill service in 1980. The telecommunications sector also experienced increased competition and flexibility. One factor contributing to this increase was the 1982 antitrust settlement that fragmented American Telephone and Telegraph (AT&T), creating regional operating companies (the so-called Baby Bells) and opening the way for competition in long-distance telephone service by such firms as Sprint and MCI. In the short term, deregulation seemed to lead to lower prices and greater access to competing suppliers. However, it also created the appearance of disorderly market conditions. Consumers confronted problems getting accurate information about products and services from suppliers, who often made confusing offers. At the same time that traditional, economic forms of regulation were being dismantled, however, social regulatory programs were expanding through such agencies as the Environmental Protection Agency, the Equal Employment Opportunity Commission, the Consumer Product Safety Commission, and the Occupational Safety and Health Administration. Monetarism and the End of Inflation With the support of President Reagan, Federal Reserve chief Paul Volcker was able to slow down the rate at which the money supply expanded. Many economists had become adherents of monetarism, according to which inflation would tend to move in proportion to the rate of monetary growth. The monetarists also believed that interest rates would tend to move in proportion to the expected rate of inflation. Both views seemed to be validated in the 1980's, as monetary slowdown reduced both interest rates and the rate of price increase (to between 4 and 5 percent annually in 1987-1989). However, home mortgage rates were still in the neighborhood of 10 percent in 1989. Reduction in the inflation rate was greatly aided by the decline in world petroleum prices. Oil imports, which cost the United States almost $80 billion annually in 1980-1981, cost less than $40 billion annually in 1986-1988.
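The two monetarist propositions just summarized are usually written as the quantity equation (in growth-rate form) and the Fisher equation. The notation below is standard textbook shorthand that we have added for clarity; it does not appear in the source.

```latex
% Quantity equation in growth rates: money growth plus velocity growth
% equals inflation plus real output growth.
\Delta m + \Delta v = \pi + \Delta y
% With velocity roughly stable (\Delta v \approx 0), inflation tracks
% money growth in excess of real growth:
\pi \approx \Delta m - \Delta y
% Fisher equation: the nominal interest rate is the real rate plus
% expected inflation, so lower expected inflation lowers nominal rates.
i = r + \pi^{e}
```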
In 1985, a Ford plant lies idle after being forced to close for a week by adverse economic conditions. The economic recovery of the mid-1980’s was not experienced in all sectors of the economy. (Hulton Archive/Getty Images)
Financial Shocks The macroeconomy was able to enjoy steady expansion after 1982 in spite of significant financial disturbances. One was the Savings and Loan crisis. In 1980, Congress increased the limit for deposit insurance coverage to $100,000. Additional legislation in 1982 widened the opportunity for savings and loans (S&Ls) to provide loans and investments outside their traditional area of household mortgages. Some S&Ls saw this as an opportunity to increase high-risk loans and investments, confident that if these ventures failed, the government would be stuck with most of the cost. S&Ls, with most of their funds invested in long-term home mortgages, were also hard-hit by deposit withdrawals and by competitive pressure to pay higher rates to depositors. In 1989, authorities estimated that seven hundred federally insured S&Ls were insolvent, representing assets of $400 billion. In that year, drastic federal legislation abolished the separate agency
insuring S&Ls and created the Resolution Trust Corporation (RTC) to sell off the assets of troubled institutions. When the RTC was wound up in 1995, the S&L crisis had cost the government $145 billion. Profits and the Stock Market Corporate profits roughly doubled over the decade of the 1980's, reaching $380 billion in 1989. The steadiness of the business upswing and the decline in interest rates helped boost prices of corporate stock even more—the Standard and Poor's index in 1989 was three times what it had been a decade earlier. The stock boom was severely interrupted, however, when the market crashed in October, 1987—the largest one-day decline in stock prices to that point in history. The crash, its run-up, and its aftermath reduced household wealth by an estimated $650 billion, and some economists predicted that a serious recession might result. This did not happen, however. Spending for goods and services showed a
brief and minor interruption; then, boom conditions continued. The stock decline was buffered by Federal Reserve action to raise bank reserves and the money supply and to lower interest rates. International Economic Developments
The 1980's brought radical change to the international economic position of the United States. Since World War II, the United States had normally generated a surplus in its international current accounts. That is, the value of exported goods and services plus the investment income coming into the country generally exceeded the value of goods and services purchased from abroad by Americans and U.S. investments in foreign businesses and governments. In the 1980's, however, despite the decline in petroleum prices, the dollar value of U.S. imports of goods and services grew much more rapidly than the corresponding value of export transactions. As a result, current-account deficits exceeded $100 billion a year from 1985 to 1989. Two main factors were responsible for these deficits. First, the business-cycle upswing was more vigorous in the United States than in other major trading countries. Second, profits and the overall investment climate in the United States were so attractive to foreign investors that a vast flow of international investment entered the country, providing funds to buy imports without foreign-exchange complications. As a result, the United States became, from the mid-1980's, a net debtor on international account. These developments were highly controversial. Complaints were numerous from the sectors that felt particular pressure from foreign competition, including textiles, automobiles, and steel. Despite these complaints, U.S. foreign and trade policy continued to favor reducing international trade barriers worldwide. The government consistently attempted to influence other nations to reduce trade barriers and open foreign markets to American products. Much of this influence was felt in the activities of the General Agreement on Tariffs and Trade (GATT). Under U.S. leadership, GATT member nations began a new round of multilateral negotiations in Uruguay in September, 1986. A major effect of trade-barrier reduction was to create an atmosphere favorable to rapid economic growth in many low-income countries. This was most notably true for China, which had opened its economy to trade and investment after the death of Mao Zedong in 1976.
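The current-account bookkeeping used in this section can be stated compactly. The symbols below are ours, added for clarity; they do not appear in the source.

```latex
% Current account: exports minus imports of goods and services,
% plus net investment income received from abroad.
CA = (X - M) + NII
% Balance-of-payments accounting requires the current account and the
% net capital inflow to offset each other:
CA + KA = 0
% so the 1980's pattern of CA < 0 was necessarily matched by KA > 0,
% the inflow of foreign investment that financed U.S. imports.
```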
Impact The favorable economic climate of the 1980's worked to the advantage of the Republican Party, which achieved comfortable victories in the presidential elections of 1984 and 1988. In 1987, Alan Greenspan took over as head of the Federal Reserve. Monetary policy continued to be well managed. The inflation rate and interest rates continued to decline. Prosperity appeared to affirm the Republicans' prevailing philosophy, giving greater emphasis to economic freedom and enterprise and less to big government. These developments were reinforced by the collapse of the Soviet empire in 1989 and by the steady expansion of economic cooperation in the European Community (later the European Union). Japan's economy lost its luster, entering a stagnation period lasting more than a decade. The economy began to sour somewhat after President Reagan left office, however, and economic woes were the primary factor in President George H. W. Bush's failure to win a second term in 1992. Further Reading
Blank, Rebecca M. Do Justice: Linking Christian Faith and Modern Economic Life. Cleveland: United Church Press, 1992. Identifies moral shortcomings of the U.S. economy, based mainly on the developments and conditions of the 1980’s. Dertouzos, Michael L., Richard K. Lester, and Robert M. Solow. Made in America: Regaining the Productive Edge. New York: Harper & Row, 1989. Reviews the history and condition of a number of major American industries to judge whether the United States is systematically falling behind. Also reviews government-industry relations and the nation’s facilities for education and training. Friedman, Benjamin. Day of Reckoning: The Consequences of American Economic Policy Under Reagan and After. New York: Random House, 1988. Argues that economic policy put present comforts ahead of provision for the future. Evidence is increased public debt, decreased international assets, and preference for consumption over investment. Hughes, Jonathan, and Louis P. Cain. American Economic History. 4th ed. New York: HarperCollins, 1994. This college text provides a good description and analysis of the transition from an industrial economy to a service economy. Page, Benjamin I., and James R. Simmons. What Government Can Do. Chicago: University of Chicago Press, 2000. Objective, historical, descriptive assessment of government economic programs,
particularly those relating to poverty. Phillips, Kevin. The Politics of Rich and Poor: Wealth and the American Electorate in the Reagan Aftermath. New York: Harper & Row, 1990. Argues that public policies of the 1980's drastically shifted income from the poor to the rich. Reich, Robert. The Work of Nations: Preparing Ourselves for Twenty-First Century Capitalism. New York: Knopf, 1991. Explores the increasing complexity of business operations as individual functions and services become globalized. Sees the world economy evolving into three kinds of work: routine production services, in-person services, and symbolic-analytic services.
Wattenberg, Ben. The Good News Is the Bad News Is Wrong. New York: Simon & Schuster, 1984. Lively journalistic refutation of negative representations of American society and economy. Perceptive observations on poverty, homelessness, and unemployment. Paul B. Trescott See also Agriculture in the United States; AT&T breakup; Demographics of the United States; Income and wages in the United States; Inflation in the United States; Poverty; Recessions; Savings and Loan (S&L) crisis; Unemployment in the United States; Unions.
C ■ Cabbage Patch Kids
Definition Children's dolls
Manufacturer Coleco (licensee 1982-1989)
The Cabbage Patch Kids were one-of-a-kind, soft-sculpture, needle-art dolls that were sold with names and birth certificates. A major fad of the mid-1980's, the Cabbage Patch Kids recorded sales of nearly 3 million units in 1983, a first-year doll-sale record that exceeded the previous record by more than 1 million dolls.
Profits from the Cabbage Patch Kids and innumerable tie-in products like clothing, accessories, games, and books resulted in one of the greatest modern rags-to-riches stories, making the dolls' impoverished Georgia creator, Xavier Roberts, a multimillionaire. Driving consumer demand was an inspired marketing concept: Each doll was unique, thanks to a computerized creation process that produced variations in hair, eye, and skin colors and other facial characteristics. Moreover, a cabbage patch birth story and an adoption oath accompanied and humanized each doll, and each one also featured Roberts's signature as a mark of authenticity. Through Roberts's tireless promotional efforts, the Cabbage Patch Kids received unprecedented free publicity, appearing on children's television programs and on network programs such as the Today show and Johnny Carson's The Tonight Show. The dolls received national news coverage when they were presented to children at the White House and when celebrities "adopted" them. As demand for the Cabbage Patch Kids exploded during the 1983 Christmas season, Coleco chartered Boeing 747's to airlift dolls from Asian factories, an event that generated even more publicity but did not fully satisfy demand. Shoppers waited in lines for hours, and stampedes occurred in department stores as consumers fought to grab the coveted dolls. In one store, dolls were snatched off shelves in thirty-six seconds. Some stores held lotteries to distribute the scarce supply, while others placed limits on the quantity of dolls dispensed to each customer. A father made headlines by flying to London to buy a doll when he could not obtain one in the United States. Scalpers sold dolls for outrageous prices, with one doll reportedly selling for nearly one hundred times Coleco's retail price of $27.99. At the height of the 1983 buying mania, Coleco canceled all paid advertising, resulting in an industry-low advertising expenditure of less than $500,000 for a toy introduction.
A Cabbage Patch Kid is displayed with her birth certificate and adoption papers. (Hulton Archive/Getty Images)
Throughout the 1980's, sales remained unusually high for the Cabbage Patch Kids, fueled by Roberts's marketing genius, including the 1985 publicity coup of sending an astronaut Cabbage Patch Kid into outer space. Impact With 65 million dolls sold throughout the 1980's, the Cabbage Patch Kids are considered one of the most successfully marketed dolls in the toy industry. The dolls fulfilled the usually contradictory criteria for toy-sale success: They were steady sellers, purchased year after year in relatively predictable quantities, and they were a fad that required a high level of promotion yet brought in significant profits. In addition, the dolls proved that computer technology could be used to create one-of-a-kind, mass-produced products, and the uniqueness of each unit could be used effectively as a marketing device to drive mass consumer demand. Further Reading
Hoffman, William. Fantasy: The Incredible Cabbage Patch Phenomenon. Dallas: Taylor, 1987. Lindenberger, Jan. Cabbage Patch Kids Collectibles: An Unauthorized Handbook and Price Guide. Atglen, Pa.: Schiffer, 1999. Official Cabbage Patch Kids Web Site. http://www.cabbagepatchkids.com. Sullivan, Kathleen A. "As Luck Would Have It: Incredible Stories from Lottery Wins to Lightning Strikes." Library Journal 129, no. 7 (April 15, 2004): 148. Taylor Shaw See also
Advertising; Children’s television; Computers; Consumerism; Fads; Toys and games.
■ Cable television Identification
Television distribution system in which programming is delivered to subscribers from a centralized provider by cable
Cable television greatly expanded the number of channels and program choices available to viewers, whose willingness to pay for the service radically changed the medium's revenue model. In addition to paying monthly fees, cable subscribers developed new habits of viewing to which advertisers in the 1980's had to respond. The relaxation of federal rules regulating cable television, the improvement of satellite delivery
systems, and the expansion of the cable infrastructure throughout the United States combined with American consumers' enthusiasm for choice to fuel cable's growth in the 1980's. By the end of 1983, 40 percent of American television households had cable; by 1990, cable reached 60 percent of those households. The success of cable television stations made them effective competitors with the Big Three broadcast networks—the Columbia Broadcasting System (CBS), the National Broadcasting Company (NBC), and the American Broadcasting Company (ABC). Even the minimum tier of programming, basic cable, offered multiple channels, usually including broadcast stations, in return for a set monthly fee. Premium, or pay, cable channels required a fee in addition to the monthly cost of basic cable. The majority of cable channels that entered the market between 1980 and 1989 were offered on basic cable. Cable programmers mined television audiences for specific interests and demographics and created dedicated cable channels with programming designed to appeal directly to those more limited audiences—a practice known as narrowcasting. Advertisers quickly recognized the potential of narrowcasting, which allowed them more easily to tailor commercials to particular demographics. The response was overwhelming: Viewers wanted around-the-clock information and entertainment and became cable subscribers, while advertisers leapt at the chance to spend less money to reach more specific audiences. News, Music, and Sports In 1980, Ted Turner launched the Cable News Network (CNN), a twenty-four-hour news channel. Broadcast news organizations dismissed CNN, but Turner correctly identified a public hunger for news and information. CNN gained legitimacy through instant coverage of news events such as the attempted assassination of President Ronald Reagan in 1981. In 1981, CNN expanded to launch CNN2, later renamed CNN Headline News. The network built a global presence as well, with CNN International launching in Europe in 1985. By 1989, it broadcast in Africa, Asia, and the Middle East as well. CNN turned a financial corner in 1985, when five years of losses turned into $13 million in profit. In 1987, President Ronald Reagan, known as the Great Communicator, held an end-of-the-year press conference with anchors from the Big Three networks and Bernard Shaw of CNN, demonstrating the network's achieved legitimacy.
In 1981, MTV (music television) was launched with the iconic symbol of an astronaut planting the MTV flag on the moon. Geared to a generation of teens and young adults who had been raised on television and rock and roll, MTV showed music videos—a new type of programming combining popular songs with image tracks. Young "veejays" (video deejays) hosted the network's programs and introduced the videos, in much the same fashion as their radio counterparts. Music artists collaborated with filmmakers to transform the music video from promotional marketing to artistic expression. As MTV's popularity grew, the channel became an arbiter of young-adult tastes and trends, influencing 1980's American culture generally. It therefore began to be targeted by organizations worried about its lack of diversity. Others criticized the distinctive, fast-paced editing style of MTV's programs and videos, which they believed had a negative effect on teen viewers' attention spans. Throughout the 1980's, MTV continued to reinvent itself, recognizing changing trends in music and producing its own original programming. In 1985, MTV created VH-1 (later VH1), a second music channel featuring music for baby boomers. ESPN started as the Entertainment & Sports Programming Network, a twenty-four-hour channel devoted to sports. The network met the challenge of programming twenty-four hours a day by covering international events and obscure sports such as the "World's Strongest Man" competition. Its programming expanded significantly beginning in 1984, when the network was acquired by ABC, which had significant sports resources, including both rights to cover future events and a library of past, "classic" sports coverage. In 1985, the sports network's name was officially changed simply to ESPN, which went on to become a respected brand name in sports broadcasting. In 1987, ESPN came of age when it concluded a deal for partial broadcast rights with the National Football League (NFL). Children's Programming and Home Shopping
Nickelodeon was an early cable presence with children's programming. It began as a local cable channel called Pinwheel. Pinwheel became a national cable channel in 1979 and changed its name to Nickelodeon in 1981. Recognizing that its target audience went to sleep early, the network modified its programming
in 1985. It continued to broadcast children's shows during the day, but at night it broadcast reruns of old television shows that would appeal to parents nostalgic for their own childhood. The nighttime broadcast was labeled Nick at Nite. Nickelodeon, MTV, and VH-1 were owned by the same company, Warner-Amex Satellite Entertainment. In 1985, they were acquired by Viacom. In 1983, the Disney Channel launched and brought favorite Walt Disney characters such as Mickey Mouse and Donald Duck to a new generation. Home Shopping Network (HSN) began as a local cable-access venture selling surplus items. It found a home on national cable systems, however, providing the home shopping experience almost twenty-four hours a day and eventually developing products that were available exclusively through the network. A second home shopping channel, QVC (quality, value, convenience), launched in 1986. Home shopping networks offered a department store's variety of items for sale—jewelry, apparel, kitchenware, tools—with sales that lasted for hours or minutes, thus encouraging buyers to make impulse purchases. More Choices Three Spanish-language stations—Telemundo, Galavision, and Univision—addressed the growing Latino population in the United States with programming geared toward Latino cultures and concerns. Univision was the first company in the United States authorized to receive and rebroadcast foreign television programming via satellite. Telemundo began broadcasting in 1987 with world and national news programs. The Discovery Channel aired documentaries and other nonfiction programming, primarily about the natural world. Bravo and the Arts and Entertainment Channel (A&E) concentrated on film, drama, documentaries, and the performing arts. Lifetime focused on women's programming and health issues. Black Entertainment Television (BET) started in 1980, broadcasting programs geared toward African Americans, such as music videos featuring black artists. Turner Broadcasting System (TBS) launched Turner Network Television (TNT) in 1988 with sports and colorized movies. Religious Programming Television ministries were also beneficiaries of cable's growth in the 1980's. Cable television made stars out of charismatic Christian ministers Jim Bakker, Jimmy Swaggart, and Pat Robertson. Robertson, an entrepreneur as well as a
popular preacher, built the Christian Broadcasting Network (CBN), whose cable arm later became the Family Channel. Jim Bakker and his wife Tammy Faye Bakker headed PTL (Praise the Lord) television, which was carried by twelve hundred cable systems. In the late 1980's, Bakker and Swaggart were forced from their ministries by financial and sex scandals, and Bakker was convicted of fraud in 1989. The power and reach of the so-called televangelists' television shows made their subsequent falls from grace into national news. Premium Cable Home Box Office (HBO) was one of the first premium cable television services, and the network moved into the 1980's with the ability to increase its reach to households through satellite technology. It also expanded its programming to include made-for-cable movies as well as theatrical releases that had not yet been released on video. HBO viewers were film buffs who wanted to watch commercial-free, unedited motion pictures. Rival network Showtime aired similar programming but was one-third the size of HBO, which reached roughly 9.3 million viewers in 1983. HBO launched a sister movie channel, Cinemax, in 1980 and followed it with the Comedy Channel in 1989. As HBO and Showtime evolved, both channels designed programming to maintain their subscriber base. Many cable networks were start-up companies with necessarily lean budgets. They broadcast primarily inexpensive programming, such as syndicated shows, old movies, and talk shows. By contrast, broadcast programming development and production were time-consuming, labor-intensive, and costly. In 1985, the broadcast networks' revenues fell for the first time in their history. Between 1980 and 1989, the proportion of households subscribing to cable grew to 59 percent, and the number of viewers watching network television fell by 15 percent. The networks found themselves vulnerable to takeover, and by 1986 each of the Big Three had changed ownership. The traditional broadcast networks were institutions incapable of quick adjustments in the changing television marketplace. They were outmaneuvered by cable. Impact In 1981, an average of nine broadcast stations were available to television households. By 1989, an average of twelve broadcast stations and thirty channels were available to the same television household. In ten years, a radical transformation had taken place. The effects of cable television
filtered through news, entertainment, and the American consciousness. MTV overcame the music industry's initial reluctance to produce music videos. Indeed, the industry became dependent on the music channel to introduce new artists and to revive the careers of older artists. Pop cultural icons such as Madonna and Michael Jackson used music videos and personal appearances on MTV to facilitate transitions in their careers. The visual aesthetic cultivated on MTV—fast-paced montage editing and a frenetic overall style—spread throughout television, film, and commercials. The 1984 NBC series Miami Vice owed its visuals and use of current music to the cutting-edge music channel. A new generation of directors achieved their first successes in the music-video industry before moving into film. Such motion pictures as Flashdance (1983) and Footloose (1984) owed their success in part to their ability to cultivate a music-video look and appeal to a young audience that demanded a fresh approach to the movie musical. MTV's cultural effects went beyond the aesthetic. Both the network and the music industry faced controversy over the language and sexual content of videos that were consumed largely by minors. Concerned parents even enlisted support from Congress. When MTV responded by censoring videos, it was accused of bending to outside pressures and political correctness. Sociologists also attributed shortened attention spans to MTV's fast-paced content. Rupert Murdoch and News Corporation stepped into the chaos of broadcast networks' decline to launch the FOX network, an alternative network with cutting-edge programming designed to appeal to a younger audience. The cable era set into motion a frenzied period of media mergers, as entertainment corporations acquired companies with a cable presence, combined broadcast networks with cable channels, and built organizations to accommodate the vertical integration of entertainment products. The broadcast networks were portrayed as dinosaurs, lacking mobility and facing extinction. In response, the networks adapted by copying cable television's most successful programming and targeted a younger generation of viewers. Cable made huge inroads into viewership, but no individual cable network approached the Big Three in terms of the sheer number of television households they reached. Eventually, cable experienced growing pains, and
the growth in viewership stalled. Viewers complained of repetitious programming. In response, cable companies instituted original programming. HBO and Showtime produced series and original movies for television. Nickelodeon commissioned original animated series for children. MTV added so-called reality series. The pressure to maintain growth was unending. Television viewers with cable access were no longer dependent on network evening news broadcasts, whose audience declined steadily. Viewers preferred up-to-the-minute news that could be tuned in at any time. Large corporations took over the networks and enforced cost-cutting measures in the network news operations. The networks jettisoned foreign bureaus. CNN, with an expanding international presence, became a primary source for news around the world. Critics described CNN as "crisis news network," because audiences increased during crisis coverage, a type of coverage at which twenty-four-hour news channels excelled. During slow news cycles, however, CNN still had to fill its airtime, and critics complained about the network's tendency to do so with less weighty lead stories reminiscent of those found in the tabloids. Television viewers developed new habits of program selection that were troublesome to commercial television broadcasters. Channel surfing, or channel grazing—switching from channel to channel to avoid commercials or select different programming—became a common habit as more channels became available. Advertisers could no longer count on their commercials being seen by the majority of a channel's viewers. However, the ability to narrowcast on specialized channels combined with the reduced expense of placing commercials on cable television helped convince advertisers not to abandon television advertising. Indeed, as they recognized the growing power of cable television, they accelerated spending in cable markets. From 1980 to 1989, advertising dollars spent on cable increased from $53 million to $1.5 billion. Further Reading
Auletta, Ken. Media Man: Ted Turner’s Improbable Empire. New York: W. W. Norton, 2004. Personal portrait of Ted Turner. _______. Three Blind Mice: How the TV Networks Lost Their Way. New York: Random House, 1991. Details the factors that led to the precipitous decline of
broadcast network viewership. Excellent behind-the-scenes descriptions. McGrath, Tom. MTV: The Making of a Revolution. Philadelphia: Running Press, 1996. Inside story of MTV from its inception through 1992. Roman, James. Love, Light, and a Dream: Television's Past, Present, and Future. Westport, Conn.: Praeger, 1996. Perspective on different eras of television. Vane, Edwin T., and Lynne S. Gross. Programming for TV, Radio, and Cable. Boston: Focal Press, 1994. Excellent information on ratings and networks. Nancy Meyer See also
Bakker, Jim and Tammy Faye; Children’s television; CNN; Colorization of black-and-white films; Flashdance; Home shopping channels; Madonna; Miami Vice; MTV; Music videos; Televangelism; Turner, Ted.
■ CAD/CAM technology Definition
Employment of computers to aid in industrial or architectural design and to guide the automated manufacture of parts or commodities
Developments in CAD and CAM technology in the 1980's streamlined the manufacturing process by expediting the design, analysis, testing, documentation, and manufacturing of products and parts. CAD and CAM also utilized databases and innovative networking systems, such as Ethernet, in order to increase the efficiency of all elements involved in the process, including engineers, suppliers, managers, craftsmen, factory supervisors, materials handlers, factory layout, and machines. CAD (computer-aided design) technology allows designers to develop precise plans and schematics for everything from snack foods and pharmaceutical pills to mechanical parts and buildings. CAM (computer-aided manufacturing) technology helps automate the realization of such schematics by providing instructions to automated machines that carry out repetitive assembly and manufacturing tasks.
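The division of labor just described, with CAD producing a geometric description and CAM turning it into machine instructions, can be made concrete with a small sketch. The Python fragment below is our illustration, not code from the source or from any 1980's product; the part outline and feed rate are hypothetical. It emits G-code, the standard numerical-control language, in which G0 is a rapid positioning move and G1 a straight cutting move at a set feed rate.

```python
# Minimal CAD-outline-to-G-code translation (illustrative only).
OUTLINE = [(0, 0), (50, 0), (50, 30), (0, 30), (0, 0)]  # hypothetical part, in mm

def to_gcode(points, feed_mm_per_min=300):
    """Emit G-code that traces a closed outline with a cutting tool."""
    program = ["G21 ; millimeter units", "G90 ; absolute coordinates"]
    x0, y0 = points[0]
    program.append(f"G0 X{x0} Y{y0} ; rapid move to the start of the cut")
    for x, y in points[1:]:
        program.append(f"G1 X{x} Y{y} F{feed_mm_per_min} ; straight cutting move")
    return "\n".join(program)

print(to_gcode(OUTLINE))
```

Real CAM systems of the period added tool-diameter compensation, depth passes, and machine-specific postprocessing, but the basic translation from geometry to motion commands is the same.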
These technologies were born from computer graphics research in the 1950's. However, as late as the mid-1980's, the practical application of these technologies faced many obstacles: Two major and closely related problems were the expense of computer technology and the lack of incentive for managers and chief executive officers (CEOs) to implement the technology. Doing so was a big risk for some firms and a drastic change for all of them. In 1962, Ivan Sutherland at the Massachusetts Institute of Technology (MIT) outlined the need for computer graphics to be used in manufacturing in his doctoral thesis. His contemporary at MIT, Steve Coons, began to market the idea of using computer graphics in engineering as a synergistic process between human and computer. He gave the idea a label: CAD. From Academia to Industry
The push to adopt CAD technology thus began at MIT. Much of the rest of academia embraced the concept, and as early as 1964 the University of Michigan and the University of Detroit sponsored a short course in the new technology, with Coons serving as one of the principal lecturers. MIT, the University of California, Los Angeles (UCLA), and the Tennessee Space Institute followed suit, and by the late 1970's, many of the nation's top engineering schools were teaching short courses in CAD and serving as centers for its research and development. While scholars continued to embrace CAD and CAM, General Motors and Lockheed-Georgia, using hardware manufactured by International Business Machines (IBM), had implemented the first fully functional CAD systems by 1966. Throughout the 1970's, Boeing, the U.S. Air Force (USAF), the National Aeronautics and Space Administration (NASA), and General Dynamics, among a few others, began to implement the technology, with NASA and the USAF developing ways to share graphics data between computers. These were similar to the local area network (LAN) system developed by the Xerox Palo Alto Research Center (Xerox PARC) in the 1970's. In the early 1980's, CAD scholars and users began to develop user manuals for CAD and CAM. By the middle of the 1980's, the Society of Manufacturing Engineers (SME), which has thousands of members in academia and corporate America, coined the term CIM (computer-integrated manufacturing) and established that CAD and CAM technology could be applied to all facets of the manufacturing process. Combined with CAE (computer-aided engineering), which focused on the analysis and testing of a CAD design, CAD, CAM, and CIM formed the C4 concept that revolutionized manufacturing by the end of the 1980's.
Obstacles and Incentives
Though LAN networking made the process more practical and computers had become somewhat more affordable by the early 1980's, most CEOs and business managers still did not believe CAD and CAM implementation to be cost effective. Many saw the technology as something only giant companies such as General Motors, Boeing, McDonnell Douglas, and Lockheed could afford. For businesspeople, the incentive for CAD and CAM began in the late 1970's through the early 1980's, as inefficiency in U.S. manufacturing threatened to weaken the economy relative to Japan and Western Europe. This danger was partly due to waste and profit loss caused by overproduction, poor quality, high design costs, frugal consumers, lengthy production times, a lack of synergy between small suppliers and large manufacturers, and a sudden consumer demand for smaller, more fuel-efficient automobiles. Managers understood that CAD and CAM could begin to remedy some of these problems, but the extra financial burdens and the organizational stress of adopting these robust systems weighed against their wholehearted acceptance. Three specific financial drawbacks were the costs of implementation, training, and system maintenance. Because these processes could cost much more than the initial purchase of the CAD/CAM hardware and software and because management had to restructure entire companies around the new processes, adopting those processes was a precarious decision, and many decided against it. Furthermore, there was an ethical problem to consider. CAD/CAM scholars and business managers knew that drafting and materials-handling jobs would be lost to machines. That ethical dilemma has never been solved; however, the financial concerns were reduced by the late 1980's. Vast improvements in networking, chip design, processor speed and capacity, software capabilities, and computer design made adopting the C4 system an easier decision. The prices of software and hardware began to decrease, and implementation, training, and maintenance became less costly as they grew slightly less time-consuming and confusing. Furthermore, the success of the first companies to adopt the technology in the mid-1980's provided an added incentive for their competitors to emulate them in the late 1980's.
Impact CAD, CAM, CAE, and CIM became necessary technologies for nearly all manufacturers, from small independent contractors to large corporations. CAD and CAM reduced errors in transcription, documentation, design, analysis, and cost prediction. It became easier for engineers to see how components and subsystems of a product would interact before the product was built, and teamwork increased within and among project groups, thereby increasing the efficiency of each new process from design to delivery. Developments in CAD and CAM software directly resulted in the practical use of robots, rapid prototyping (the automatic creation of a model based on a CAD design), and virtual reality in manufacturing. Further Reading
Bowman, Daniel. The CAD/CAM Primer. Indianapolis: Howard W. Sams, 1984. A dynamic instructional publication on the basics of CAD and CAM for the entrepreneurial engineer; outlines the effectiveness of the technology along with the problems of its implementation. Groover, Mikell, and Emory Zimmers. CAD/CAM: Computer-Aided Design and Manufacturing. Englewood Cliffs, N.J.: Prentice Hall, 1984. Thorough and lucid review of the applications of CAD and CAM, including numerical-control programming, quality control, graphics software and databases, robot technology, and inventory management. Machover, Carl. The CAD/CAM Handbook. New York: McGraw-Hill, 1996. Extensive history of the C4 concept focused on the applications of CAD, CAM, CAE, and CIM. Troy Place See also
Apple Computer; Business and the economy in Canada; Business and the economy in the United States; Computers; Inventions; Microsoft; Robots; Science and technology; Virtual reality.
■ Caffeine Definition
Popular stimulant found in beverages such as coffee, tea, soft drinks, and cocoa
The United States is one of the world's major consumers of caffeine. In the 1980's, the psychoactive properties of the drug were both hailed and criticized by various institutions. Beverage companies also actively promoted and marketed caffeine, causing a surge in its popularity by the end of the decade.
Colonial European Americans primarily drank tea, until the British placed a tax on it prior to the American Revolution. Coffee then became the drink of choice, because the product could be imported from Caribbean and Central American plantations by the early nineteenth century, whereas tea came from Asia and other lands whose trade was dominated by Great Britain. The advantages of consuming caffeine include increased energy, greater physical endurance, improved memory, and the ability to complete tasks more quickly. Caffeine can also be used as an analgesic in combination with aspirin or other drugs to help control pain. In the 1960's, the quest for self-awareness and the psychoactive qualities of caffeine caused a growth in communal coffeehouses in major U.S. cities. The boom in consumption slowed, however, and the coffee industry experienced a slump in sales by the beginning of the 1980's. Around the same time, the medical community began to publish research data about the adverse effects of overconsuming caffeine, usually defined as ingesting more than 650 to 800 milligrams per day. Empirical research showed that global consumption of soft drinks increased by 23 percent from 1960 to 1982, possibly indicating physical dependence. Habitual caffeine drinkers can experience minor withdrawal symptoms without caffeine, including headaches and sleeplessness. Some other studies concluded that chronic consumption of caffeine could be associated with irregular heartbeats, higher levels of cholesterol, and bladder cancer in males. In 1980, the U.S. Food and Drug Administration (FDA) issued a warning, calling for pregnant women to restrict or eliminate coffee consumption. Before the 1984 Olympics, an international committee placed caffeine on the list of banned substances. The beverage industry soon began an aggressive marketing campaign to offset the harmful effects portrayed in research and in the media. As with wine, later studies provided evidence that moderate consumption of caffeinated beverages could be beneficial to health, while confirming the detrimental effects of overuse.
Impact Beginning in 1987, Starbucks became a popular national coffeehouse chain, driving an American cultural obsession. Indeed, the American demand for coffee became so great that major chains such as Starbucks, Coffee Bean and Tea Leaf, Seattle's Best Coffee, and Caribou Coffee were able to coexist with many local mom-and-pop establishments, rather than driving them out of business. By the late 1980's, the national and local coffeehouses began to offer decaffeinated or caffeine-free beverages in an effort to make themselves into social destinations for everyone, regardless of caffeine consumption. Debate continues, meanwhile, regarding the positive and negative effects of caffeine on the body. Further Reading
Gilbert, Richard J. Caffeine: The Most Popular Stimulant. New York: Chelsea House, 1986. Gupta, B. S., and Uma Gupta, eds. Caffeine and Behavior: Current Views and Research Trends. New York: CRC Press, 1999. James, Jack E. Caffeine and Health. New York: Harcourt Brace Jovanovich, 1991. Schultz, Howard, and Dori Jones Yang. Pour Your Heart into It: How Starbucks Built a Company One Cup at a Time. New York: Hyperion, 1997. Gayla Koerting
See also
Consumerism; Health care in the United States; Starbucks.
■ Cagney and Lacey Identification Television police series Date Aired from 1982 to 1988
A critically acclaimed police series about two female detectives in New York City, Cagney and Lacey focused on the experiences of female characters working in a male-dominated occupation. Many of the show's story lines also dealt with social issues predominantly faced by women of the 1980's. Cagney and Lacey, which aired on the Columbia Broadcasting System's television network (CBS-TV) from March 25, 1982, to May 16, 1988, was a drama about the careers and personal lives of two female New York City police detectives. Created by Barbara Corday and Barbara Avedon in 1974, the story was originally designed to be a feature film. The writers were unable to sell the story to a movie studio, however, and in 1981 the project was made into a television movie for CBS starring Loretta Swit as Christine Cagney and Tyne Daly as Mary Beth Lacey. The enormous popularity of the television movie led to the creation of the series in 1982. Because Swit had other acting commitments, the series debuted with Meg Foster replacing Swit as Cagney. Almost immediately, however, the show came under fire from CBS executives, who were concerned that Foster's portrayal would be interpreted by viewers as having homosexual overtones. The network threatened to cancel the series if Foster was not replaced. At the beginning of the 1982 television season, Sharon Gless took over the role of Christine Cagney, a career-minded, single police detective, while Daly continued in the role of Mary Beth Lacey, a police detective, wife, and mother. The show simultaneously traced the personal and the professional lives of each detective, while bringing attention to contemporary social problems, most often related to women, such as rape, abortion, and breast cancer. Network executives, seeking to minimize controversy, were in continual negotiation with the show's writers and producers about how these issues should be presented.
Sharon Gless (left) and Tyne Daly as Christine Cagney and Mary Beth Lacey in Cagney and Lacey. (Hulton Archive/Getty Images)
At the same time that the show struggled with controversial story lines, it also struggled to stay on the air. In 1983, CBS canceled the series because of poor ratings. After fans responded with a massive letter-writing campaign, the network brought the detective series back for a second season. During the show’s six-year run, its popularity continued to grow. In total, the drama earned thirty-six Emmy nominations and won fourteen of the awards, including four Emmy Awards for Daly and two for Gless. Impact
Cagney and Lacey made television history in the 1980’s as one of the first television shows to feature women in a predominantly male occupation. Just as controversial were the weekly story lines. For women in the 1980’s, Cagney and Lacey reflected the rapidly changing roles of women in American society.
Further Reading
D'Acci, Julie. Defining Women: Television and the Case of Cagney and Lacey. Chapel Hill: University of North Carolina Press, 1994. Thompson, Robert J. Television's Second Golden Age. Syracuse, N.Y.: Syracuse University Press, 1997. Bernadette Zbicki Heiney See also Television; Women in the workforce; Women's rights.
■ Camcorders Definition
Portable camera and videocassette recorders
Camcorders changed the nature of filming and provided professionals and consumers alike with a handy video device. Before the 1980's, portable videotaping equipment, such as the Sony Portapak, had reel-to-reel tapes and was very bulky. The videocassette recorder (VCR), introduced in the Betamax format by Sony in 1975 and brought to the U.S. market in the Video Home System (VHS) format by the Radio Corporation of America (RCA) in 1977, was promoted for taping television programs. In the early 1980's, electronics companies produced video cameras that contained videocassette recorders—for both television production and for home use. The first camcorders were developed for television filming. At the 1981 and 1982 conventions of the
National Association of Broadcasters, a number of leading manufacturers, including RCA, Panasonic, Sony, and Bosch, displayed new portable television camcorders, which weighed between thirteen and twenty-three pounds. Many improved models followed. Technical issues to be resolved included size and weight, quality of the images, and standardization of tape formats. The earliest professional camcorders used pickup tubes to translate light into electrical energy. In 1984, solid-state cameras in which charge-coupled devices (CCDs) replaced tubes appeared. These CCD microchips allowed for improvements in image clarity and the size and weight of the camcorder and permitted taping in low light. By the mid-1980's, Sony had triumphed in the competitive market with its Betacam—released in 1982—which used cassettes the same size as its Betamax. The Betacam employed a component system that recorded chrominance (color) and luminance (brightness) signals separately to produce a high-quality picture. Sony continued to make technical improvements in the Betacam and in 1986 developed the metallic SP ("superior performance") tape with 340 lines of resolution. Consumer Camcorders In the early 1980's, both JVC and Sony marketed cameras that attached to a portable VCR unit. In 1982, JVC introduced a small camcorder with the compact VHS-C cassette, which fit into an adapter for playback in a VHS VCR. A few months later, Sony introduced the Betamovie (model BMC-110), in Beta format. The Sony camera allowed only recording, with neither an electronic viewfinder nor a playback feature. These early personal camcorders typically rested on the shoulder, as they could not be held by one hand. Engineers responded to technical issues with the consumer camcorders as they had with professional camcorders. Though much smaller than the broadcast camcorders, the early personal camcorders were unwieldy. Manufacturers, beginning with Kodak in 1984, developed smaller camcorders that recorded on 8mm tape, in contrast to the 12.7mm (half-inch) tape of both Beta and VHS. In 1986, Sony introduced the 1.74-pound Handycam. The camera had to be connected directly to a television for playback, as neither VHS nor Betamax recorders accepted 8mm tapes. Technical improvements throughout the decade followed, and by the end of the 1980's, consumers had a choice between two types of camcorders: low-band and high-band.
High-band recorded a greater range of luminance. While low-band gave 250 lines of resolution, high-band provided 400—a sharper picture. VHS, Beta, and 8mm tapes and their subtypes were low-band. In 1987, JVC developed a camcorder using a high-band S-VHS tape with a 160-minute length. Unlike the low-band formats, high-band enabled copying with little loss of quality. Home users were not as interested in this format, as it was more costly, though it caught on with professional and industrial users. Camcorders quickly became very popular with consumers, as they eliminated the chore of threading film into movie cameras and projectors. Other attractive features of the camcorder included the instant availability of the video; the possibility of erasing, editing, and duplicating the video with relatively inexpensive equipment; and the ease of viewing. Moreover, a reel of film in home movie cameras allowed only 3 minutes of filming; even the earliest personal camcorders permitted between 40 and 120 minutes of video. The low-band camcorders differed significantly from the broadcast-quality camcorders. First, their image quality was poorer, partly because the recording drum heads rotated much more slowly. Second, they recorded all information in one signal. They were, of course, much smaller, lighter, and less expensive than the broadcast models.
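The component recording that set the Betacam apart, mentioned above, kept brightness and color information on separate tracks instead of mixing them into one composite signal. The sketch below is our software illustration of that separation, using the luma weights later codified in the ITU-R BT.601 standard; actual camcorders performed it in analog circuitry, and the function name is ours.

```python
def rgb_to_components(r, g, b):
    """Split an RGB sample (each value 0.0-1.0) into a luminance signal Y
    and two color-difference signals, the three components that a
    component-video recorder stores separately."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luminance (brightness)
    return y, r - y, b - y                 # Y, R-Y, B-Y

# Pure green is bright (high Y) with strongly negative color-difference terms.
print(rgb_to_components(0.0, 1.0, 0.0))  # (0.587, -0.587, -0.587)
```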
Impact The broadcast camcorder allowed for greater flexibility in electronic news gathering (ENG). By the end of the decade, Sony's Betacam had captured most of the ENG market. The highly portable camcorders allowed for unobtrusive filming of events as they happened and enabled a single person to replace the three people needed by previous systems when filming on-site. The camcorder had political uses. For example, professional camcorder filming of the overthrow of Nicolae Ceauşescu in Romania at the end of 1989 offered an alternative perspective to the official account. The camcorder proved valuable for "guerrilla television," films produced by political activists. It also spawned new forms of entertainment and shows, including reality programs such as America's Funniest Home Videos and America's Most Wanted that began to appear in the late 1980's. Members of various professions also found uses for the camcorder. A naturalist, for example, could
produce aerial films of a landscape under study. Educators realized the potential impact the technology could have in the classroom. Camcorders could be used for filming medical procedures. Moreover, the camcorder with the CCD permitted the development of more useful surveillance and security cameras for crime control. Further Reading
Abramson, Albert. The History of Television, 1942 to 2000. Jefferson, N.C.: McFarland, 2003. A fairly technical study of the history of the equipment used to provide television programs. Clifford, Martin. The Camcorder: Use, Care, and Repair. Englewood Cliffs, N.J.: Prentice Hall, 1989. An instructional book from the 1980’s that describes the equipment and add-on devices then available. Explains the technology and science of the camcorder and provides instructions on how to use it. Dovey, Jon. “Camcorder Cults.” In The Television Studies Reader, edited by Robert C. Allen and Annette Hill. New York: Routledge, 2004. Examines the use of the camcorder both in home videos and in surveillance. Harding, Thomas. The Video Activist Handbook. 2d ed. Sterling, Va.: Pluto Press, 2001. While most of the discussion in this book focuses on events after the 1980’s, the book illustrates how the camcorder changed reportage and permitted firsthand visual documentation for activists. Kristen L. Zacharias See also
America’s Most Wanted; Hobbies and recreation; Home video rentals; Infomercials; sex, lies, and videotape; Television.
■ Canada Act of 1982 Definition
Act of the British parliament that relinquished all control over Canada's governance and modified Canada's constitution Date Came into force on April 17, 1982 The Canada Act not only made Canada a sovereign country but also provided a codified, legal process for the functioning of the country's federal government and enacted an explicit list of rights of its citizens. The Canada Act of 1982 patriated Canada's constitution from the United Kingdom.
parliament, it was the most important part of the Canadian constitutional process since the British North America Act of 1867 first created a Canadian constitution.
The official proclamation of the Constitution Act, 1982, by Queen Elizabeth II, signed April 17, 1982. (Library and Archives Canada)
Before the 1982 act passed, the involvement of the United Kingdom was necessary for any modification of Canada's constitution, because the British retained formal, legal powers over their former colony. Afterward, Canadians gained complete control of their nation and its laws. The Catalyst for Change More than a century elapsed between the creation of a constitution for Canada in 1867 and its patriation. The exceptionally long delay had little to do with British opposition. The several amendments made to the constitution
were automatically adopted by the United Kingdom's legislature. The primary reason for the extensive time lag was the substantial disagreement among Canadians regarding a process for amending their constitution. In particular, they failed to agree on the degree to which the federal government needed the provinces' approval for changes prior to sending the British a formal request for an amendment. Canada's efforts finally to patriate its constitution were precipitated by Quebec's referendum on independence in 1980. In an attempt to persuade the Québécois to vote against the measure, Canadian prime minister Pierre Trudeau stated that he would pursue constitutional changes as a method to address their concerns. After the referendum failed, he initiated a process to make significant amendments to the country's constitution, including acquiring complete control over it. A Difficult Process
Trudeau met with the provincial premiers in an attempt to reach an agreement on constitutional changes. The discussions addressed several topics in addition to an amending process. The various parties could not, however, find a resolution to their differences. Only two provincial leaders supported Trudeau’s initial proposal for constitutional reform. After a legal challenge to the proceedings, Canada’s Supreme Court ruled in Reference re a Resolution to Amend the Constitution (1981, also known as the Patriation Reference) that the federal government’s process was strictly legal, but it violated constitutional conventions held by the provinces. In general, this decision meant that the provinces should have input into the amendment process but that the federal government had the authority to demand patriation of and changes to the constitution. After the Supreme Court ruling, Trudeau restarted the process of pursuing constitutional change. Nine of the ten provincial leaders eventually reached an agreement with Trudeau in November, 1981. Only Quebec premier René Lévesque opposed the final constitutional package.
Major Aspects of the Constitution
The legislation approved by the Canadian parliament that adopted the constitutional changes was called the Constitution Act, 1982. It included several important alterations in addition to patriating the constitution from the United Kingdom. The formal approval given by the British parliament is known as the Canada Act, and it includes the entire text of the Constitution Act. With this new legislation, the Constitution Act was added to the Canadian constitution, which, like its British counterpart, is not contained in any one document. The British North America Act of 1867 was renamed the Constitution Act, 1867, and was incorporated with some amendments into the new constitution. Indeed, the Constitution Act, 1982, included a schedule listing thirty other documents that were part of the Canadian constitution, from the Constitution Act, 1867, through the two Constitution Acts of 1975. The Constitution Act, 1982, is divided into two primary parts. One provides an amendment process. It states that changes may occur in one of two
ways. Some changes require the unanimous approval of the federal parliament and all of the provincial legislatures. Most amendments, however, need only the approval of the federal parliament and two-thirds of the provincial legislatures, so long as they represent at least 50 percent of the country's population (the so-called 7/50 formula: at least seven of the ten provinces, together containing at least half of all Canadians). If such an amendment limits the powers of the provincial governments, however, it will not take effect within a province whose legislature dissents from it. The other primary part of the constitution is the Charter of Rights and Freedoms. This charter includes civil and political rights typically associated with Western democracies. Examples include freedom of religion, freedom of expression, freedom to assemble peacefully, freedom of the press, and the right to vote. The charter also includes language rights and mobility rights. Other noteworthy provisions of the constitution require a yearly first ministers' conference, address aboriginal rights and gender equity, and state a commitment to reducing individual and regional economic inequality. Impact The Canada Act of 1982 severed the final, formal ties between the United Kingdom and Canada's domestic politics. More important, it provided a formal process regarding the conduct of federal-provincial relations in Canada and outlined specific rights enjoyed by all Canadians. Despite this progress, Quebec's failure to approve of the constitutional changes meant that its status within Canada remained volatile. Subsequent Events The Canadian government made later attempts to address Quebec's concerns. The Meech Lake Accord was a proposal put forth in 1987 to amend the constitution to recognize Quebec as a "distinct society." Not all of the provinces approved of the amendment. The greatest obstacle to its passage was the concern of aboriginal (or First Nations) peoples that they were not also recognized as having a distinct society. The Canadian government then pursued the Charlottetown Accord in 1992, which addressed the First Nations as well as the Québécois. When submitted for approval in a national referendum, however, it was also rejected. Further Reading
Bothwell, Robert. Canada and Quebec: One Country, Two Histories. Vancouver: University of British Columbia Press, 1995. Transcript of a joint presentation and discussion featuring top scholars on the history of Quebec's relationship with the federal government in Canada. Clarke, Harold D., et al. A Polity on the Edge: Canada and the Politics of Fragmentation. Peterborough, Ont.: Broadview Press, 2000. A comprehensive examination of contemporary divisive issues in Canada and their impact on the country's politics. McRoberts, Kenneth. Misconceiving Canada: The Struggle for National Unity. Toronto: Oxford University Press, 1997. Overview and analysis of the Canadian federal government's efforts to reach a final agreement with Quebec to keep the province within the country. Monahan, Patrick J. Constitutional Law. 3d ed. Toronto: Irwin Law, 2006. Comprehensive study of Canadian constitutional law and history. Includes the complete texts of both the Constitution Act, 1867, and the Constitution Act, 1982. Riendeau, Roger. A Brief History of Canada. New York: Facts on File, 2000. Despite the title, a lengthy and detailed coverage of major issues in Canadian history. Particularly strong discussion of the federal government's relationship with Quebec. Russell, Peter H. Constitutional Odyssey: Can Canadians Become a Sovereign People? 3d ed. Buffalo, N.Y.: University of Toronto Press, 2004. Thorough discussion of the history of Canadian constitutionalism and sovereignty. Bibliographic references and index. See, Scott W. The History of Canada. Westport, Conn.: Greenwood Press, 2001. Another work that explains the primary issues in Canadian history. Also includes a list and brief descriptions of noteworthy people in Canadian history. Kevin L. Brennan See also Aboriginal rights in Canada; Bourassa, Robert; Canada and the British Commonwealth; Canadian Charter of Rights and Freedoms; Lévesque, René; Meech Lake Accord; Minorities in Canada; Quebec referendum of 1980; Trudeau, Pierre.
■ Canada and the British Commonwealth Definition
Diplomatic, cultural, educational, and economic relations among Canada, the United Kingdom, and the other Commonwealth countries
Canada achieved full sovereignty in 1982, but it continued to see its links with the Commonwealth as important to its history and cultural identity. The nation's Commonwealth status also helped provide it with a separate identity from the United States in North America. The British Commonwealth of Nations was formed in 1931 out of the various self-governing former colonies of Great Britain that still owed allegiance to the British monarch as their head of state. Canada was one of the Commonwealth's founding members. After World War II, the Commonwealth was expanded greatly to include the newly independent British colonies of Africa and Asia. Canada remained a senior member. In the 1970's and early 1980's, this seniority was particularly emphasized by the fact that Prime Minister Pierre Trudeau was one of the longest-serving Commonwealth heads of government. CHOGMs
From the beginning of the Commonwealth, the heads of government of all the member nations had met together to discuss matters of mutual concern. For the first four decades, these meetings took place in London; the British prime minister chaired the meetings, and the monarch attended them. Beginning in 1971, however, the meetings were held in other member states, with the host country's leader chairing each meeting. The meetings came to be called Commonwealth Heads of Government Meetings, or CHOGMs. Trudeau chaired a particularly successful meeting in Ottawa in 1973. After Trudeau retired in 1984, Prime Minister Brian Mulroney attended these biennial meetings. Mulroney was a junior prime minister, and some of the African presidents, such as Julius Nyerere of Tanzania and Kenneth Kaunda of Zambia, were much more senior and experienced than he was. Moreover, many of the political battles fought within the CHOGMs of the 1980's centered on Africa, especially the Republic of South Africa, Namibia, and Rhodesia-Zimbabwe. In 1987, Mulroney chaired a CHOGM held in Vancouver and attended by forty-five heads of government. The debate was again dominated by discussion of British prime minister Margaret Thatcher's refusal to engage in further trade sanctions against South Africa. Despite Mulroney's pleas for unity in his opening address, Kaunda and Robert Mugabe bitterly attacked Thatcher. Following a suggestion first made by Trudeau, CHOGMs began to include weekend retreats. In 1987, Mulroney brought the Vancouver conference participants to Lake Okanagan, Kelowna, British Columbia, where they managed to hammer out a communiqué noting the United Kingdom's disagreement with the proposed sanctions against South Africa. One of the accusations that the British made against Canada was that Canadian trade with South Africa had increased 25 percent from 1985 to 1986, whereas British trade had not.
Educational Links In addition to the CHOGMs, Commonwealth education ministers met every four years to coordinate educational links between their nations. Many Commonwealth students studied in Canadian universities. However, in the 1980's, it was becoming increasingly expensive to fund the number of students, especially from developing nations, wishing to study in Canada, Britain, and Australia. A Commonwealth committee under Lord Briggs had investigated the possibility of distance learning. On the basis of this report, the 1987 CHOGM approved the establishment of the Commonwealth of Learning, an institute intended to promote distance learning and to package educational programs using the latest technology and methods. Canada offered a site in Vancouver and was one of the six nations to offer funding on a voluntary basis, giving it a permanent seat on the Board of Governors. The institute was established in 1988, and it was running by 1989, with Lewis Perinbam, a former vice president of the Canadian International Development Agency, serving as chairman of the Board of Governors. Sporting Links
Prime Minister Pierre Trudeau was one of the longest-serving heads of government in the British Commonwealth. (Library of Congress)
Over the last few decades of the twentieth century, sporting links became increasingly important as a vehicle of foreign relations. Although much of Canadian sport has been traditionally linked to that of the United States, Canada has retained strong links with the Commonwealth in athletics and various minor sports, especially through the Commonwealth Games, held every four years, in even-numbered years in which there are no Olympic Games. During the 1980's, there were two such games, in 1982 in Brisbane, Australia, and in 1986 in Edinburgh, Scotland. The Brisbane Games were seen as some of the most successful, with forty-five nations participating. Canadian athletes did not equal their leading position of the previous games at Edmonton, however. Instead, they stood third in the medals table, behind Australia and England. Nevertheless, there were many excellent performances by Canadian athletes. Canadian 1982 gold medalists included high jumpers Milt Ottey and Debbie Brill; Angela Taylor, who won the 100-meter dash; and the women's 4 × 400-meter relay team. Although the 1986 Edinburgh Games were marked by a large boycott by the African members, who were protesting Britain's sympathetic stance toward South Africa, the games themselves were conducted in a
good atmosphere. Canadian athletes did better than in 1982, standing second in the medals table behind England. Milt Ottey won the high-jump competition for the second time; Ben Johnson won the 100-meter dash and helped the 4 × 100-meter relay team win. Canada did especially well in boxing and wrestling, taking home six gold medals in boxing and nine out of the ten awarded in wrestling. Impact Canada continued to forge a separate identity as a member of the British Commonwealth in the 1980’s, especially following passage of the Canada Act of 1982, which patriated the nation’s constitution and made it fully sovereign. The country also continued to attract students and immigrants from a large number of other Commonwealth countries, particularly India, Pakistan, and Caribbean nations. Trends in Canadian trade favored the United States rather than the Commonwealth, as the percentage of Canadian trade conducted with the Americans increased and the percentage conducted with Commonwealth countries decreased. Canada nevertheless maintained financial links to the Commonwealth, especially in the aid it provided to the Commonwealth’s developing nations. Further Reading
Francis, R. Douglas, Richard Jones, and Donald B. Smith. Destinies: Canadian History Since Confederation. 5th ed. Toronto: Harcourt Brace Canada, 2004. This second volume of a two-volume history of Canada provides a thorough overview of the nation’s development since 1867. Hillmer, Norman, and J. L. Granatstein. Empire to Umpire: Canada and the World to the 1990’s. Toronto: University of Toronto Press, 1994. One of the Canada in the World series, it serves as an exhaustive study of Canada’s foreign relations in the 1980’s. McIntyre, W. David. A Guide to the Contemporary Commonwealth. New York: Palgrave, 2001. Includes a succinct section on the Commonwealth’s background, as well as sections on voluntary organizations, sporting links, and business connections. David Barratt See also
Canada Act of 1982; Foreign policy of Canada; Mulroney, Brian; Trudeau, Pierre.
■ Canada and the United States Definition
Diplomatic and economic relations between Canada and the United States
In the 1980’s, U.S.-Canadian relations underwent a revolutionary shift, as Pierre Trudeau’s government, which was less than accommodating to U.S. interests, ended, and the United States found Brian Mulroney’s new ministry to be more open to compromise. The relationship between the United States and Canada is among the closest and most extensive in the world. It is reflected in the staggering volume of bilateral trade between the two countries, the equivalent of $1.2 billion a day in goods, services, and investment income. In addition, more than 200 million people cross the U.S.-Canadian border each year. In contexts ranging from law-enforcement cooperation to environmental cooperation to free trade, the two countries work closely on multiple levels, from federal to local. During the 1980’s, the relationship between the two nations was influenced by the revolutionary nature of a decade that began with an escalation of the Cold War and ended with the toppling of the Berlin Wall and the imminent collapse of the Soviet Union. Early in the decade, during Pierre Trudeau’s ministry, U.S.-Canadian relations were somewhat tense. In the decade’s second half, during Brian Mulroney’s ministry, the two nations concluded a free trade agreement, and in 1989 Canada was admitted as a member of the Organization of American States. The Mulroney government endorsed the George H. W. Bush administration’s invasion of Panama in December, 1989, and Canada would later participate enthusiastically in the U.S.-led alliance in the Persian Gulf War of 1991. Diplomatic Initiatives The tensions over the Trudeau government’s National Energy Program (NEP) and Canadian screening of foreign investment had eased by the time Trudeau left office in 1984. His replacement, Liberal finance minister John Turner, held office for only weeks before Brian Mulroney’s Conservative Party drove the Liberals from power in Ottawa. Mulroney’s arrival and President Ronald Reagan’s reelection were part of a more general international trend toward the political right that ushered in a period of mostly harmonious U.S.-Canadian relations, as Reagan’s Republican government
was much more comfortable with Mulroney's Conservative leadership than it had been with Trudeau's liberalism, which at times seemed to threaten U.S.-Canadian foreign policy. Days after his victory in September, 1984, Mulroney announced in New York that "good relations, super relations with the United States, will be the cornerstone of our foreign policy." The two conservative governments shared an ideological compatibility. U.S. foreign policy did not change in the course of the 1980's; Canadian policy accommodated itself to it. Both Republicans and Conservatives accepted the older assumptions of the Cold War; spoke of arms control; and professed a commitment to balanced budgets, to trade liberalization, to privatization, and to deregulation. This political compatibility extended to the two leaders: Reagan and Mulroney liked each other personally, and a period of amiable relations began. Reagan was one of the few presidents who actively sought a closer relationship with Canada and who consistently made improved trilateral North American relations and liberalized trade a priority of his administration. The new prime minister made a brief visit to Washington, D.C., in September, 1984, and welcomed the reelected president formally to Canada at Quebec City in March, 1985. At this summit, Reagan and Mulroney appointed Drew Lewis, former U.S. secretary of transportation, and William Davis, former Ontario premier, to study acid rain. A year later, the prime minister presented the president with the envoys' conclusion that "acid rain imperils the environment in both countries." Reagan promised action, and Mulroney showed off Canada's new influence in Washington, announcing that acid rain had become "a front-burner issue," but the Reagan administration proved unwilling to enforce the resulting legislation. One of the features of the new relationship between the U.S. and Canadian governments was a greater frequency of high-level meetings. Mulroney and Reagan pledged to meet annually to review outstanding issues, and they did so between 1984 and 1989. Reagan's secretaries of state and Mulroney's ministers of external affairs met even more frequently. The improved relationship also owed a good deal to the energy and diplomatic skills of Allan Gotlieb, Canadian ambassador to Washington (1981-1989), who made a real effort to understand the U.S. Congress. He realized that nothing could be accomplished either in trade policy or on environmental
controls without active congressional support and that a hostile and protectionist Congress could do great damage to Canadian interests.
In the United States, Time magazine heralded Brian Mulroney's election as Canada's prime minister and its importance to relations between the two countries. (Hulton Archive/Getty Images)
Conservatives and Republicans worked so well together that by January, 1986, Reagan declared that he had achieved a "renewed spirit of friendship and cooperation with Mexico and Canada" and a "most productive period in Canadian-American friendship." At a follow-up meeting in Washington, Reagan and Mulroney renewed the North American Aerospace Defense Command (NORAD) agreement for five years, but NORAD suddenly became a contentious issue in Canada, as it involved Canada in the Reagan administration's Strategic Defense Initiative (SDI). In a charged political atmosphere, consensus in the Canadian parliament was unlikely, and a joint Canadian Senate-House of Commons committee report, while supporting the need for the United States to undertake basic research on SDI
technology, expressed serious reservations about the implications of SDI for U.S.-Canadian relations. The best Mulroney could do was to call Reagan and encourage him to press on with the research without Canadian participation. He also promised that Parliament would not overtly prohibit the involvement of Canadian companies in the SDI. Trade Relationships No bilateral issue of the past half-century evoked in Canadians such strong personal and political emotions as the debate over free trade with the United States. Canada first suggested free trade in 1983; its objectives were to expand trade and to maintain security of access to the United States. Free trade acquired bipartisan respectability in Canada with the establishment of a Royal Commission on the Economic Union and Development Prospects for Canada, chaired by former Liberal finance minister Donald Macdonald, to examine Canada's economic prospects. In 1985, the Royal Commission issued its final report, based on three years of hearings and study. It strongly recommended that Canada and the United States negotiate a bilateral free trade agreement. Two years later, Canada and the United States successfully concluded free trade negotiations that covered not only trade in goods but also trade in services and investment. The Canada-United States Free Trade Agreement (FTA) entered into force in January, 1989. Neither the United States nor Canada regarded the detailed FTA as a complete success. The first paragraph of article 2005 exempted Canada's cultural industries from the FTA, but the second paragraph reserved the U.S. right to retaliate against Canadian cultural protectionism. On the positive side, the agreement eliminated tariffs over a ten-year period in commodity trade in industry and agriculture; liberalized Canadian controls on foreign investment; provided national treatment for U.S. firms operating in Canada; and provided limited bilateral access to government procurement contracts in each country. Most important to Canada was the establishment of bilateral dispute-settlement panels to circumvent the political vagaries of U.S. trade laws. The Reagan administration anticipated few modifications in U.S. law being necessitated to implement the FTA. Reagan promised to pursue further liberalization of Canadian investment controls, to extend the agreement to energy and cultural industries, and to eliminate technology-transfer requirements and other performance requirements not barred by the FTA. For the steel industry, the administration assured Congress that nothing in the FTA precluded reaching agreement with Canada to reduce Canadian steel exports. In the area of government procurement, although the agreement liberalized competition, there were major exceptions on the U.S. side. One provision left unchanged was an item included in U.S. Defense Appropriations Acts beginning in 1941 known as the Berry Amendment, which required the Department of Defense to purchase certain products, such as textiles, clothing, and certain specialty metals, from U.S. suppliers. The Canadian energy industry, meanwhile, was anxious to expand its access to U.S. markets; in return, the United States wanted guaranteed access to Canadian resources. Impact The thawing of U.S.-Canadian relations under the ideologically similar governments of President Reagan and Prime Minister Mulroney led to significant and far-reaching developments between the two nations, most notably the Canada-United States Free Trade Agreement, which did not enter into force until January, 1989, in the final weeks of Reagan's presidency. For better or for worse, Mulroney's foreign policy associated his Conservative Party with a pro-United States stance in the eyes of many Canadians, and their later decisions at the polls were influenced by that association. Subsequent Events
The good feelings of the 1980's between Canada and the United States gave way to certain tensions in the 1990's, the result of a less accommodating relationship between Prime Minister Jean Chrétien and President Bill Clinton and a developing ambivalence of Canadians toward Americans. Disputes over softwood lumber in particular colored the tenor of trade between the two nations, which nevertheless remained so dependent on each other that trade continued even as attitudes changed.
Further Reading
Campbell, Bruce, and Ed Finn, eds. Living with Uncle: Canada-US Relations in an Age of Empire. Toronto: J. Lorimer, 2006. History of U.S.-Canadian relations that sets them within the context of European colonialism and imperialism. Carment, David, Fen Osler Hampson, and Norman Hillmer, eds. Coping with the American Colossus. New York: Oxford University Press, 2003. General
study of the United States' effects as an economic and military superpower and the strategies employed by other nations living in its shadow. Clarkson, Stephen. Canada and the Reagan Challenge: Crisis and Adjustment, 1981-85. New updated ed. Toronto: J. Lorimer, 1985. Another mid-decade text; details the tensions between the Reagan administration and the Trudeau government. Doran, Charles F., and John H. Sigler, eds. Canada and the United States: Enduring Friendship, Persistent Stress. Englewood Cliffs, N.J.: Prentice-Hall, 1985. Background papers prepared for a meeting of the American Assembly at Arden House in Harriman, New York, from November 15 to 18, 1984. Details the state of U.S.-Canadian relations at mid-decade. Hart, Michael. A Trading Nation: Canadian Trade Policy from Colonialism to Globalization. Vancouver: University of British Columbia Press, 2002. Focused study of the history of Canadian trade and of the nation's dependence upon its trade for survival. Hillmer, Norman, and J. L. Granatstein. Empire to Umpire: Canada and the World to the 1990's. Toronto: Copp Clark Longman, 1994. Details the evolution of Canada's foreign policy and its effects upon the foreign relations of other nations throughout the globe. McDougall, John N. Drifting Together: The Political Economy of Canada-US Integration. Peterborough, Ont.: Broadview, 2006. Studies U.S.-Canadian economic relations as composing a single, integrated economic system. Stewart, Gordon T. The American Response to Canada Since 1776. East Lansing: Michigan State University Press, 1992. General history of U.S. foreign policy toward Canada. Thompson, John H., and Stephen J. Randall. Canada and the United States: Ambivalent Allies. 3d ed. Athens: University of Georgia Press, 2002. Study of the tensions sometimes hidden and sometimes apparent within U.S.-Canadian relations. United States. Embassy (Canada). United States Presidential Addresses to the Canadian Parliament, 1943-1995. Ottawa, Ont.: Author, 1995. Transcripts of more than fifty years of presidential addresses to Canada's legislature. Demonstrates the evolving attitudes of U.S. leaders toward their "neighbor to the north." Martin J. Manning
See also Canada Act of 1982; Canada-United States Free Trade Agreement; Elections in Canada; Foreign policy of Canada; Foreign policy of the United States; Mulroney, Brian; National Energy Program (NEP); Reagan, Ronald; Shamrock Summit; Strategic Defense Initiative (SDI); Trudeau, Pierre; Turner, John.
■ Canada Health Act of 1984 Definition
Legislation to improve the national health care system Date Received royal assent on April 1, 1984 The Canada Health Act established national standards for health care delivery, spelling out criteria and conditions that the nation's provinces and territories were required to satisfy in order to receive federal funds. The Canada Health Act continued a system of public health care that had its roots in a system established in Saskatchewan in 1947. Canada's national system is public, funded primarily by taxation and administered by provincial and territorial governments, but most of the nation's health services are provided by private medical practitioners and facilities. Unlike the preceding Health Care Acts of 1957 and 1966, the act of 1984 contained provisions intended to eliminate direct billing to patients in the form of extra-billing and user charges. The act received unanimous support from the House of Commons and the Senate, and it was given royal assent on April 1, 1984, thereby becoming law. The purpose of the legislation was to ensure that insured and extended health care services are readily available to all Canadians regardless of their socioeconomic status. In addition to the provisions regarding extra-billing and user charges, the act established five criteria for provincial and territorial systems: public administration, comprehensiveness, universality, portability, and accessibility. For example, it required that health care services of one's home province be portable to other provinces and territories for a period of up to three months. It also instituted two conditions regarding insured and extended health care services: Provinces and territories must file reports with the federal government on the operation of their health care services, and they must acknowledge that federal cash transfers are responsible for the maintenance of their systems. Violation of the extra-billing and user-charges provisions of the act would result in dollar-for-dollar deductions from a province's federal funding.
sions of the act would result in dollar-for-dollar deductions from a province’s federal funding. The act also stipulated discretionary penalties for failure to adhere to the five criteria and two conditions, although no such penalty has ever been applied.
tween the two nations. In Canada, the agreement was extremely controversial, as some Canadian special interest groups feared that the new agreement would eliminate a certain degree of the country’s newly acquired sovereignty. In the United States, however, the agreement barely attracted any public attention.
182
■
Impact Under the act, provinces and territories have jurisdiction over most services offered to their populations and take responsibility for approving hospital budgets, negotiating fee scales, and determining classification of staff. Health care services vary somewhat among the provinces and territories, as some jurisdictions may offer additional services, such as optometric and dental care, to certain target populations. Children and disadvantaged groups, for example, may be given benefits beyond those given the general population. The act does not cover non-essentials like cosmetic surgery, hospital amenities, and private nursing services. The federal government assumes direct responsibility for some populations, including prisoners and military personnel. Reductions in federal transfers, brain drain, privatization, and waiting times for services are pressing concerns for Canadians and have led to heated debates over reforming the act. Further Reading
Downie, Jocelyn, Timothy Caulfield, and Colleen M. Flood, eds. Canadian Health Law and Policy. 2d ed. Markham, Ont.: Butterworths, 2002. Fulton, Jane. Canada’s Health Care System: Bordering on the Possible. New York: Faulkner and Gray, 1993. National Council of Welfare. Health, Health Care, and Medicare. Ottawa: Author, 1990. Ann M. Legreid See also Health care in Canada; Health care in the United States; Health maintenance organizations (HMOs); Medicine; Trudeau, Pierre.
■ Canada-United States Free Trade Agreement Identification
Agreement between the U.S. and Canadian governments to create open trade markets and fair competition Date Went into effect on January 1, 1989 The Canada-United States Free Trade Agreement opened up possibilities for more liberalized business and trading between the two nations. In Canada, the agreement was extremely controversial, as some Canadian special interest groups feared that the new agreement would eliminate a certain degree of the country's newly acquired sovereignty. In the United States, however, the agreement barely attracted any public attention.
At the tail end of the 1980’s, the governments of Canada and the United States began to examine the benefits that could result from removing barriers to trade between the two nations. The policy makers of the time believed that it was important to strengthen
Preamble to the Canada-United States Free Trade Agreement The Government of Canada and the Government of the United States of America, resolved: To Strengthen the unique and enduring friendship between their two nations; To Promote productivity, full employment, and a steady improvement of living standards in their respective countries; To Create an expanded and secure market for the goods and services produced in their territories; To Adopt clear and mutually advantageous rules governing their trade; To Ensure a predictable commercial environment for business planning and investment; To Strengthen the competitiveness of Canadian and United States firms in global markets; To Reduce government-created trade distortions while preserving the Parties’ flexibility to safeguard the public welfare; To Build on their mutual rights and obligations under the General Agreement on Tariffs and Trade and other multilateral and bilateral instruments of cooperation; and To Contribute to the harmonious development and expansion of world trade and to provide a catalyst to broader international cooperation.
The Eighties in America
both the relationship between the two nations and the ability of each country to compete in global markets. At the time, Canada and the United States were among each other’s largest trading partners, and boosting productivity, employment, and trade as a whole became an important goal for both countries. Talks began in the mid-1980’s between the Canadian and the U.S. governments. In Canada, a commission on trade recommended expanding relations with the United States, and by 1987, the U.S. Congress gave President Ronald Reagan permission to enter into a trade agreement with the Canadian government. A finalized trade agreement was signed by both nations and officially went into effect on January 1, 1989. The express purpose of the agreement was to remove any barriers between the two nations to trade in both goods and services, including the eventual removal of all tariffs. Policies were put in place to facilitate fair competition within each nation’s territories and to liberalize conditions for investment. Procedures were also established to help settle any future disputes that might arise between the two nations as impartially as possible. Ultimately, both nations hoped there would be an eventual growth and expansion of the agreement’s policies. Impact The Canada-United States Free Trade Agreement greatly increased the amount of trade between the two nations. Opinions about the agreement varied, as some employment sectors suffered losses and others flourished. Overall, the investment of each nation in the other increased. The agreement and the question of sovereignty remained somewhat controversial in Canada, but in the 1990’s, the Canadian government voted to extend many of the terms of the free trade agreement to Mexico. This new agreement involving all three nations was called the North American Free Trade Agreement (NAFTA). Further Reading
Kreinin, Mordechai E., ed. Building a Partnership: The Canada-United States Free Trade Agreement. East Lansing: Michigan State University Press, 2000. Siddiqui, Fakhari, ed. The Economic Impact and Implications of the Canada-U.S. Free Trade Agreement. Queenston, Ont.: Edwin Mellen Press, 1991. Smith, Murray G. Assessing the Canada-U.S. Free Trade Agreement. Halifax, N.S.: Institute for Research on Public Policy, 1987. Jennifer L. Titanski
See also Business and the economy in Canada; Business and the economy in the United States; Canada and the United States; Foreign policy of Canada; Foreign policy of the United States.
■ Canadian Caper The Event
Joint Canadian-CIA operation that led to the rescue of six U.S. diplomats from Iran Date January 28, 1980 Place Tehran, Iran Almost three months into the Iranian hostage crisis, the public learned that six members of the U.S. embassy's staff had not been captured with the others, when a secret operation to bring them home succeeded. On November 4, 1979, in the aftermath of the revolution that drove the shah of Iran from power, a group of Iranian students stormed the U.S. embassy in Tehran and took a group of American diplomats hostage. The students, however, did not capture the entire embassy staff. Six remained at large and ultimately sought refuge at the Canadian embassy. They were hidden for nearly three months in the residences of the Canadian embassy's staff, including the home of Canada's ambassador to Iran, Ken Taylor. Taylor quickly contacted Prime Minister Joe Clark with the news of the hidden American diplomats. Clark and his cabinet agreed with the ambassador's decision to assist the Americans and assured President Jimmy Carter that Ottawa would help them leave Iran. The U.S. Central Intelligence Agency (CIA) then designed and led an operation to extract the Americans from Iran, although afterward the escape plan was portrayed as a solely Canadian effort. The cooperation of the Canadian government was required, since it had to supply fraudulent Canadian passports in the names of the six Americans in hiding. The CIA provided forged Iranian visas and, under the leadership of one of its members, Antonio J. Mendez, designed a cover operation that involved a fictional movie company seeking to film in Iran. Mendez and another CIA agent flew to Tehran, while the phony passports were sent to Tehran in a Canadian diplomatic bag. On January 28, 1980, the six Americans left Tehran on a commercial airline without incident. Fearing Iranian repercussions for their assistance, Canadian diplomats also left the country, and the Canadian embassy was closed. The story of the escape was quickly broken by a Canadian reporter. A tremendous response ensued in the United States, which hailed the Canadian government and citizens for their support. The acclaim took many forms, including billboards and posters thanking Canada hung from bridges. In particular, Canadian ambassador Ken Taylor was singled out for praise from the government of the United States. A Canadian movie was subsequently made celebrating the escape. The lead role played by the CIA in the operation remained a secret until 1997. Impact The Canadian role in the escape of the six American diplomats marked a particular high point in the relationship between Canadians and Americans. It also brought a modicum of relief to the Carter administration, which was under fire for its inability to rescue the fifty-three hostages then being held in the U.S. embassy. Further Reading
Adams, Claude, and Jean Pelletier. The Canadian Caper. Toronto: Macmillan of Canada, 1981. Bowden, Mark. Guests of the Ayatollah: The First Battle in America’s War with Militant Islam. New York: Atlantic Monthly Press, 2006. Mendez, Antonio J., and Malcolm McConnell. The Master of Disguise: My Secret Life in the CIA. Toronto: HarperCollins Canada, 2000. Steve Hewitt See also
Canada and the United States; Foreign policy of Canada; Iranian hostage crisis; Reagan, Ronald.
■ Canadian Charter of Rights and Freedoms Definition
Canada’s primary constitutional enumeration of civil rights and liberties Date Came into force on April 17, 1982, except section 15, which came into force on April 17, 1985 The Canadian Charter of Rights and Freedoms constitutes Part 1 of the Constitution Act, 1982. It established a set of civil liberties guaranteed to all persons in Canada or, in some cases, to all citizens of the country.
In 1982, the Canadian constitution was patriated from the United Kingdom by the Canada Act of 1982, and Canada became a fully sovereign nation for the first time. The Canadian version of the law enacting this sovereignty was the Constitution Act, 1982, and Part 1 of that act is known as the Canadian Charter of Rights and Freedoms. The charter guarantees rights "subject only to such reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society." That is, it allows the legislature to pass statutes imposing reasonable limits upon the guaranteed freedoms. The charter provides rights broadly, guaranteeing every individual equal protection and benefit of the law notwithstanding race, national or ethnic origin, color, sex, age, or mental or physical disability. However, it also addresses the historical claims of certain groups directly. The linguistic claims of French-speaking Québécois are recognized, as are some claims of Canada's aboriginal, or First Nations, peoples. The charter, which is written in both French and English, specifies that both languages are official languages of Canada and are to be treated equally, as well as providing speakers of each language with the right to be educated in that language. Meanwhile, the charter specifies that it does not interfere with established aboriginal treaty rights. Part 2 of the Constitution Act, 1982, further stipulates that First Nations representatives will be consulted prior to passage of any amendment to a section of the constitution dealing with First Nations rights. Just as section 1 of the charter specifies that statutes may prescribe reasonable limits to freedoms, section 33, also known as the notwithstanding clause, makes it possible for Parliament or a provincial legislature to override some charter rights, including the fundamental freedoms and legal rights. Linguistic rights and the right to vote cannot be overridden. The clause ensures that the federal constitution does not go too far in regulating matters that fall within the jurisdiction of the provinces, as well as preserving the national parliament's supremacy over all other institutions of the federal government. Impact The introduction of the Canadian Charter of Rights and Freedoms coincided with the birth of a more politically mature Canada. With passage of the Canada Act of 1982, the nation ceased to rely on the consent of the British parliament to modify its constitution, and it was appropriate that the newly sovereign nation should include in its new constitution a list of civil rights and liberties. The charter symbolically and practically marked Canada as one of the world's major democratic powers, and it provided fundamental protections both to Canada's citizens and to all people within its borders.
Queen Elizabeth II signs the new Canadian constitution, thereby enacting the Canadian Charter of Rights and Freedoms into law, on April 17, 1982. (National Archives of Canada)
Further Reading
Beaudoin, Gérald-A., and Erron Mendes, eds. The Canadian Charter of Rights and Freedoms. 4th ed. Markham, Ont.: LexisNexis Butterworths, 2005. Francis, R. Douglas, Richard Jones, and Donald B. Smith. Destinies: Canadian History Since Confederation. 5th ed. Toronto: Harcourt Brace Canada, 2004. McRoberts, Kenneth. "Quebec: Province, Nation, or Distinct Society?" In Canadian Politics in the Twenty-First Century, edited by M. Whittington and G. Williams. 5th ed. Scarborough, Ont.: Nelson Thomson Learning, 2000. Sharpe, Robert J., and Kent Roach. The Charter of Rights and Freedoms. 3d ed. Toronto: Irwin Law, 2005. Esmorie J. Miller
See also
Aboriginal rights in Canada; Canada Act of 1982; Education in Canada; Immigration to Canada; Minorities in Canada; Multiculturalism in education; Trudeau, Pierre.
■ Cancer research Definition
Scientific investigation of the class of malignant tumors threatening human health
Cancer research in the 1980’s dealt with understanding the nature of the disease, as well as developing improved methods for treatment. Funding for cancer research led to the discovery of molecular mechanisms underlying the disease. The first “designer drugs” directed against specific cancers were also developed.
With the discovery of ribonucleic acid (RNA) tumor viruses in the 1950’s and 1960’s, a “War on Cancer” was declared during the administration of President Richard M. Nixon. The idea that cancer was an infectious disease turned out to be largely incorrect, but that idea served as an impetus for research that would later prove more fruitful: During the 1970’s, studies of RNA tumor viruses led to the discovery of viral oncogenes, cancer-causing genes that could disrupt normal regulatory mechanisms in the infected cell. As that decade ended, J. Michael Bishop and Harold Varmus discovered that these genes are actually found in healthy cells as proto-oncogenes. The normal function of these genes is to regulate cell division, and they are characterized as growth factors, factor receptors, signal molecules, and tumor suppressors.
Basic Research
During the 1980’s, the roles played by proto-oncogenes in cancer began to be clarified. Mutations in these genes result in aberrant regulation—in effect, a “short circuiting” of regulatory mechanisms. The particular mutation determines the type of cancer that may result. For example, a mutation in the gene that codes for HER2/neu—a receptor protein that is part of the process regulating cell growth—may cause certain forms of breast cancer; overexpression of this protein represents a poor prognosis for a patient. It was well known that certain types of cancer frequently run in families, suggesting an inherited basis for these forms of the disease. Geneticist Alfred Knudson had suggested in the 1970’s that just as mutations in certain genes may cause cancer, the func-
186
■
The Eighties in America
Cancer research
tion of other genes could be to inhibit, or suppress, cancer. In the 1980’s, Knudson was shown to be correct with the discovery of the first tumor-suppressor gene, called the retinoblastoma (Rb) gene because of its role in inherited forms of the eye disease retinoblastoma. The genes that encode the Rb protein and a second tumor suppressor, p53, were cloned by the end of the decade. The number of tumorsuppressor genes discovered during the decade approached two dozen. The question of how cancer cells remain “immortal” was partly solved during the 1980’s, opening a possible means for cancer therapy. In the 1960’s, John Hayflick had discovered that normal cells have a limited lifespan. In the 1980’s, Elizabeth Blackburn and her colleagues determined that one reason for cell mortality lies in the shortening of the chromosomes during each cycle of cell division. The tips of these chromosomes are capped with short deoxyribonucleic acid (DNA) sequences called telomeres, which serve as buffers against chromosomal loss. Blackburn discovered telomerases, enzymes that replace the telomere caps in fetal cells; telomerase activity shuts down as cells age, eventually resulting in cell death. It was subsequently discovered that cancer cells re-express telomerase activity, which allows them to survive and continue to replicate. Trends in Cancer Pathology Among males during the 1980’s, cancer rates continued trends that had begun in previous decades. Perhaps the most surprising of these trends was the significant decrease in stomach cancer to a level less than 40 percent of its rate a half century earlier. Lung-cancer rates, on the other hand, continued to increase, reaching a mortality level approaching 80 deaths per 100,000 males by the end of the decade. Not surprising, most of the increase was attributed to cigarette smoking. Mortality associated with prostate cancer demonstrated a slight increase, reaching a level of 20 to 25 deaths per 100,000 males. Among women, many of the same trends were apparent. The most striking change, however, was in the rate of cancer of the lung and bronchus. Mortality rates among women had tripled by the end of the decade, reaching a level of 30 deaths per 100,000 females and surpassing the mortality rate associated with breast cancer. Anticancer Drugs and Treatment The discovery that certain cancers express unusual surface proteins al-
lowed for the development of drugs that could target cancer cells by seeking out those proteins. Monoclonal antibodies were found to be effective in treating certain forms of lymphoma, as well as melanomas. The efficacy of mastectomy, the removal of a breast, as a means to eliminate the disease had long been controversial. A study completed in the 1980’s found that, in many cases, a lumpectomy—removal of a tumor from the breast—was as effective as the more radical procedure for treating the patient. Impact Prior to the 1980’s, cancer research had been primarily observational: scientists cataloged phenomena without understanding the mechanisms underlying those phenomena. During the 1980’s, however, the molecular basis of the disease began to be understood. The role played by cellular proto-oncogenes was particularly intriguing. As originally defined, these genes were shown to regulate the progress of a cell through its cycle of development, ultimately resulting in division. In some cases, however, the same proto-oncogenes that stimulate cell division may also inhibit the cell cycle, even inducing apoptosis (cell “suicide”), thereby serving as tumor suppressors as well as tumor inducers. It was during the 1980’s that the first chemotherapeutic agents directed against specific forms of cancer were discovered. The usefulness of molecules such as monoclonal antibodies still remains limited. However, the discovery that many cancer cells express specific forms of surface proteins made “designer” drugs directed against these forms of the disease practical. Among the drugs subsequently licensed was herceptin, an inhibitor of breast-cancer cells that express the HER receptor, one of the proto-oncogene products discovered in the 1980’s. Further Reading
Bishop, J. Michael. How to Win the Nobel Prize. Cambridge, Mass.: Harvard University Press, 2003. Autobiography of the scientist who helped discover cellular oncogenes. Coffin, John, et al. Retroviruses. Plainview, N.Y.: Cold Spring Harbor Laboratory Press, 1997. Story of retroviruses and the role they played in the discovery of oncogenes. Pelengaris, Stella, and Michael Khan. The Molecular Biology of Cancer. Malden, Mass.: Blackwell, 2006. Extensive discussion of the discovery of oncogenes and their role in the development of cancer. Richard Adler
See also
Genetics research; Health care in Canada; Health care in the United States; Medicine; Transplantation.
■ Car alarms Definition
Warning devices activated when thieves steal or damage vehicles
Vehicular thefts escalated in the United States during the 1980's, increasing the need for automobile alarms and security systems. During the 1980's, the profile of the average automobile thief underwent a transition from joyriding adolescent to aggressive professional. In 1979, for the first time, adults rather than juveniles represented the majority of vehicle thieves, who were often associated with so-called chop shops (workshops that specialized in breaking a car down into its component parts, so each part could be resold separately). By 1980, U.S. vehicle owners reported over one million vehicular thefts yearly, mostly in urban areas, resulting in several billion dollars in losses. In 1960, 90 percent of stolen vehicles had been recovered; the vehicle recovery rate in the 1980's decreased to 9 percent, and few thieves were apprehended. Federal legislation throughout the 1980's addressed automobile theft, requiring U.S. vehicle manufacturers in 1987 and afterward to produce cars with pre-installed alarms or with specific parts marked. Most car alarms were electronic; owners utilized switches, keypads, or keys to activate and deactivate them. Alarms made loud noises, flashed lights, or sent a radio signal to pagers when automobiles were breached. Sensors on windows, gas caps, and trunks also activated alarms. Pioneer distributed an ultrasonic car alarm operated by remote control. Professional thieves, however, quickly learned how the most popular alarms operated, and the most skilled thieves could disarm most alarms in a matter of seconds. Costs associated with purchasing and installing car alarms, often totaling several hundred to several thousand dollars, resulted in many drivers choosing not to purchase them. Car alarm installation was sometimes inconsistent. In 1986, Texas was the sole state implementing licensing procedures for alarm installation services. Consumer Reports rated car
alarms. Some motorists built car alarms based on instructions printed in Popular Mechanics and Radio-Electronics. In 1981, the Consumer Federation of America estimated that motorists spent $295 million yearly for vehicle security equipment to thwart thieves. Many consumers considered antitheft devices that would impede ignition and fuel functions or immobilize steering wheels and pedals to be more effective than simple car alarms. A 1983 Mediamark Research report analyzed U.S. car-alarm usage, stating that fewer than 3 percent of automobiles were guarded by alarms and that car-alarm users were mostly urban, middle-aged males. Some insurance providers offered premium discounts ranging from 5 to 20 percent to motorists who installed alarms in their vehicles. Starting in 1987, several states required insurers to discount rates for vehicles with alarms. By the late 1980's, more consumers chose to purchase car alarms, as electronic technology became cheaper. Refinements, such as the incorporation of microprocessors, improved car alarms' performance and minimized false alarms. Impact
Despite car alarms, vehicle thefts accelerated by the late 1980's. In 1988, approximately 1.4 million automobiles were stolen in the United States, amounting to $7 billion in losses. Car alarms often did not deter thieves, who quickly snatched goods inside vehicles or swiftly transported vehicles to chop shops seconds after alarms notified owners of a crime. Many car alarms were ineffective, because owners frustrated by false alerts stopped using their alarms. Indeed, as the decade progressed, most people began to assume that any car alarms they heard were false alarms, so the alarms ceased to act as deterrents to criminals. The sound of a car alarm going off in the night became normal background noise in many U.S. cities during the 1980's. Local governments even passed ordinances providing police with the legal authority to shut off car alarms when residents complained about the noise. Inventors developed tracking devices such as LoJack as alternatives to car alarms.
Further Reading
“Auto Alarm Systems.” Consumer Reports 51 (October, 1986): 658-662. Farber, M. A. “Amateurs Fading in World of Car Theft.” The New York Times, December 13, 1983, pp. A1, B4.
Gifford, J. Daniel. The Automotive Security System Design Handbook. Blue Ridge Summit, Pa.: Tab Books, 1985. Elizabeth D. Schafer See also Crime; Gangs; Minivans; Organized crime; Science and technology.
■ Cats Identification Broadway musical Authors Music by Andrew Lloyd Webber
(b. 1948); lyrics by T. S. Eliot (1888-1965), Trevor Nunn (b. 1940), and Richard Stilgoe (b. 1943) Director Trevor Nunn (b. 1940) Date Premiered on Broadway on October 7, 1982 One of the most successful musicals in history, Cats ran on Broadway for a total of 7,485 performances. The success in the United States of a British musical with no book to speak of redefined Broadway and influenced musicals throughout the 1980's and beyond. By the early 1980's, Andrew Lloyd Webber had a reputation for hit musicals that defied theatrical conventions, and Cats broke several molds. The show was based on a collection of children's verse, T. S. Eliot's Old Possum's Book of Practical Cats (1939), and while many previous musicals had been based upon children's literature, the concept of an entire cast made up to resemble anthropomorphic cats was radical for its day. Told entirely in song and dance, the show exemplified Lloyd Webber's love for spectacle. The set was a giant garbage dump. Props and wings of the stage were distributed throughout the theater, and cast members would variously appear and dance among the audience, giving them the illusion of entering a different world and being involved in the show. The story of Cats is simple. Eliot's book is a set of short narrative poems, which the play strings together, using some of Eliot's unpublished ideas as well, to form a longer narrative. The play takes place on the night of the Jellicle Ball, where different cats are nominated for the honor of going up to the Heaviside layer to be reincarnated. Lloyd Webber's experiment succeeded in part because of an unpublished poem. "Grizabella the Glamour Cat" tells of an old and disgraced beauty queen, and Eliot felt that the story was too depressing for children. However, in the midst of a fun and seemingly frivolous show, Lloyd Webber used Grizabella to illustrate one of his favorite themes: Those society marginalizes are often the most worthy of respect. For her anthem, Lloyd Webber used a melody he had written as a tribute to Puccini. Trevor Nunn, the show's director, wrote a lyric based upon Eliot's Prufrock and Other Observations (1917). The result was one of Lloyd Webber's most famous songs, "Memory." After singing the song, Grizabella ascends to the Heaviside layer to be reborn.
Betty Buckley plays Grizabella in a scene from Cats performed during the 1983 Tony Awards ceremony. The musical won seven awards, including one for Buckley. (AP/Wide World Photos)
Impact Cats received seven Tony Awards in 1983. It changed the way Broadway musicals were conceptualized, as shows throughout the 1980's would use bigger sets, special effects, and fanciful concepts. Cats and its signature song
achieved a level of popularity unusual for musical theater and inspired many references, parodies, and imitations in popular culture throughout the decade. When it closed in 2000, it was the longest-running show in Broadway history. "Memory," meanwhile, was both a popular hit and an instant standard. It is said that, at any moment in the United States during the mid-1980's, "Memory" was playing on a radio station somewhere, and it was one of the decade's most requested songs.
Further Reading
Eliot, T. S. The Complete Poems and Plays: 1909-1950. New York: Harcourt, 1952.
Snelson, John. Andrew Lloyd Webber. New Haven, Conn.: Yale University Press, 2004.
John C. Hathaway
See also Ballet; Broadway musicals; Music; Phantom of the Opera, The.
■ Cell phones
Definition Wireless telephones connected to networks capable of transmitting calls across wide areas
The development of cellular telephone networks in the 1980’s sparked a steady increase in the use of wireless communication during the decade, producing changes not only in communications but also in economic activity and social interaction generally. Prior to federal approval of the first large-scale cellular telephone network in 1983, wireless telephone technology was relegated primarily to marine, emergency, and military uses. Private mobile telephones were expensive, cumbersome, and extremely limited in functionality and coverage. Wireless communication “cells,” constructed around centralized relay towers, had been proposed in the late 1940’s, and cellular technology had slowly progressed since the first experiments of the 1960’s. During the 1970’s, the Motorola Corporation, under the direction of Martin Cooper, moved to the forefront of cellular telephone technology by developing a practical handheld mobile telephone, while American Telephone and Telegraph (AT&T) utilized its federally sanctioned domination of telephone service in the United States to develop the first large-scale trial
cellular networks. The viability of the Motorola handheld telephone combined with the breakup of AT&T in the early 1980’s gave Motorola the advantage in the race for federal approval of commercial cellular networks. The First Generation In September, 1983, the Federal Communications Commission (FCC) granted Motorola permission to market its handheld mobile telephone and develop a cellular network for its use. The DynaTAC 8000x, known colloquially as “the Brick,” was ten inches long, rectangular, and weighed twenty-eight ounces, yet its relative portability constituted a dramatic improvement over earlier mobile telephones, most of which required fixed installation. The FCC also approved use of the Advanced Mobile Phone System (AMPS) developed by Bell Labs in the 1970’s for the construction of cellular telephone networks. In the years that followed, numerous companies rushed to establish cellular telephone networks in the United States and Canada. Aided by “call handoff ” technology developed in the 1960’s, which allowed cellular telephone users to enjoy uninterrupted service while moving from one communications cell to another, these companies laid the groundwork for what became known as the first generation of cellular communications. Utilizing radio frequencies in the frequency-modulated (FM) spectrum, cellular networks were connected to the land-based telephone system and utilized the same system of area codes and seven-digit numbers as did land-based service. The use of cell phones remained restricted to a small minority of Americans during much of the 1980’s. Networks, although growing, were available only in major urban areas until the end of the decade. In addition, cell phones and cellular service remained prohibitively expensive for most consumers. The DynaTAC sold for approximately four thousand dollars, and service typically cost hundreds of dollars per month. Thus, early cell phones were utilized primarily for business purposes, becoming closely identified with young urban professionals, or yuppies. Growth and Development
Despite the relative exclusivity of cell phones, the number of cell phone users increased dramatically in the United States during the latter half of the 1980’s, as the relatively low cost to phone companies of establishing cellular networks continued to fuel the growth of those networks throughout the decade. In 1985, cellular tele-
phone subscribers numbered just over 340,000, and that figure doubled the following year. By 1989, there were more than 3.5 million cell phone users in the United States. As a result, in 1989, the FCC expanded the number of frequencies available for use by cellular networks, granting licenses to cellular telephone companies to operate in the portion of the FM range previously designated for ultrahigh frequency (UHF) television channels 70 through 83. The FCC’s action fueled the already dramatic expansion of cellular networks, increasing the number of cell phone subscribers in the United States to nearly 5.3 million in 1990. Despite the rapid expansion of cellular networks, however, service was still unavailable in many areas of the United States at decade’s end, and where it was available, it remained prohibitively expensive for most Americans. The social and economic ramifications of cell phone usage became evident as the number of users continued to grow. Conversations previously conducted from homes and offices were increasingly conducted in automobiles and public places, raising issues of privacy, etiquette, and highway safety. The availability of wireless communication meant that business could be conducted at a faster pace and at irregular times, contributing to both workplace productivity and increased stress levels among professionals. The ability of service providers to manage rapidly increasing cell phone traffic had emerged as a critical issue by the end of the decade. Security also became an increasing source of concern, as the analog signals of first-generation cell phones were transmitted unscrambled and were easily monitored with scanners. Impact
Security and bandwidth concerns, along with advances in digital information technology, would lead to the creation of second-generation digital cell phone service in the 1990’s, and the phones themselves would diminish in size and increase in functionality. The development of cell phone networks and the introduction of the first handheld mobile telephone in the 1980’s were critical to the dramatic advances in wireless communications technology of the late twentieth and early twenty-first centuries. Along with the development of the Internet, then, early cell phones made the 1980’s the foundational decade for communications technologies designed to ensure that individuals were always reachable and always connected to the broader world.
Further Reading
Agar, Jon. Constant Touch: A Global History of the Mobile Phone. New York: Totem Books, 2005. Comprehensive global history of the cellular telephone. Includes detailed discussion of the evolution of cellular technology during the 1980's.
Galambos, Louis, and Eric John Abrahamson. Anytime, Anywhere: Entrepreneurship and the Creation of a Wireless World. New York: Cambridge University Press, 2002. Narrative history of the cellular telephone industry from a business perspective.
Steinbock, Dan. The Mobile Revolution: The Making of Mobile Services Worldwide. Sterling, Va.: Kogan Page, 2005. Provides a global perspective on the development of cellular telephone networks in the 1980's; discusses the role of Motorola and the DynaTAC in the evolution of wireless communication during the decade.
Michael H. Burchett
See also AT&T breakup; Information age; Science and technology; Yuppies.
■ Central Park jogger case
The Event Trial of five African American teens for the violent rape and beating of a female jogger
Date Attack took place on April 19, 1989
Place New York, New York

The discovery of a brutally beaten and raped woman in New York City's Central Park led to the arrest of five young African Americans, who were accused of spending the night "wilding," leaving a trail of victims. Subsequent media coverage exploited the great racial fears and problems prevalent in America at the time.

At 1:30 a.m., April 19, 1989, two men discovered a severely beaten woman, wearing a bra, her hands tied together with a shirt, in Manhattan's Central Park. Police revealed that she had suffered an almost 75 percent blood loss, multiple blows to her head and body, and extreme exposure. Before awakening with persistent amnesia, the twenty-eight-year-old investment banker, whose identity was not released to the public, remained in a coma for twelve days at Metropolitan Hospital. During the same April night, two male joggers reported being attacked by groups of African American and
Hispanic teenage boys roaming the park; other nearby incidents were also reported. At 10:40 p.m., several boys were arrested leaving the park. Five boys were subsequently charged with rape, assault, and attempted murder: Raymond Santana (fourteen), Kevin Richardson (fourteen), Antron McCray (fifteen), Yusef Salaam (fifteen), and Kharey Wise (sixteen). The police termed the boys' behavior "wilding," going out deliberately to cause trouble and spread fear. None of the suspects, each from a middle-class family, had previously been in trouble with the police, and no forensic evidence was found to link them to the crime. However, the media coverage was overwhelming, helping convict them in the eyes of the public. Donald Trump took out full-page newspaper ads insisting on reinstating the death penalty so the boys could be executed for their supposed crimes. Pete Hamill's incendiary New York Post article predicted that bands of crack-addled African Americans would start coming downtown from Harlem to kill white people. The five defendants were convicted and imprisoned.

Yusef Salaam, one of five teenagers accused in the Central Park jogger case, arrives at the New York State Supreme Court building in August, 1990. (AP/Wide World Photos)

Impact In 1989, New York City was experiencing a serious rise in crime, increased crack-cocaine abuse, and heightened racial tensions; two thousand homicides, an all-time high, were reported that year. Other so-called wildings had occurred in 1983 and 1985, and in December, 1986, three African American men were beaten by a white crowd in Howard Beach, Queens. Members of various races living in New York were frightened of one another. The Central Park jogger case served as an emblem of the dangers of increasing violence, lawlessness, and tensions in U.S. cities in general and New York City in particular. It helped supporters to reinstate New York's death penalty and to enact harsher juvenile-offender laws.
Subsequent Events
In 2002, while serving time for a different rape, Matias Reyes confessed that he had committed the Central Park rape. Reyes claimed that he acted alone, and his DNA matched the sole forensic sample taken at the scene in 1989. All five of the men imprisoned for the crime were exonerated in 2002. In 2003, the jogger, Trisha Meili, revealed her identity and spoke publicly about the attack for the first time.
Further Reading
Meili, Trisha. I Am the Central Park Jogger. New York: Scribner, 2003.
Schanberg, Sidney. "A Journey Through the Tangled Case of the Central Park Jogger: When Justice Is a Game." The Village Voice, November 20-26, 2002.
Sullivan, Timothy. Unequal Verdicts: The Central Park Jogger Trials. New York: Simon & Schuster, 1992.
Leslie Neilan
See also African Americans; Brawley, Tawana; Crack epidemic; Crime; Hawkins, Yusef; Howard Beach incident; Racial discrimination; Rape.
■ Cerritos plane crash
The Event A private plane and a commercial jet collide over a residential neighborhood
Date August 31, 1986
Place Cerritos, California

The midair collision of two planes over Cerritos highlighted the dangers of flying in the heavily traveled airspace near major airports.

In the 1980's, one-third of U.S. aviation traffic was hosted by Southern California, which had the most congested airspace in the nation. Approximately 50 percent of near-miss airplane collisions occurred in that region, with 114 reported by pilots in the first eight months of 1986. Most pilots and air traffic controllers relied on visual observation of airspace to detect and evade nearby planes. Airline deregulation and President Ronald Reagan's termination of striking air traffic controllers had impeded aviation safety efforts. On Sunday, August 31, 1986, Aeroméxico Flight 498 approached its destination, Los Angeles International Airport (LAX). That DC-9 jet, named Hermosillo in the Aeroméxico fleet, transported six crewmembers and fifty-eight passengers, including both Mexican and U.S. citizens, who had boarded the aircraft at either its departure site, Mexico City, Mexico, or at the airports where it had stopped en route to Los Angeles, including Guadalajara, Loreto, and Tijuana. Meanwhile, at 11:40 a.m., Pacific standard time, William Kramer, his wife, and an adult daughter departed in a Piper Archer II airplane from Torrance Municipal Airport, south of LAX, flying northeast toward Big Bear, California. As he neared LAX, Aeroméxico pilot Arturo Valdez Prom maintained radio contact with air traffic controllers outside Los Angeles and with his airline, stating at 11:50 a.m. that Flight 498 was on schedule to arrive at 12:05 p.m. Valdez radioed LAX at 11:51 a.m., when the Mexican jet flew inside that airport's terminal control area (TCA). LAX air traffic controller Walter White monitored Aeroméxico Flight 498 on radar, telling Valdez he could lower the DC-9's altitude to six thousand feet. Another pilot then contacted White, who became distracted while responding to that pilot's queries. He did not notice that Flight 498 and the Kramers' plane were on a collision course. At an altitude of sixty-five hundred feet, Kramer's
airplane struck the DC-9’s tail, knocking off the jet’s horizontal stabilizer, which was crucial to maintain control. Flight 498 plunged to the ground, crashing into a neighborhood of Cerritos, California, southeast of LAX, near Cerritos Elementary School. When White looked at the radar at 11:55 a.m., Flight 498 was gone. He unsuccessfully tried to contact Valdez eight times. Another pilot in the vicinity told White by radio that he saw smoke but could see no DC-9 in the sky. Rescue and Investigation
On the ground, surviving Cerritos residents rescued their neighbors from houses that were on fire or broken by wreckage. The jet’s impact disintegrated ten houses and damaged an additional seven homes. Property losses totaled $2.7 million. Emergency workers fought fires and located remains that afternoon, determining that all crew and passengers on both aircraft died. Exact casualties on the ground were not immediately known, because some people were not at home at the time of the crash. Investigators later determined that fifteen deaths occurred on the ground. The Cerritos High School gymnasium sheltered survivors. Red Cross personnel offered counseling and relief services, both in Cerritos and at LAX. California governor George Deukmejian, a former state senator from the Cerritos vicinity, visited the site on Monday. Los Angeles mayor Tom Bradley expressed his condolences and stated he would pursue efforts to improve aviation safety. U.S. representative Robert Dornan of Garden Grove, near Cerritos, also viewed the site. He had endorsed legislation in 1979 and 1985 that would have required airplanes to install automated devices warning of possible collisions. The Los Angeles County Coroner’s Office secured bodies by Monday evening and used dental records, fingerprints, and other information for victim identification. The Los Angeles Times published detailed coverage of the disasters, including DC-9 crew and passenger names, photographs, and a neighborhood map. National Transportation Safety Board (NTSB) investigators directed by John Lauber, Federal Aviation Administration (FAA) personnel, and Aeroméxico and Mexican government representatives assessed evidence in Cerritos. The LAX air traffic controllers’ competency was an immediate concern. Investigators ordered drug tests for LAX controllers and questioned White about his actions preceding
the crash. White stated that Kramer had never communicated with the LAX tower. After investigating on site, analyzing the airplanes' parts in a Long Beach Municipal Airport hangar, and examining flight recorders in Washington, D.C., the NTSB issued a report on July 7, 1987, finding Kramer responsible for the crash by flying into restricted TCA airspace without permission and at altitudes flown by commercial jets. The NTSB report also stated that White had not monitored air traffic effectively, relying instead on pilots to maintain the distance between aircraft while he performed multiple controller tasks simultaneously. The report depicted the air traffic controller system as faulty, noting several hundred reports of airspace violations prior to and after the Cerritos accident. FAA officials responded that the NTSB should not blame controllers for piloting mistakes. Approximately seventy survivors and relatives sued Aeroméxico, the FAA, and Kramer's family. In April, 1989, a federal jury exonerated Aeroméxico and decided the FAA and Kramer's estate were liable, awarding $56.5 million to the plaintiffs.

This Cerritos, California, neighborhood was devastated by the collision of Aeroméxico Flight 498 and William Kramer's private plane. (AP/Wide World Photos)

Impact The Cerritos crash intensified discussions of the commercial and recreational air traffic risks resulting from crowded airspace being monitored by limited air traffic control resources. Officials from the national to the county levels explored ways to prevent crashes. The eleven members of the Los Angeles City Council decided unanimously to encourage pilots flying private aircraft near Los Angeles voluntarily to communicate via radio with controllers and to keep their transponders in working order. Members of Congress urged that warning equipment to detect aircraft electronically should be installed on commercial and private airplanes and suggested securing updated air traffic control technology. The FAA issued stronger penalties for private pilots entering the TCA without approval, including license suspension, and stated that small aircraft fly-
ing near congested airports must operate transponders. The FAA also deployed enhanced radar, reorganized air traffic control centers in Southern California, and altered Los Angeles-area flight routes to separate small aircraft from large jets.
Further Reading
Magnuson, Ed. "Collision in the 'Birdcage.'" Time 128, no. 11 (September 15, 1986): 26-27. Provides quotations from witnesses, including a pilot who saw the jet fall, and insights regarding both flights.
Mordoff, Keith F. "Safety Board Completes Field Investigation of California Crash." Aviation Week and Space Technology 125, no. 11 (September 15, 1986): 36. Describes NTSB procedures for examining Cerritos evidence.
O'Connor, Colleen. "Collision in Crowded Sky." Newsweek 108, no. 11 (September 15, 1986): 34-35. Notes Kramer's preflight preparations, including purchasing a TCA map.
Oster, Clinton V., John S. Strong, and C. Kurt Zorn. Why Airplanes Crash: Aviation Safety in a Changing World. New York: Oxford University Press, 1992. Discusses the Cerritos crash in the context of other midair collisions.
Work, Clemens P. "Too Many Planes, Too Little Sky." U.S. News & World Report 101, no. 11 (September 15, 1986): 32-33. Provides statistics relevant to aviation safety and LAX airspace.
Elizabeth D. Schafer
See also Air traffic controllers' strike; Mexico and the United States; Science and technology; Sioux City plane crash.
■ Challenger disaster
The Event Space shuttle explosion
Date January 28, 1986
Place Kennedy Space Center, Florida
NASA's space shuttle Challenger disintegrated about seventy-three seconds after launch, killing the seven astronauts aboard, including civilian S. Christa McAuliffe. The public's shock over the disaster grew when the commission investigating its cause determined that the process NASA used to assess launch safety was seriously flawed. NASA suspended its piloted spaceflight program for thirty-two months, while the space shuttle fleet was modified.
The January, 1986, launch of Challenger attracted considerably more public attention than most of the twenty-four previous U.S. space shuttle flights, because it was to be the first flight in the National Aeronautics and Space Administration (NASA) Teacher in Space Project. S. Christa McAuliffe, a thirty-seven-year-old secondary school teacher from Concord, New Hampshire, had been selected in July, 1985, from a group of more than eleven thousand applicants, to become the first teacher to fly in space. McAuliffe, who was interviewed on television by Larry King, Johnny Carson, David Letterman, Regis Philbin, and others, immediately became a celebrity, and NASA received considerable favorable publicity in the months leading up to the flight. The mission was led by Commander Dick Scobee and piloted by Michael J. Smith. The crew of seven astronauts also included three mission specialists—Judith A. Resnik, Ellison S. Onizuka, and Ronald E. McNair, whose primary responsibility was the operation of orbiter systems—and two payload specialists—Gregory B. Jarvis and McAuliffe, whose primary responsibility was to conduct experiments. The mission was scheduled to deploy a Tracking and Data-Relay Satellite, to launch and recover the SPARTAN-Halley comet research observatory, and to broadcast two live science lessons to schoolchildren around the country. A Delay-Plagued Launch
Challenger’s launch was originally scheduled for January 22, 1986, but it was postponed to January 23 and then to January 24, because Challenger needed parts from the shuttle Columbia, but Columbia’s return to Earth had been delayed several times. The January 24 launch was canceled because of bad weather at the emergency landing site in Senegal. The emergency site was changed to Casablanca, Morocco, but because there were no landing lights at that site, the time of the launch had to be moved earlier in the day, so it would still be light in Casablanca when the shuttle lifted off in Florida. Weather conditions in the United States caused the launch to be pushed back to January 27. The January 27 launch was postponed because of a problem with the exterior handle on the orbiter’s hatch. On the morning of January 28, the launch was delayed for two hours when a liquid hydrogen monitor failed. Unusually cold weather, with the ambient air temperature near freezing, had prompted concern
from Morton Thiokol, the contractor that built the shuttle’s solid-rocket booster engines. Engineers warned that if the O-rings that sealed joints in these engines reached a temperature below 53 degrees Fahrenheit, there was no guarantee that they would perform properly. NASA officials, aware that the many delays were resulting in bad publicity, decided that it was safe to proceed with the launch. The Short Flight Challenger finally lifted off at 11:38 a.m. eastern standard time (EST) on January 28. A later examination of launch film showed the first sign of a problem less than 0.7 seconds into the flight, when puffs of dark smoke were emitted from the right-hand solid-rocket booster. Investiga-
tors later determined that the smoke resulted from a leak in a joint between sections of the booster. During the stress of liftoff, metal parts bent away from each other, the primary O-ring was too cold to seal, and hot gases vaporized both the primary O-ring and a secondary O-ring that served as a backup. Particles of aluminum oxide from the burning rocket fuel are believed to have temporarily sealed the gap. But about fifty-eight seconds into the flight, Challenger encountered an intense wind shear, a sudden change in wind speed and direction. This was the most severe wind shear recorded up to that time in the shuttle program. The resulting force broke the temporary seal. Within a second, a plume of rocket exhaust penetrated the joint, striking the shuttle's external
The crew of the space shuttle Challenger poses for its official portrait on November 15, 1985. Back row, from left: Ellison S. Onizuka, S. Christa McAuliffe, Gregory B. Jarvis, and Judith A. Resnik. Front row: Michael J. Smith, Dick Scobee, and Ronald E. McNair. (NASA)
The President Reacts to the Challenger Disaster
President Ronald Reagan had been scheduled to deliver his state of the union address on January 28, 1986. He preempted that speech, however, to talk to the nation about the Challenger disaster: Today is a day for mourning and remembering. Nancy and I are pained to the core by the tragedy of the shuttle Challenger. We know we share this pain with all of the people of our country. This is truly a national loss. Nineteen years ago, almost to the day, we lost three astronauts in a terrible accident on the ground. But, we’ve never lost an astronaut in flight; we’ve never had a tragedy like this. And perhaps we’ve forgotten the courage it took for the crew of the shuttle; but they, the Challenger Seven, were aware of the dangers, but overcame them and did their jobs brilliantly. We mourn seven heroes: Michael Smith, Dick Scobee, Judith Resnik, Ronald McNair, Ellison Onizuka, Gregory Jarvis, and Christa McAuliffe. We mourn their loss as a nation together. . . . We’ve grown used to wonders in this century. It’s hard to dazzle us. But for twenty-five years the United States space program has been doing just that. We’ve grown used to the idea of space, and perhaps we forget that we’ve only just begun. We’re still pioneers. They, the members of the Challenger crew, were pioneers. And I want to say something to the schoolchildren of America who were watching the live coverage of the shuttle’s takeoff. I know it is hard to understand, but sometimes painful things like this happen. It’s all part of the process of exploration and discovery. It’s all part of taking a chance and expanding man’s horizons. The future doesn’t belong to the fainthearted; it belongs to the brave. The Challenger crew was pulling us into the future, and we’ll continue to follow them. . . . The crew of the space shuttle Challenger honored us by the manner in which they lived their lives. We will never forget them, nor the last time we saw them, this morning, as they prepared for the journey and waved good-bye and “slipped the surly bonds of earth” to “touch the face of God.”
fuel tank. Sixty-five seconds into the flight, the exhaust burned through the wall of the external tank, releasing hydrogen. At this point, both the astronauts and the flight controllers still believed the mission was proceeding normally. Seventy-three seconds into the flight, with the
shuttle at an altitude of forty-eight thousand feet, the external tank disintegrated and the right solid-rocket booster rotated, causing Challenger to veer from its intended path. The shuttle was immediately torn apart by air pressure far exceeding its design limit. Television monitors showed a cloud of smoke and vapor where the shuttle had been just moments before. The strongest parts, the crew cabin and the solid-rocket boosters, separated from the rest of the debris and continued arcing upward. Twenty-five seconds after the breakup, the crew compartment reached a peak height of sixty-five thousand feet and began plunging back toward the Atlantic Ocean. Most likely, some or all of the crew survived the breakup, because four of the personal air packs, which could provide oxygen after the cabin system failed, were activated. Two minutes and forty-five seconds after the breakup, the crew cabin impacted the ocean, producing a deceleration of more than two hundred g’s (that is, more than two hundred times Earth’s gravitational force), well beyond the design limit and sufficient to kill the crew. A distinctively shaped cloud of smoke remained visible in the air off Florida’s coast, and the image of that cloud appeared on television news coverage of the disaster—first live and then on tape—throughout the day and for much of the rest of the week.
Impact Although NASA had lost three astronauts during a ground test in preparation for the first crewed Apollo flight, the Challenger disaster represented the first time any American had perished during a spaceflight. Widespread interest in NASA’s Teacher in Space Project attracted more attention to this launch than most shuttle missions. Many schoolchildren, including those in McAuliffe’s school in New Hampshire, watched the launch live on televisions in their schools. Television coverage of the launch and disas-
ter made the Y-shaped smoke trail left in the disintegrating shuttle's wake one of the most widely seen and troubling images of the decade. A special commission, appointed by President Ronald Reagan, attributed the accident to a design flaw in the seals on the solid-rocket booster engines. The commission found engineering reports, dated prior to the shuttle's first flight, that indicated weakness in this design, and the commission concluded that NASA's decision-making process was seriously flawed. Following the Challenger disaster, NASA grounded the remainder of the shuttle fleet while the risks were assessed more thoroughly, design flaws were identified, and modifications were developed and implemented. This delayed a number of important NASA missions, including the launching of the Hubble Space Telescope and the Galileo probe to Jupiter. It also represented a serious blow to NASA's reputation, coloring the public perception of piloted spaceflight and affecting the agency's ability to gain continued funding from Congress.
Further Reading
Jensen, Claus. No Downlink: A Dramatic Narrative About the Challenger Accident and Our Time. New York: Farrar, Straus and Giroux, 1996. An account of the Challenger disaster and the investigation to determine its cause.
Lieurance, Suzanne. The Space Shuttle Challenger Disaster in American History. Berkeley Heights, N.J.: Enslow, 2001. Describes the effect of the disaster on American space efforts; suitable for younger readers.
Penley, Constance. NASA/Trek: Popular Science and Sex in America. New York: Verso, 1997. Includes detailed feminist critiques of the media representation of Christa McAuliffe and of NASA's response to the Challenger disaster.
Vaughan, Diane. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: University of Chicago Press, 1996. A 575-page account of the steps leading to the decision to launch Challenger.
George J. Flynn
See also Halley's comet; Science and technology; Space exploration; Space shuttle program.
■ Cheers
Identification Television comedy series
Date Aired from September 30, 1982, to May 20, 1993

Cheers was one of the most awarded serial situation comedies of the 1980's and by the end of its run became NBC's longest-running sitcom.

Cheers centered on an ex-relief pitcher and recovering alcoholic, Sam Malone (played by Ted Danson), who ran a neighborhood bar in Boston. It featured an ensemble cast playing a quirky assortment of patrons and employees, who included female love interest and waitress Diane Chambers (Shelley Long), acerbic waitress Carla (Rhea Perlman), and bartender Coach (Nicholas Colasanto). The patrons Norm (George Wendt) and Cliff (John Ratzenberger) were featured in running gags throughout the run of the show. The series was created by Glen Charles, James Burrows, and Les Charles, each of whom had a hand in both Taxi and The Mary Tyler Moore Show, also critically acclaimed sitcoms. Cheers was first aired by the National Broadcasting Company (NBC) on September 30, 1982. The show initially had low ratings and was almost canceled in its first season. Network executives gave the show another chance, however, and by 1984 it was at the top of the ratings thanks to quality writing, character development, and the excellent performance of the cast. The show kept viewer interest with a romantic subplot between Sam and Diane for the first five seasons, until Shelley Long left the show and was replaced by Kirstie Alley, who played Rebecca Howe. Other cast changes included the addition of a new bartender, Woody (Woody Harrelson), after the unexpected death of Nicholas Colasanto and the addition of patron Frasier Crane (Kelsey Grammer) in season three. The show ranked among the top ten shows in the weekly Nielsen ratings for seven of its eleven seasons and often earned the number-one spot in a given week. Cheers used flashbacks and referred to previous episodes to establish a serial nature that gave it a soap-opera feel at times. It also employed season-ending cliffhangers, which was rare for sitcoms at the time. The show was also somewhat unconventional for the 1980's, since it was far from politically correct; Sam was a womanizer, Rebecca was searching desperately for a rich husband, and much of the
show focused on drinking. It favored pure comedy over the social issues of the time, even though most of the characters were working class. Those serious moments that did crop up were quickly dismissed with one-liners. Impact Through a combination of a quirky ensemble of characters, story lines that exploited romantic tensions, and witty dialogue, and by eschewing the political correctness of the time, Cheers became one of the 1980's most watched and industry-awarded situation comedies. Subsequent Events The show's last episode aired on May 20, 1993, making it NBC's longest-running series at the time. The episode also received the second-best Nielsen rating of all time for an episodic program. Over its run, the series was nominated for 117 Emmys and had 28 wins. After its demise, Kelsey Grammer went on to star in a spin-off sitcom, Frasier (1993-2004), which achieved equal success, dominating the Emmy Awards during much of the 1990's.
Further Reading
Thompson, Robert. Television's Second Golden Age: From "Hill Street Blues" to "ER." Syracuse, N.Y.: Syracuse University Press, 1997.
Waldron, Vince. Classic Sitcoms: A Celebration of the Best in Prime-Time Comedy. Los Angeles: Silman-James Press, 1997.
James J. Heiney
See also Cable television; Cosby Show, The; Designing Women; Facts of Life, The; Family Ties; Married . . . with Children; M*A*S*H series finale; Moonlighting; Sitcoms; Television; Wonder Years, The.
■ Cher
Identification American singer, songwriter, and actor
Born May 20, 1946; El Centro, California

Cheryl Sarkisian LaPiere found success as a recording star and star of a television variety show, but during the 1980's she proved to the film industry that she could act as well. It was during this same period that she learned how to deal with dyslexia.

Cher realized her lifelong dream to be an actress when she was cast in the Off-Broadway production of
Cher poses with Academy Award winner Don Ameche at the 1986 awards ceremony. Her Mohawk-inspired outfit by designer Bob Mackie was discussed as much as the awards themselves. (AP/Wide World Photos)
Come Back to the Five and Dime, Jimmy Dean, Jimmy Dean (pr. 1981) and later co-starred in the 1982 movie version. Critics savaged the play, but they were less caustic in their evaluations of Cher’s performance than they were in reviewing the other stars. Cher’s reception in the film adaptation was much more favorable, and she was nominated for a Golden Globe award. The movie industry was introduced to her talents as a serious actress when she portrayed Karen Silkwood’s lesbian coworker in the drama Silkwood (1983). She was again nominated for a Golden Globe, and this time she won. In Mask (1985), for which she won the Cannes Film Festival’s award for best actress, Cher played the biker mom of a young boy with a facial skull deformity. During 1987, three of her films were released: the comedy The Witches of
Eastwick, the thriller Suspect, and the comedy Moonstruck. She received the Academy Award for Best Actress for her role in Moonstruck. Cher's varied career achievements proved her to be adaptable and flexible. One example of this adaptability was her response to having dyslexia, a reading disorder. To overcome the problems presented by the disorder, Cher had to read her scripts and other material more slowly and carefully than would an average actor. However, she benefited from an excellent memory, so rereading the material was not frequently required. Cher's fame and reputation in the 1980's were influenced by her unique fashion sense. Her signature bell bottoms were replaced with exquisitely beaded gowns by Bob Mackie, her fashion designer. One of the most controversial outfits he designed for Cher was a Mohawk outfit she wore as a presenter at the 1986 Academy Awards. The black ensemble included a headdress with two-foot-tall bird feathers and knee-high satin boots. It was easily the most talked-about outfit at an event almost as famous for its clothing as for its attendees. As the 1980's drew to a close, Cher returned to her musical roots and released two albums, Cher, which was certified platinum, and Heart of Stone, which was certified triple platinum. She also helped start a trend when she marketed her own signature perfume, Uninhibited. Impact By the end of the decade, Cher had been nominated for two Oscars, of which she won one, and four Golden Globe awards for film, of which she won two. Her film celebrity and outlandish fashions made her an icon of the 1980's. Since her television and music careers had already made her an icon of the 1970's, Cher's changing image came to represent some of the differences between 1970's American culture and 1980's American culture. She also demonstrated an ability to turn herself from a star into a brand, leveraging her success in one arena, such as film, to promote efforts in other arenas, such as musical performance—a strategy that resonated with the popular and economic cultures of the decade.
Further Reading
Bego, Mark. Cher: If You Believe. Lanham, Md.: Taylor Trade, 2004.
Coplon, Jeff. The First Time: Cher. New York: Simon & Schuster, 1998.
Taraborrelli, J. Randy. Cher. New York: St. Martin's Press, 1986.
Elizabeth B. Graham
See also Academy Awards; Fashions and clothing; Film in the United States; Music; Music videos; Women in rock music; Women in the workforce.
■ Children's literature
Definition Books written and published for children

During the 1980's, children's literature entered the era of big business, as increased sales and corporate consolidation were accompanied by an increasingly corporate culture within the publishing industry. However, a decision by U.S. educators to shift to a literature-based curriculum prevented the corporate mind-set from resulting in a streamlined, blockbuster-centric catalog, because schools required a variety of books in a variety of voices.

Newly published children's literature prospered in the 1980's, especially in the categories of informational books, picture books, beginning-reader picture books, and poetry books. Literature-based curricula in kindergarten through twelfth-grade schools expanded the market for newly published children's literature, while the mergers and acquisitions of the 1980's made more money available to pay top artists for their illustrations. Independent publishing houses with a long tradition, however, became mere imprints of ever larger businesses, and some disappeared altogether. Fewer books about minorities and about other cultures were published. Publishers also found it problematic to keep backlisted, previously published books on the shelves. Such books had lower annual sales than new books, but they appealed to adults' nostalgia for their childhoods, so they kept selling year after year. However, censorship and the generally conservative climate of the 1980's encouraged children's book publishers to back down somewhat from the controversial fare popular in the 1960's and 1970's. Background The 1980's was a decade of mergers and acquisitions in the publishing industry. Many well-known old publishing houses became imprints of larger companies. The impact of this general (continued after the table)
Selected American and Canadian Children's Books, 1980-1989
Year Published
Title
Author/Illustrator
1980
Jacob Have I Loved
Katherine Paterson
The Fledgling
Jane Langton
A Ring of Endless Light
Madeleine L’Engle
Fables
Arnold Lobel
The Bremen-Town Musicians
Ilse Plume
The Grey Lady and the Strawberry Snatcher
Molly Bang
Mice Twice
Joseph Low
Truck
Donald Crews
The Violin-Maker’s Gift
Donn Kushner
The Trouble with Princesses
Christie Harris; illustrated by Douglas Tait
1981
A Visit to William Blake's Inn: Poems for Innocent and Experienced Travelers
Nancy Willard; illustrated by Alice and Martin Provensen
Ramona Quimby, Age 8
Beverly Cleary
Upon the Head of the Goat: A Childhood in Hungary, 1939-1944
Aranka Siegal
Jumanji
Chris Van Allsburg
Where the Buffaloes Begin
Olaf Baker; illustrated by Stephen Gammell
On Market Street
Arnold Lobel; illustrated by Anita Lobel
Outside over There
Maurice Sendak
The Root Cellar
Janet Lunn
Ytek and the Arctic Orchid
Garnet Hewitt; illustrated by Heather Woodall
1982
Dicey's Song
Cynthia Voigt
The Blue Sword
Robin McKinley
Doctor DeSoto
William Steig
Graven Images
Paul Fleischman
Homesick: My Own Story
Jean Fritz
Sweet Whispers, Brother Rush
Virginia Hamilton
Shadow
Marcia Brown; original text in French by Blaise Cendrars
A Chair for My Mother
Vera B. Williams
When I Was Young in the Mountains
Cynthia Rylant; illustrated by Diane Goode
Up to Low
Brian Doyle
Chester’s Barn
Linda Climo
1983
Dear Mr. Henshaw
Beverly Cleary
The Sign of the Beaver
Elizabeth George Speare
A Solitary Blue
Cynthia Voigt
Sugaring Time
Kathryn Lasky
The Wish Giver: Three Tales of Coven Tree
Bill Brittain
The Glorious Flight: Across the Channel with Louis Bleriot
Alice and Martin Provensen
Little Red Riding Hood
Trina Schart Hyman
Ten, Nine, Eight
Molly Bang
Sweetgrass
Jan Hudson
Zoom at Sea
Tim Wynne-Jones; illustrated by Ken Nutt
1984
Like Jake and Me
Mavis Jukes
The Moves Make the Man
Bruce Brooks
One-Eyed Cat
Paula Fox
Saint George and the Dragon
Margaret Hodges; illustrated by Trina Schart Hyman
Hansel and Gretel
Rika Lesser; illustrated by Paul O. Zelinsky
Have You Seen My Duckling?
Nancy Tafuri
The Story of Jumping Mouse: A Native American Legend
John Steptoe
Mama’s Going to Buy You a Mockingbird
Jean Little
Chin Chiang and the Dragon’s Dance
Ian Wallace
1985
Sarah, Plain and Tall
Patricia MacLachlan
Commodore Perry in the Land of the Shogun
Rhoda Blumberg
Dogsong
Gary Paulsen
The Hero and the Crown
Robin McKinley
The Polar Express
Chris Van Allsburg
The Relatives Came
Cynthia Rylant; illustrated by Stephen Gammell
King Bidgood’s in the Bathtub
Audrey Wood; illustrated by Don Wood
Julie
Cora Taylor
Zoom Away
Tim Wynne-Jones; illustrated by Ken Nutt
1986
The Whipping Boy
Sid Fleischman
A Fine White Dust
Cynthia Rylant
On My Honor
Marion Dane Bauer
Volcano: The Eruption and Healing of Mount St. Helens
Patricia Lauber
Hey, Al
Arthur Yorinks; illustrated by Richard Egielski
The Village of Round and Square Houses
Ann Grifalconi
Alphabatics
Suse MacDonald
Rumpelstiltskin
Paul O. Zelinsky
Shadow in Hawthorn Bay
Janet Lunn
Moonbeam on a Cat’s Ear
Marie-Louise Gay
1987
Lincoln: A Photobiography
Russell Freedman
After the Rain
Norma Fox Mazer
Hatchet
Gary Paulsen
Owl Moon
Jane Yolen; illustrated by John Schoenherr
Mufaro’s Beautiful Daughters: An African Tale
John Steptoe
A Handful of Time
Kit Pearson
Rainy Day Magic
Marie-Louise Gay
1988
Joyful Noise: Poems for Two Voices
Paul Fleischman
In the Beginning: Creation Stories from Around the World
Virginia Hamilton
Scorpions
Walter Dean Myers
Song and Dance Man
Karen Ackerman; illustrated by Stephen Gammell
The Boy of the Three-Year Nap
Diane Snyder; illustrated by Allen Say
Free Fall
David Wiesner
Goldilocks and the Three Bears
James Marshall
Mirandy and Brother Wind
Patricia C. McKissack; illustrated by Jerry Pinkney
Easy Avenue
Brian Doyle
Amos’s Sweater
Janet Lunn; illustrated by Kim LaFave
1989
Number the Stars
Lois Lowry
The Great Little Madison
Jean Fritz
Afternoon of the Elves
Janet Taylor Lisle
Shabanu, Daughter of the Wind
Suzanne Fisher
The Winter Room
Gary Paulsen
Lon Po Po: A Red-Riding Hood Story from China
Ed Young
Bill Peet: An Autobiography
Bill Peet
Color Zoo
Lois Ehlert
The Talking Eggs: A Folktale from the American South
Robert D. San Souci; illustrated by Jerry Pinkney
Hershel and the Hanukkah Goblins
Eric Kimmell; illustrated by Trina Schart Hyman
The Sky Is Falling
Kit Pearson
’ Til All the Stars Have Fallen: Canadian Poems for Children
Selected by David Booth; illustrated by Kady MacDonald
trend was felt by children's publishers as well. For example, Random House—a publisher of children's classics like The Cat in the Hat (1957), by Dr. Seuss—was purchased by Newhouse Publications, a large newspaper chain. In another example, Macmillan Publishing Company bought Atheneum and other publishing houses. Eventually, the Macmillan children's book division grew to include eleven imprints. Simon & Schuster then bought Macmillan. Paramount bought Simon & Schuster. Soon Viacom, the cable television giant, snapped up Paramount, acquiring the eleven children's literature imprints along with all of its other holdings. Meanwhile, many respected names in publishing ceased to exist altogether. For example, Dial Press, publisher of well-known authors and illustrators such as Steven Kellogg and Mercer Mayer, was disbanded by parent company Doubleday in the fall of 1983. That same year, twelve publishing firms, all parts of large corporations with interests both inside and outside of publishing, accounted for 45 percent of the $9.4 billion in book sales for the year. Concern was expressed throughout the industry that these mergers and acquisitions would cause a shift away from publishing books of quality but with limited appeal. Instead, publishers would choose only those titles that seemed most likely to sell many copies. Indeed, fewer books about diverse cultures or minorities were published during the 1980's than had been published in the previous decade; meanwhile, however, a record number of children's books were published. Clearly, a market for children's books existed, but not every book could be a blockbuster. Meanwhile, bookselling power was becoming increasingly concentrated among four large bookselling chains: Waldenbooks, B. Dalton Bookseller, Barnes and Noble, and Crown Books. These chains emphasized turnover of new works and, increasingly, the discounting of certain titles. Old-guard publishers and booksellers expressed concern for the traditional backlist, previously published works with slow but steady sales. Such classics as The Wind in the Willows (1908), by Kenneth Grahame, and Winnie-the-Pooh (1926), by A. A. Milne, had sales too small to justify taking up shelf space in chain bookstores for most of the year: They tended to be stocked in quantity only around holidays, when parents bought gifts for their children. Fortunately for children's book authors and illus-
trators, reading instruction in U.S. elementary and high schools was shifting from the basal reader to a literature-based curriculum. According to proponents of literature-based reading, involving children aesthetically as well as intellectually in books designed to entertain would instill a joy of reading in a way that the limited vocabulary books that taught reading through phonetics and repetition did not. This joy in reading, the reasoning went, would ensure that children read more, and more effectively. Literature-based reading programs required that schools purchase more trade books, rather than textbooks, to be used in the classroom. School systems allotted a percentage of their textbook budgets for the purchase of these trade books. This trend may have helped counteract the threat posed to some children’s titles by the consolidation of publishing houses and retail outlets: It increased the demand for a variety of children’s books, meaning that any individual title was more likely to sell enough copies to justify its publication in the eyes of large publishers. Those publishers found themselves able to sell more children’s books in several categories, including nonfiction or informational books, picture books, beginning readers, and poetry. Children’s Nonfiction The 1980’s saw an increase in both the quality and the quantity of children’s informational books. Cataloged as nonfiction, these books had been the mainstay of school libraries, which purchased them of necessity so students could research assigned reports. Better writing and more colorful and interesting illustrations, however, increased the appeal of informational books. With a range of subject matter from the serious How It Feels When Parents Divorce (1984), by Jill Krementz, to the more playful Magic School Bus series (pb. beginning 1986), by Joanna Cole and Bruce Degen, quality nonfiction found a home in libraries, classrooms, and bookstores in the 1980’s. In the past, informational books had rarely been honored with Newbery or Caldecott Medals, which are awarded to children’s books by the American Library Association (ALA), but they began to be recognized more frequently. Ox-Cart Man (1979), by Donald Hall; When I Was Young in the Mountains (1982), by Cynthia Rylant; Lincoln: A Photobiography (1987), by Russell Freedman, and The Glorious Flight: Across the Channel with Louis Bleriot (1983) all won awards from the ALA during the 1980’s.
In addition to these Newbery and Caldecott honors for nonfiction, the National Council of Teachers of English (NCTE) established the Orbis Pictus Award in 1989 to honor each year's best children's nonfiction. The award was named after the book Orbis Pictus: The World in Pictures, written and illustrated by Johann Amos Comenius in 1657 and considered to be the first book actually planned for children. Criteria for winning the award included accuracy, organization, design, writing style, and usefulness in classroom teaching. The first award was presented in 1990 to The Great Little Madison (1989), by Jean Fritz. The council also honored The News About Dinosaurs (1989), by Patricia Lauber, and The Great American Gold Rush (1989), by Rhoda Blumberg. Picture Books Picture books are defined by their format rather than by their content, so they may be poetic or prosaic, fictional or informational. Such books' illustrations may be done in a wide range of media to achieve the effect that best complements the running text. The typical picture book is thirty-two pages in length. Its trim size is larger than that of the average novel, to enable both text and illustration to fit comfortably on each double-page spread. In a picture book, art and text combine to achieve effects that neither could achieve alone. In the 1980's, the literature-based reading curricula adopted in U.S. schools increased the demand for beautifully illustrated books with compelling texts. Moreover, newer picture books could be more colorful if not more inventive than their predecessors, thanks to increasingly sophisticated color-reproduction technologies that allowed reproduced artwork to resemble more closely its original form. Publishers' willingness to pay a premium for good illustrations increased along with the increased demand. For example, Chris Van Allsburg, the illustrator of The Polar Express (1985), received an $800,000 advance for his illustrations of Swan Lake (1989), an adaptation of a famous ballet by Peter Ilich Tchaikovsky. The growing potential for high payment convinced a greater number of highly talented artists to illustrate children's books. As a result, the field of children's book illustration gained more prestige, drawing interest from even more artists. Beginning-Reader Picture Books Beginning-reader picture books also experienced a surge in numbers during the 1980's. Beginning readers are
books that children can read independently in order to practice their emerging reading skills. Some books fall into the category of beginning readers because of their predictable format. For example, Chicka Chicka Boom Boom (1989), written by Bill Martin, Jr., and John Archambault and illustrated by Lois Ehlert, uses song-like repetition to teach the letters of the alphabet. Other books are classified as beginning readers because of their controlled vocabulary. James Edward Marshall’s Fox and His Friends (1982) employs such a controlled vocabulary. Serializing a popular character or even a theme was a strategy commonly employed with this type of book. Children’s Poetry Poetry for children includes a wide variety of forms, such as anonymous nursery rhymes, transcriptions of folk and other songs, lyric or expressive poems, nonsense and other humorous verse, and narrative poems. Poetry has appeal for children because of its concise and memorable use of language, its intensity of feeling, and its quality of rhythm and sound. For these reasons, including poetry as part of a literature-based curriculum was natural. The 1980’s surge in sales of such books was a boon for poetry collections, such as the extensive and cleverly illustrated Random House Book of Poetry for Children (1983). It also entailed increased sales for picture books featuring single poems or songs, such as Song of Hiawatha (1983), an abridgment of Henry Wadsworth Longfellow’s longer poem that was lavishly illustrated by Susan Jeffers. The NCTE Award for Poetry was awarded three times in the 1980’s before shifting from an annual to a triennial schedule. Even with this switch, more of these awards were given out during the 1980’s than in any decade before or since. Children’s poetry was also awarded the Newbery Medal twice during the 1980’s: In 1982, the award went to A Visit to William Blake’s Inn: Poems for Innocent and Experienced Travelers (1981), by Nancy Willard, and in 1989, it honored Joyful Noise: Poems for Two Voices (1988), by Paul Fleischman. Censorship Efforts to censor books tripled in the conservative 1980’s. Adventures of Huckleberry Finn (1884), for example, was among books banned in New York State in the 1980’s. Maurice Sendak’s picture book In the Night Kitchen (1970) continued to be the subject of controversy as well, not because of its subject matter but because of the nudity of its protagonist, a small boy, in one of the book’s illustra-
This atmosphere of caution and censorship may also have had an effect on the choice of children’s books published during the decade and may have encouraged the publication of informational books, beginning-reader picture books, and poetry, among other potentially less controversial fare.

Impact

The 1980’s produced several books that were both good business and good for children. The mergers and acquisitions that some critics feared would negatively impact the publishing industry did not as a whole reduce the quality of the books being published. Despite a reduction in books about minorities and other cultures, large amounts of money became available to pay top artists such as Chris Van Allsburg, who earned previously unheard-of advances for their illustrations. Top illustrators like Van Allsburg and Sendak became celebrities, and children’s book illustration became a prestigious field. In addition, certain categories of children’s books did particularly well during the 1980’s, both because of the demand for books to be used in literature-based classrooms and because their often less controversial nature exempted them from the censorship practices of the decade.

Further Reading
Billington, Elizabeth T., ed. The Randolph Caldecott Treasury. New York: Frederick Warne, 1978. History of the illustrator and namesake of the prestigious Caldecott Medal, awarded yearly for best illustration in a children’s picture book.

Darigan, Daniel L., Michael O. Tunnell, and James S. Jacobs. Children’s Literature: Engaging Teachers and Children in Good Books. Upper Saddle River, N.J.: Merrill/Prentice Hall, 2002. This textbook, intended primarily for students planning to be teachers, offers excellent overviews of children’s book genres, as well as many thorough lists of children’s books and their authors.

Lanes, Selma G. The Art of Maurice Sendak. New York: Abrams, 1980. An interesting and comprehensive study of a major children’s book author and illustrator who, though the celebrated recipient of various awards, has also been the subject of censorship for his illustrations.

Leepson, Marc. “The Book Business.” Editorial Research Reports, 1985. Vol. 1. Washington, D.C.: CQ Press, 1985. This issue of Congressional Quarterly addresses the concerns about mergers and acquisitions in publishing during the 1980’s.
Temple, Charles, et al. Children’s Books in Children’s Hands: An Introduction to Their Literature. 2d ed. Boston: Allyn and Bacon, 2002. This textbook offers a different way of organizing children’s literature, with an emphasis on the literature-based classroom.

Laurie Lykken

See also
Book publishing; Children’s television; Education in Canada; Education in the United States; Poetry.
■ Children’s television
Definition: Television programming designed primarily for children or for a mixed child and adult audience
During the 1980’s, television programming attempted to entertain, educate, socialize, and inform children in both the United States and Canada. The advent of cable television increased children’s viewing options, and relaxation of advertising guidelines made possible entire shows devoted to characters who were also commodities available for purchase.

Since the early days of television broadcasting, children’s programming has been an integral part of the medium. Likewise, for several generations television has been an important part of children’s lives. During the 1980’s, however, the children’s television landscape changed, as children were treated increasingly as consumers, not only of material goods but also of information. Children no longer simply needed to be entertained by television shows; they also needed to be, in some respects, educated and informed. Although adult programs of previous decades had explored complex social or moral issues, children’s television shows rarely offered any discussion of these topics. However, in the 1980’s many children’s television programs began to address serious concerns, as they incorporated lessons meant to help children discover more about society and their role in it.

Educational and Informative Programming
Two mainstays of children’s educational television in the United States were Sesame Street and Mister Rogers’ Neighborhood, both broadcast on the Public Broadcasting Service (PBS).
By the 1980’s, Sesame Street had become a staple of children’s television. The show, which had garnered critical praise for its innovative approaches to early childhood education and positive representation of harmonious, multiethnic neighborhoods, continued this trend in the 1980’s, as it attempted to explore relevant social issues while still entertaining and educating children. Like many popular American programs, Sesame Street also aired in Canada. However, the Canadian version of Sesame Street was not strictly an American import; in fact, many episodes included up to fifteen minutes of original material produced in Canada. Building on its success from the previous decade, Mister Rogers’ Neighborhood continued to be an acclaimed children’s program during the 1980’s. Like Sesame Street, Mister Rogers’ Neighborhood attempted to stay abreast of current social and cultural topics while remaining grounded in the focused, inclusive approach to education that made the show a success.
Another fixture on PBS was Reading Rainbow, which debuted in 1983. The program, hosted by LeVar Burton, strove to foster literacy in young children. PBS also produced a science education series, 3-2-1 Contact, which educated children about complex scientific principles through simplified explanations and demonstrations using everyday objects. The Canadian counterpart to 3-2-1 Contact was Wonderstruck. Hosted by Bob McDonald, Wonderstruck duplicated 3-2-1 Contact’s intent to provide scientific education through simple methods and explanations. Pinwheel, a children’s educational show that aired on Nickelodeon during the 1980’s, followed the traditional formula of using both human and puppet characters. Nickelodeon’s only original programming for several years, the show attracted many young viewers to the fledgling network. Pinwheel became one of Nickelodeon’s longest-running programs, with more than two hundred episodes aired.
Host Fred Rogers of Mister Rogers’ Neighborhood poses surrounded by his puppets in 1984. (AP/Wide World Photos)
While Sesame Street and Mister Rogers’ Neighborhood both aired in Canada during the 1980’s, the country also produced several of its own shows, many with a longevity equal to or greater than either of the PBS series. The Friendly Giant, a show designed for preschool children, aired for more than twenty-five years before its cancellation in the mid-1980’s. The show’s host, Friendly (played by Bob Homme), entertained viewers with stories and songs. The Friendly Giant’s “triple relationship,” featuring a sole human character interacting with two puppets, was a popular formula for many Canadian children’s shows. The triple relationship was also utilized in Mr. Dressup, a Canadian equivalent to Mister Rogers’ Neighborhood that was a staple of children’s television programming on the Canadian Broadcasting Corporation (CBC) throughout the 1980’s. During each episode, Mr. Dressup (Ernie Coombs) would reach into the Tickle Trunk, pull out a costume or prop, and use it to educate or entertain viewers.

Some children’s television programming of the 1980’s was not educational in the traditional sense but nonetheless attempted to inform and, to some degree, socialize children. Both the ABC After School Specials, broadcast by the American Broadcasting Company, and the Columbia Broadcasting System (CBS) series Schoolbreak Specials comprised self-contained episodes that typically aired during the afternoon, when latchkey children were home from school, prior to adult-dominated, prime-time programming. The specials dealt with many issues that affected children, including domestic violence, alcohol or drug abuse, peer pressure, and divorce. No matter the subject matter of the episodes, the series depicted children who sometimes made mistakes or were faced with difficult decisions or situations. These shows provided significant contributions to children’s television during the 1980’s and have had a lasting cultural impact.

Variety Shows, Dramas, and Comedies
Variety shows have been a staple of commercial television since its inception. Jim Henson’s The Muppet Show was a favorite variety show of many children, until the program was taken off the air in 1981. The discontinuation of The Muppet Show seemed to herald the end of variety shows for children; however, the hiatus was temporary. Canada stepped into the breach with You Can’t Do That on Television, which was broadcast on CTV in Canada and on Nickelodeon in the United States.
The show was considered the first children’s variety television program staffed almost exclusively by children. It was a blend of sketch comedy, heavy doses of satire, and lots of green slime. Together, Pinwheel and You Can’t Do That on Television represented the bulk of Nickelodeon’s programming during the American network’s early years. Other variety shows that achieved popularity in Canada during the decade included Just Kidding and Switchback, both of which combined comedy sketches with more serious material, including interviews with scientists, music artists, and sports celebrities.

Dramas had long been popular in children’s television programming and were used to explore the complexities of life. One of the most successful and longest-running drama franchises in Canadian history began in the 1980’s with The Kids of Degrassi Street. Later series included Degrassi Junior High and Degrassi High, which chronicled the problems a group of children encountered growing up. The shows dealt with drug and alcohol abuse, academic struggles, social class differences, homosexuality, teen pregnancy, peer pressure, premarital sex, and interracial relationships. All of the Degrassi series featured children who were often confronted with difficult or even dangerous decisions, which they frequently had to make without adult intervention or guidance. Another successful Canadian children’s drama was The Edison Twins, produced by CBC from 1982 to 1986 and aired in the United States later in the decade. The series was a frank portrayal of the protagonists, fraternal twins Tom and Annie Edison, dealing with the realities of young adulthood. The show also featured the siblings’ younger brother and several of their friends; the twins’ relationships with these characters illustrated the importance of family and friends.

Comedies had also been a part of children’s television programming for several decades. One unique aspect of many situation comedies (sitcoms) during the 1980’s was the prevalence of programs made for and starring children, a departure from previous decades, when adult comedy shows had dominated the television market. A shared trait of children’s sitcoms was the tendency to combine elements of comedy and drama or sentimentality; frequently an episode would conclude with a discussion of the lesson a character had learned. During the 1980’s, many children watched Full House, which depicted a widowed father living in San Francisco and
raising his three young daughters with the aid of two other single males. Many story lines revolved around the girls’ behavior or mishaps and the resulting lessons about growing up. Although the characters were not all related, the value of family was a constant element of the show. Another popular sitcom that appealed to children and attracted many viewers during the 1980’s was Growing Pains, which featured a traditional nuclear family. In many episodes, the parents were in the background, with the story focused on the actions or interactions of the siblings. Again, the children’s behavior was often the catalyst for a valuable life lesson. A spin-off of the immensely popular and culturally relevant Diff’rent Strokes, The Facts of Life focused on the lives of several female friends from different socioeconomic backgrounds who attended the same boarding school. The show made friendship one of its central themes, employing it as both a comedic and a dramatic element. As in many other children’s sitcoms, the show’s adults were featured in limited roles, as the action centered mostly on the younger characters attempting to cope with the problems of growing up. The lessons they learned were intended to be lessons for their viewers as well.

Some 1980’s Children’s Television Shows

Program                               Airdates    Network
Captain Kangaroo                      1955-1984   CBS
Mr. Dressup                           1967-1997   CBC
Mister Rogers’ Neighborhood           1968-2001   PBS
Sesame Street                         1969-       PBS
The Kids of Degrassi Street           1979-1986   CBC
Pinwheel                              1979-1989   Nickelodeon
3-2-1 Contact                         1980-1988   PBS
The Great Space Coaster               1981-1986   Syndicated
Today’s Special                       1982-1987   TVO
Faerie Tale Theatre                   1982-1987   Showtime
Fraggle Rock                          1983-1987   HBO
Reading Rainbow                       1983-       PBS
Sharon, Lois & Bram’s Elephant Show   1984-1988   CBC
Kids Incorporated                     1984-1993   Disney
CBS Schoolbreak Special               1984-1996   CBS
Zoobilee Zoo                          1986-1987   Syndicated
Kissyfur                              1986-1987   NBC
Pee-Wee’s Playhouse                   1986-1991   CBS
Square One TV                         1987-1992   PBS
Ramona                                1988-1989   PBS
Chip ’n Dale Rescue Rangers           1989-1993   Syndicated
Eureeka’s Castle                      1989-1995   Nickelodeon
Shining Time Station                  1989-1997   PBS

Not all comedy of the 1980’s was intended to be moralistic. Pee Wee’s Playhouse debuted in 1986 and went on to become one of the most popular children’s shows of the decade. The antithesis of most other children’s programming in the United States or Canada, Pee Wee’s Playhouse was the furthest thing from traditional children’s television, as evidenced by its silly, nonsensical sets, skits, phrases, and characters.

Animated Series

Although several long-running animated television programs were canceled or discontinued during the decade (including The Jetsons, Tom and Jerry, and Jonny Quest), cartoons continued to be popular with children during the 1980’s. As they had been for several decades, Warner Bros.’s Looney Tunes characters—Bugs Bunny, Daffy Duck, Porky Pig, Wile E. Coyote, and Road Runner, among others—were staples of children’s cartoons throughout the decade. However, several advocacy groups claimed that Warner Bros. cartoons contained excessive violence that might be detrimental to children. The Bugs Bunny/Road Runner Show, particularly the cartoons featuring Road Runner and Wile E. Coyote, was determined to be the most violent program on U.S. television by these groups. The determination was made simply by counting the number of acts of violence on screen per minute, without regard for the non-graphic, cartoonish nature of the violence in question. Nevertheless, this and other charges forced some networks to discontinue or
scale back the cartoon programming they aired. Despite this backlash, the Warner Bros. franchise continued to thrive in the United States and Canada throughout the decade. Several successful cartoon series of the 1980’s were modeled after lines of toys popular in both countries. Frequently, cartoons acted as what one critic described as “half-hour commercials” designed to sell character-related merchandise. New characters or equipment added to the toy lines were often incorporated into the cartoons, to increase consumer demand. Programs including He-Man and the Masters of the Universe, My Little Pony, G.I. Joe, Transformers, and Care Bears sometimes served more as vehicles to increase toy sales than as actual television entertainment.

Many cartoons produced in the 1980’s followed the popular trend of combining entertainment with practical education. He-Man and the Masters of the Universe included a lesson at the end of each episode, instructing children on the rights and wrongs of everyday life. In G.I. Joe, a segment at the conclusion of each episode featured a different character sharing information about safety, responsibility, and good social behavior. Alvin and the Chipmunks, one of the top-rated children’s shows of the 1980’s, focused on the lead character’s inability to use common sense and good judgment and his brothers’ tendency to go along blindly with Alvin’s schemes. The Smurfs was one of the most popular children’s series of the 1980’s. The show was appealing on many levels: Not only did it include aspects of fantasy, but, like several other series of the decade, The Smurfs often focused on issues such as respecting others, kindness, and being safe. Mr. T, starring the actor made popular in the live-action program The A-Team, contained plenty of action and healthy doses of tough love and life lessons from Mr. T himself.

Not all cartoons had moral or even educational lessons to convey; some animated programs were designed solely for entertainment. Teenage Mutant Ninja Turtles, based on an independent comic book, made no pretenses about teaching correct behavior; the main purpose of the show was to entertain children.
Scooby-Doo, popularized during the previous decade and revived in several incarnations during the 1980’s, was also designed only for children’s amusement. Inspector Gadget, a highly successful cartoon produced by the Canadian company Nelvana, was also purely for entertainment purposes. However, the show’s strong child character (Penny, Gadget’s intelligent niece who frequently had to save not only the day but also her bumbling, secret agent uncle) was undoubtedly an encouragement to children.

Impact

Programs including You Can’t Do That on Television, Full House, and Pee Wee’s Playhouse can be considered definitive examples of children’s television in the 1980’s, indicating that the days of Howdy Doody and I Love Lucy were long gone. Furthermore, the changes in children’s television programming during the 1980’s were not temporary. Rather, the alterations continued to reverberate into the next decade and beyond, affecting not only the children who grew up watching television during the 1980’s but subsequent generations of viewers as well.

Further Reading
Davis, Jeffery. Children’s Television, 1947-1990. Jefferson, N.C.: McFarland, 1995. Exhaustive study of animated, informative, and educational children’s television shows.

Inglis, Ruth. The Window in the Corner: A Half Century of Children’s Television. London: Peter Owen, 2003. Chronological evaluation of children’s programs explores the shows and trends of the 1980’s.

Palmer, Edward L. Children in the Cradle of Television. Lexington, Mass.: D. C. Heath, 1987. Focuses on different eras in television history, highlighting what differentiated the 1980’s from prior decades.

Rainsberry, F. B. A History of Children’s Television in English Canada, 1952-1986. Metuchen, N.J.: Scarecrow Press, 1988. Comprehensive examination of the history of Canadian children’s television programming, providing a focus on many types of shows.

Matthew Schmitz

See also
Advertising; Cable television; Education in Canada; Education in the United States; Facts of Life, The; Television.
■ China and the United States
Identification: Diplomatic and economic relations between China and the United States
The political, economic, and cultural relationship between China and the United States warmed continuously throughout the 1980’s. It suffered a serious setback, however, with the bloody suppression of student demonstrators in Beijing’s Tiananmen Square in June, 1989. The United States also maintained cordial relations with Taiwan even after the end of formal diplomatic relations in 1979.

By January, 1980, the United States and the Communist People’s Republic of China (PRC) had enjoyed their first year of normal diplomatic relations. The United States hoped to play the PRC off against the Soviet Union to gain advantages in the Cold War. Beijing’s decision to join the U.S.-led boycott of the 1980 Summer Olympics in Moscow in response to the Soviet invasion of Afghanistan in 1979 impressed Washington. In November, 1980, China began the trial of the communist extremists known as the Gang of Four; the proceedings, which ended with their convictions in 1981, further convinced the United States that China under the leadership of Deng Xiaoping could be an American partner. China, for its part, was encouraged by the willingness of the United States to withdraw full recognition from Taiwan, which the Chinese asserted was a part of the People’s Republic. Taiwan, which referred to itself as the Republic of China, claimed to be the nation’s legitimate government. It had thus been necessary for the U.S. government to withdraw its recognition of Taiwan’s Nationalist government before it could fully recognize the People’s Republic as the legitimate government of China. The Americans still treated Taiwan as a diplomatic entity, however, conducting trade with the country and accepting its passports, for example.

The People’s Republic and the United States
After strengthening political ties in 1980, China reacted angrily to continued U.S. arms sales to Taiwan in the spring of 1981. The reaction may have been intended as a test of incoming Republican president Ronald Reagan. Many conservatives in the United States still looked fondly at the staunchly anti-Communist Taiwanese government. Reagan was aware that the friendship of the People’s Republic could be a tremendous asset in the Cold War against the Soviet Union. He therefore decided to send U.S. secretary of state Alexander Haig to Beijing to negotiate a resolution to the situation in June, 1981. Vice President George H. W. Bush also visited in May, 1982. On August 17, 1982, China and the United States agreed that the United States would decrease its arms sales to Taiwan, while China would pledge itself to a peaceful solution of its conflict with the Nationalists.

Deng Xiaoping’s ongoing emphasis on economic liberalization and modernization made him popular in the United States. In 1982, Deng encouraged Western companies to invest in China’s new special economic zones, as well as in the Chinese manufacturing and hospitality industries, ideally through joint ventures. Granted Most Favored Nation status in 1980 by the U.S. Congress, China developed a robust trade relationship with the Americans. By 1985, China and the United States traded goods worth $3.86 billion each. By 1989, however, Chinese exports to the United States were worth $11.99 billion, while U.S. exports had reached only $5.76 billion, creating a U.S. trade deficit that worried American economists.

U.S.-Chinese cultural relations improved steadily. China allowed many of its citizens to study in the United States, and a significant number arrived in 1983 and 1984. American intellectuals, scholars, and English teachers were invited to China, as were tourists. In 1984, the cordiality of U.S.-Chinese relations was demonstrated by President Reagan’s visit to Beijing and Chinese prime minister Zhao Ziyang’s voyage to Washington, D.C. In October, 1985, Vice President Bush visited China.

Tiananmen Square Incident
Up to 1989, there were few issues troubling U.S.-Chinese relations. American criticism of China’s harsh policies in Tibet did not substantively affect U.S. policy. American ecological worries about the Three Gorges Dam project in China also did not influence the Reagan administration’s actions. American fears for Hong Kong were alleviated by the Sino-British agreement of 1984. When the newly elected President Bush visited Beijing in February, 1989, he received a warm welcome. A setback came in the spring of 1989. Since April 17, 1989, Chinese students had been demonstrating in Beijing’s Tiananmen Square, demanding democratic reforms for the country. On May 30, they brought a plaster “Goddess of Democracy and Freedom” into the square, modeled after the Statue of Liberty. On June 3-4, Chinese troops violently cleared Tiananmen Square. Video of a single, unarmed Chinese man placing himself in the path of oncoming tanks on June 5 was broadcast all over the world and came to symbolize the incident. Americans were horrified. Non-Communist sources estimate that several thousand protesters were killed. The United States reacted swiftly with a series of political and economic sanctions but showed considerable restraint. China’s status as Most Favored Nation was renewed after some debate in 1990.

Taiwan and the United States

The United States’ decision to break diplomatic relations with the Republic of China in 1979 weakened its bond with a Cold War ally. Relations between the United States and Taiwan in the 1980’s were quasi-official. After their Mutual Defense Treaty expired in 1980 and U.S. troops left Taiwan, the United States maintained a diplomatic and intelligence presence on the island. Taiwan’s defense gained an important boost when the United States agreed to the Six Assurances in July, 1982. Taiwan’s economy boomed in the early 1980’s, and its exports to the United States surged. By the mid-1980’s, Taiwan was exporting a large quantity of labor-intensive, low-technology products to the United States. U.S. imports of Taiwanese footwear alone were worth $2.2 billion in 1986. A significant U.S. trade deficit with Taiwan led to the Americans’ demand for a stronger Taiwan dollar, which rose from being worth 2.5 cents in 1985 to 3.5 cents in 1987. Taiwan’s exports to the United States shifted to heavy industrial and high-technology products. After 1988, Taiwan’s transition to full democracy accelerated the island’s economic growth.

Impact

The United States’ commitment to friendship with China throughout the 1980’s led to a rapidly warming political, economic, and cultural relationship. The U.S. opposition to the Soviet Union benefited from good relations between the United States and China. American interest in China surged, Chinese students attended American universities, and Americans visited mainland China. Against the atmosphere of American optimism regarding relations with Communist China, which had led to a visible diminution of official U.S. support for Taiwan, the violent Tiananmen Square incident came as a sudden shock. The American public was aghast at
the killings, yet official U.S. sanctions of China remained muted. Some American critics and Chinese dissidents charged that in 1989, political and economic motives triumphed over human rights in the relationship between the United States and China.

Further Reading
Foot, Rosemary. The Practice of Power: U.S.-Chinese Relations Since 1949. Reprint. Oxford, England: Oxford University Press, 2004. Chapter 9 covers the 1980’s; extremely well researched using many Chinese sources; views U.S.-Chinese relations in a global context.

Link, Perry. Evening Chats in Beijing. New York: Norton, 1992. Paints a convincing picture of Chinese society and the role of young intellectuals up to the end of the 1980’s, which the author observed as a visiting scholar in Beijing from 1988 to 1989.

Mann, James. About Face: A History of America’s Curious Relationship with China, from Nixon to Clinton. New York: Vintage Books, 2000. Chapters 6-10 cover the 1980’s; the author was a journalist in Beijing from 1984 to 1987. Argues that Americans misjudged China’s will to promote democratic change and considers the Taiwan issue. Photos, notes, index.

Shen, Tong. Almost a Revolution. Reissue. Ann Arbor: University of Michigan Press, 1998. Eyewitness account leading up to and including the Tiananmen Square massacre by a former student leader participating in the event. Photos.

R. C. Lutz

See also
Bush, George H. W.; Business and the economy in the United States; Cold War; Foreign policy of the United States; Globalization; Olympic boycotts; Reagan, Ronald; Soviet Union and North America; United Nations.
■ Chrétien, Jean
Identification: Canadian politician
Born: January 11, 1934; Shawinigan, Quebec
A steadfast advocate of Canadian national unity, Chrétien campaigned vigorously against the secessionist 1980 Quebec referendum and helped assure passage of the historic Constitution Act, 1982. After a brief retreat from politics, he went on to become prime minister of Canada in 1993.
Minister of Justice Jean Chrétien, right, confers closely with Prime Minister Pierre Trudeau during the 1981 constitutional conference. (Library and Archives Canada/Robert Cooper)
When the Canadian Liberal Party returned to power under the leadership of Prime Minister Pierre Trudeau in 1980, Jean Chrétien was chosen to serve as the nation’s minister of justice and attorney general. One of the principal goals of Trudeau’s Liberal government was to pursue constitutional reforms that would strengthen the Canadian confederation and guarantee fundamental rights to all Canadians. Understandably, Chrétien and the other members of Trudeau’s cabinet viewed the rise of French Canadian nationalism in Quebec and the prospect of Quebec’s secession from the confederation as matters of grave concern.

As Canada’s minister of justice, Chrétien played a significant role in assuring the defeat of the Quebec referendum of May 20, 1980. Quebec was Chrétien’s home province, and he was profoundly opposed to its secession from Canada. In numerous speeches prior to the referendum, he appealed to the voters’ sense of national pride, evoking the Rocky Mountains and the Canadian Prairie as a common national heritage. He pointed to the shared economic advantages of a nation rich in oil and gas reserves, and he portrayed Quebec’s sovereignist leaders as eccentrics and egotists. In the end, Québécois voters favored the justice minister’s vision of confederation over the prospect of sovereignty, and the fragile unity of the nation was preserved.

Chrétien then turned his attention to the task of constitutional reform. For years, plans to adopt a Canadian charter of rights and freedoms had floundered because of opposition from provincial governments. In November of 1981, however, following months of preliminary dialogue and negotiation, Chrétien met with provincial leaders in Ottawa and worked out an acceptable compromise—the so-called notwithstanding clause—that was later embodied in section 33 of the Charter of Rights and Freedoms, which was passed as part of the Constitution Act, 1982.

Chrétien went on to serve as Canada’s minister of energy, mines, and resources from 1982 to 1984.
In 1984, he sought election as leader of the Liberal Party but was defeated by his rival, John Turner. In a political autobiography titled Straight from the Heart (1985), Chrétien reflected at length on the significance of that loss. After serving briefly as deputy prime minister in Turner’s short-lived 1984 government, he left Parliament in 1986, taking a break from politics to practice law until Turner resigned as party leader in 1990.

Impact

Historians have described Jean Chrétien as one of Prime Minister Trudeau’s most faithful lieutenants. Chrétien’s opposition to Québécois sovereignty and his leadership in the constitutional debates of the early 1980’s were instrumental to the realization of the Liberal Party’s push for a stronger federal government in Canada.

Further Reading
Martin, Lawrence. The Will to Win. Vol. 1 in Chrétien. Toronto: Lester, 1995.

_______. Iron Man: The Defiant Reign of Jean Chrétien. Vol. 2 in Chrétien. Toronto: Viking Canada, 2003.

See, Scott W. The History of Canada. Westport, Conn.: Greenwood Press, 2001.

Jan Pendergrass

See also Canada Act of 1982; Canadian Charter of Rights and Freedoms; Lévesque, René; Meech Lake Accord; Minorities in Canada; Quebec English sign ban; Trudeau, Pierre; Turner, John.
■ Chrysler Corporation federal rescue
The Event: Automobile giant Chrysler, facing bankruptcy, receives assistance from the federal government that helps it survive
Date: February 23, 1981

The controversial federal loan guarantees that saved Chrysler marked the first federal rescue of a business at a time when President Ronald Reagan advocated smaller government.

Chrysler Corporation entered the 1980’s, along with Ford Motor Company and General Motors, as one of Detroit’s Big Three automakers. The company had a storied history, yet Chrysler’s products no longer appealed to many American car buyers. Inferior automobiles and poor management brought the company to the brink of bankruptcy in 1981.
A History of Financial Woes

Chrysler’s management system was unable to keep pace with the increasingly sophisticated automobile business. Companywide optimism, moreover, prevented management from acknowledging the depth of the problem, because employees were confident that sales would be sufficient to mitigate any mistakes the company might make. The first Chrysler car had appeared in 1924. It sold well, and the company had established a solid financial position by the end of World War II. However, under Chairman K. T. Keller, Chrysler frittered away its advantages. Keller insisted on maintaining many prewar practices, refusing to adapt the company to the needs of the postwar marketplace. He disdained styling, for example, despite its popularity with consumers. Chrysler’s next chairman, Lester L. Colbert, struggled in vain to reverse the company’s dwindling momentum. Colbert increased plant capacity, trying to decentralize operations and to improve the manufacturer’s chronically poor relations with its dealers.

In the 1960’s, Chrysler rebounded. Under the guidance of Lynn Townsend, the company increased its market share to 18 percent, while establishing substantial overseas sales. It also stockpiled cars in expectation of future orders, however, a “sales bank” tactic that would later damage the company. By the 1970’s, other car companies had discontinued the sales bank strategy. Chrysler, however, was forced to continue building and storing cars that nobody had yet ordered in order to keep its plants running. By the time Lee A. Iacocca became chairman in 1978, Chrysler stood at the edge of bankruptcy.

The Bailout
In the 1980’s, American car companies faced stiff competition from Japanese firms that emphasized fuel efficiency and quality. Imports accounted for about 30 percent of the American automobile market, and some analysts doubted that room remained for three car giants in Detroit. In 1980, Chrysler lost $1.71 billion. No company in American history had ever lost so much money. By February, 1981, Chrysler owed its suppliers more than $300 million. To preserve American jobs and one of the largest American companies, Iacocca sought loan assistance from the federal government. On its last full day in office, the administration of President Jimmy Carter arranged a deal for Chrysler. The situation thus presented newly installed president Ronald Reagan with a dilemma.
The Republican Reagan had crusaded to reduce the role of government in economic affairs. If he lived up to his free market beliefs, he would allow Chrysler to collapse. Such a collapse would have been the largest bankruptcy in American history. It would also have angered the many car workers who supported Republicans, including Reagan, in the 1980 elections and could have spelled the end of the new majority coalition that Reagan hoped to build. Ford also had horrifying losses, however, and a rescue of Chrysler would set a precedent for government assistance to other businesses. Reagan came into office without taking a clear stand on Chrysler or the automobile industry’s plight. On February 23, 1981, the Reagan administration sanctioned $400 million in federal loan guarantees for Chrysler.

Support from the federal government did not ensure Chrysler’s survival. The company remained short of investment funds. It had reduced capital spending by 38 percent to qualify for federal assistance, at a time when Ford and General Motors were spending billions of dollars to develop new, smaller cars. According to the operating plan that it presented to the federal loan guarantee board, Chrysler expected to realize a profit, so long as it held 9 percent of the automobile market and had sales of at least $10 billion. In the second quarter of 1981, Chrysler had a 9.5 percent market share. It paid banks $71 million under a plan that allowed the company to write off most of its $1.3 billion of outstanding debt. Lenders accepted about half of the debt in Chrysler’s preferred stock. In February, 1982, Chrysler completed the second part of a two-step plan to pay off its debts. The February payment came six weeks ahead of the agreement’s deadline.

Iacocca restructured the company to improve Chrysler’s chances for long-term survival. Chrysler had employed 85,000 workers in 1981, down from 160,000 in 1979. About half of these workers lost their jobs, as Iacocca slashed more than $1.2 billion from workers’ salaries and benefits. The 40,000 blue-collar workers, represented by the United Auto Workers, who kept their jobs did so by granting wage and benefit concessions to help the company survive. Under the terms of the concessions, Chrysler paid about $3.50 per hour less than the other domestic automakers. It closed or consolidated twenty outdated plants, thereby reducing annual fixed costs by $2 billion. It managed to cut its break-even point in sales from 2.4 million units to 1.2 million units.
It increased fleet-average fuel economy to twenty-eight miles per gallon. The K-car platform (including the Dodge Aries, Plymouth Reliant, and Chrysler LeBaron) helped vault the company from near oblivion. The front-wheel drive compact range accounted for 70 percent of Chrysler’s output.

Impact

By August, 1982, Chrysler had put the threat of bankruptcy behind it, posting two consecutive profitable quarters. It did so at a time when all other domestic automakers suffered from a steep drop in sales. However, automobile analysts remained pessimistic about the company’s long-term survival prospects. At the time of Chrysler’s rescue by the federal government, financial analysts suggested the company would have the best chance of survival if it merged with another manufacturer, possibly a foreign firm. The company would remain independent, however, until the late 1990’s.

Further Reading
Hyde, Charles K. Riding the Roller Coaster: A History of the Chrysler Corporation. Detroit: Wayne State University Press, 2003. Complete history of the company from its beginnings through its merger with Daimler-Benz at the end of the twentieth century.

Jefferys, Steve. Management and Managed: Fifty Years of Crisis at Chrysler. New York: Cambridge University Press, 1986. A mid-1980’s perspective on pre- and postwar Chrysler management and the firm’s string of mistakes in the 1950’s and 1970’s.

Moritz, Michael, and Barrett Seaman. Going for Broke: Lee Iacocca’s Battle to Save Chrysler. Garden City, N.Y.: Anchor/Doubleday, 1984. Portrait of Iacocca and his struggles to rescue the Chrysler Corporation.

Caryn E. Neumann

See also
Business and the economy in the United States; Iacocca, Lee.
■ Claiborne, Harry E.
Identification: U.S. federal judge
Born: July 5, 1917; McRae, Arkansas
Died: January 19, 2004; Las Vegas, Nevada
Judge Harry Claiborne was removed from the bench for evading his income taxes. Claiborne was the first federal judge to be impeached in more than fifty years.
In 1986, the U.S. Congress impeached Nevada federal judge Harry E. Claiborne, convicted him, and removed him from the bench. Claiborne had once been known as a powerful defense attorney in Las Vegas, earning his reputation by representing powerful casino owners such as Jack Binion. In 1978, Claiborne was appointed by President Jimmy Carter to be a federal district judge. After Ronald Reagan assumed the presidency in 1981, Claiborne’s Las Vegas connections earned him the unwelcome attention of Reagan’s Department of Justice, which investigated the judge’s ties to Joe Conforte, the owner of the infamous Mustang Ranch brothel. The Justice Department gathered evidence that Claiborne had accepted bribes from Conforte. In 1984, Claiborne was charged with accepting bribes and with income tax evasion for not reporting the bribes as income. Conforte testified against the judge, but a Reno court was unconvinced, and the trial ended in a hung jury. Unable to prove that the money Claiborne had taken from Conforte was a bribe, the government focused on the fact that the payments were not reported as income, and a second jury found Claiborne guilty of federal income tax evasion.

While sitting in a federal prison, Claiborne continued to draw his federal judicial salary, prompting the Democratic-controlled House of Representatives to act. By a unanimous vote on July 22, 1986, the House impeached Claiborne on four charges of tax evasion and undermining the integrity of the judiciary. The impeachment was then sent to the Senate. During a Senate committee’s review of the evidence, Claiborne argued that his conviction was part of a vendetta by a Republican Justice Department and that his tax problems were the result of sloppiness on his part rather than any attempt to deceive the government. The committee presented evidence to the full Senate, which voted on his removal as a federal judge. The Senate convicted Claiborne on three of the four counts against him, with the aye votes ranging from eighty-seven to ninety. On the remaining count, the Senate was unable to achieve the two-thirds majority necessary to convict. Among Claiborne’s strongest supporters were the two Republican senators from Nevada, Paul Laxalt and Chic Hecht, both of whom voted against conviction on at least two counts of the impeachment. He also received the vote of Senator Orrin Hatch, a Utah Republican.
Democrats from Ohio, Louisiana, and Arkansas were also among those who voted against removal on the first three counts.

Impact

Judge Claiborne’s impeachment and removal were followed by the impeachment and removal of two more federal judges. The episode demonstrated that Congress could act in a bipartisan manner, as a Democratic House and Republican Senate worked together to remove a Democratic judge from his position.

Further Reading
Denton, Sally, and Roger Morris. The Money and the Power. New York: Vintage Press, 2002.

Gerhardt, Michael. The Federal Impeachment Process. Chicago: University of Chicago Press, 2000.

Volcansek, Mary. Judicial Impeachment. Champaign: University of Illinois Press, 1993.

Douglas Clouatre

See also Congress, U.S.; Conservatism in U.S. politics; Liberalism in U.S. politics; Supreme Court decisions.
■ Clancy, Tom
Identification: American popular novelist
Born: March 12, 1947; Baltimore, Maryland
The novels that Clancy published in the 1980’s reflected the concerns of the Cold War era and featured stories told from the points of view of characters serving in the military or the CIA. Later in the decade, as the Cold War drew to an end, Clancy’s novels began to address the post-Cold War era. Many readers, including fans and critics of Tom Clancy’s novels, credit the author with creating a new genre, the techno-thriller. Clancy’s novels combine military technological knowledge, political intrigue, espionage, and often terrorism. In 1984, Tom Clancy entered the publishing world with his first novel, The Hunt for Red October. First published by the Naval Institute Press, The Hunt for Red October owed much of its popularity to President Ronald Reagan, who praised the novel, calling it “the perfect yarn.” Reagan’s secretary of defense, Caspar Weinberger, also gave the book a positive review in The Times Literary Supplement. The novel soon made the best seller lists and was later made into a movie starring Alec Baldwin as Clancy’s hero, Jack Ryan.
Clancy published four more popular and successful novels during the 1980’s, including Red Storm Rising (1986), Patriot Games (1987), The Cardinal of the Kremlin (1988), and Clear and Present Danger (1989). All of Clancy’s novels of the decade were popular sellers, easily making their way to the top of best seller lists, and Clear and Present Danger was the best-selling book of the 1980’s. Focusing on the geopolitical issues prevalent in the 1980’s, Clancy’s novels gained a solid and loyal following of readers who continued to buy and read his books over the next two decades. Because Clancy’s novels had such a large audience and the appeal of a likable hero in Jack Ryan, several of his books were made into major motion pictures. These films include The Hunt for Red October (1990), Patriot Games (1992), and Clear and Present Danger (1994). After the first movie, Harrison Ford took over the role of Jack Ryan for the next two films. Ford’s portrayal of Jack Ryan in particular propelled Clancy’s novels and the movies to both popular and critical acclaim.

Impact

The success of his novels and the movies based on them allowed Clancy to publish several other fiction series, movies, young-adult books, nonfiction books, and video and board games. Though Clancy himself claims not to be the originator of the techno-thriller, his influence is evident in the work of many writers and filmmakers who have followed him. Whether he created the genre or not, he is clearly a master of it.

Further Reading
Baiocco, Richard, ed. Literary Companion to Contemporary Authors: Tom Clancy. Westport, Conn.: Greenhaven Press, 2003.

Greenberg, Martin H. The Tom Clancy Companion. Rev. ed. New York: Berkley Trade, 2005.

Terdoslavich, William. The Jack Ryan Agenda: Policy and Politics in the Novels of Tom Clancy—An Unauthorized Analysis. New York: Forge, 2005.

Kimberley M. Holloway

See also
Book publishing; Cold War; Ford, Harrison; Literature in the United States.
Tom Clancy. (John Earle)
■ Classical music
Definition: Compositional styles, composers, and works of art music
The experimental music of earlier decades, combined with increased competition from popular music genres such as rock and jazz, had practically eliminated the audience for new classical compositions. However, during the 1980’s, several twentieth century classical styles, including minimalism and postmodernism, reached aesthetic maturity and helped revitalize the genre.

As a result of diminishing audiences for classical music during the previous decades, it became important in the 1980’s to bring listeners back to the concert halls. While music was becoming more physically accessible through new technology such as the compact disc (CD), audiences were demanding more conceptually accessible music as well. Composers of challenging new pieces found it difficult to repeat performances of their new compositions, and fewer pieces were entering the classical repertoire. Film sound tracks, such as that of Amadeus (1984), the film based on Peter Shaffer’s play, reminded audiences of a fondly remembered, simpler style.
Many innovations, however, began to revitalize the classical genre. Vinyl recordings were reissued digitally on CD, making classical music more accessible with a greater fidelity and allowing lengthy works to be presented without interruption. Digital synthesizers became the norm instead of the older analog devices. To attract students to the study of music, academic courses on rock history entered the university curriculum, taking their place alongside courses on classical music and jazz. Prominent composers from earlier generations, such as Aaron Copland (1900-1990) and Samuel Barber (1910-1981), remained influential; however, most were on the lecture circuit and writing books, rather than composing new works. John Cage (1912-1992) published several books, including X (1983) and Anarchy (1988). Leonard Bernstein (1918-1990) published a collection of essays, Findings (1982), and in 1985 the National Academy of Recording Arts and Sciences honored him with the Grammy Award for Lifetime Achievement.

Minimalism and Its Influence
Minimalism, a compositional style that reduces music to its most basic elements, gained a new prominence in the concert hall. Steve Reich, whose works were primarily vocal and incorporated the use of electronic tape, found audience approval with such works as Tehillim (1981), The Desert Music (1984), and The Four Sections (1987). Philip Glass, a reigning leader of the movement, composed operas such as Satyagraha (1980) and Akhnaten (1983) and expanded his interest to theater and film as well. His debut recording on the CBS label, Glassworks (1982), was soon followed by the film score Koyaanisqatsi (1982); Songs from Liquid Days (1985), featuring lyrics by popular musicians David Byrne, Paul Simon, Laurie Anderson, and Suzanne Vega; and the theatrical work One Thousand Airplanes on the Roof (1988).

Often considered a post-minimalist, John Adams blended minimalist techniques with elements from popular and traditional harmonic music. His symphonic poem Harmonielehre (1985) incorporated styles from the Romantic period in music history. The three-act opera Nixon in China (1987) was based on Richard Nixon’s 1972 trip to China to meet with Communist Party leaders. Adams’s other important works from this decade include Harmonium (1981), for chorus and large orchestra, and his orchestral composition Short Ride in a Fast Machine (1986).
Postmodernism
Postmodernism in music is generally described as a return to traditional techniques in response to controversial modern movements. It can be presented in many forms. Composers may refer to the past through eclecticism—that is, incorporating quotations from earlier music or creating a collage of multiple quotations. Other composers may allude to past styles, while others deliberately resurrect nineteenth century Romanticism to form neo-Romanticism. These techniques were well in fashion during the 1980’s as a result of the efforts of composers Lukas Foss, George Crumb, and George Rochberg, although the most significant direct influences of these composers were upon previous decades.

David Del Tredici, known primarily for his vocal works, is regarded as a leader in neo-Romanticism. A professor of music at the City College of New York beginning in 1984, he served from 1988 to 1990 as composer-in-residence with the New York Philharmonic. He received the Pulitzer Prize in 1980 for In Memory of a Summer Day (1980). John Corigliano drew on a variety of styles from all time periods to compose his works. His opera The Ghosts of Versailles (1987) was centered on the ghosts of the court of Louis XVI. Symphony No. 1 (1989) was a memorial to friends who died from AIDS and incorporated quotations from some of their favorite pieces. Peter Schickele made frequent use of musical quotations and a variety of styles in his works. He became best known under the pseudonym P. D. Q. Bach, in which role he presented comedic parodies of many different musical styles. His “Howdy” Symphony (1982), for example, was a parody of Haydn’s “Farewell” Symphony.
Women Composers

Women emerged to the forefront of classical music in the 1980’s. Ellen Taaffe Zwilich’s work alluded to traditional compositional techniques while combining them with her unique modern style. The composition Three Movements for Orchestra (Symphony No. 1) received immediate acceptance with concert audiences and led to the award of a Pulitzer Prize in music in 1983—the first time that this prestigious honor was awarded to a woman. Joan Tower reached prominence as a composer in the 1980’s. Her first orchestral composition, Sequoia (1981), was an immediate success, leading to her position as composer-in-residence for the St. Louis Symphony Orchestra from 1985 to 1988.
Two significant works emerged from this period, Silver Ladders (1986) and the first Fanfare for the Uncommon Woman (1986).

Impact

The composers of the 1980’s used a combination of old resources and new techniques to bring audiences back into the concert halls. Minimalism and postmodernism in music served to soothe listeners’ ears while maintaining the integrity of each individual composer. The music of female composers also emerged as a standard, rather than an exception, in classical performance.

Further Reading
Burkholder, J. Peter, Donald Jay Grout, and Claude V. Palisca. A History of Western Music. 7th ed. New York: W. W. Norton, 2006. One of the leading music history texts used by academic institutions. Material is presented chronologically.

Gann, Kyle. American Music in the Twentieth Century. New York: Schirmer Books, 1997. Survey of music movements and their significant leaders from the beginning of the century through the 1990’s.

Hall, Charles J. A Chronicle of American Music, 1700-1995. New York: Schirmer Books, 1996. Extensive listing of highlights in American music, listed by year.

Simms, Bryan R. Music of the Twentieth Century: Style and Structure. 2d ed. New York: Schirmer Books, 1996. Excellent survey of important twentieth century music styles, structure, influential composers, and specific masterpieces.

P. Brent Register

See also
Art movements; Compact discs (CDs); Film in the United States; Glass, Philip; Jazz; Music.
■ Close, Glenn
Identification: American actor
Born: March 19, 1947; Greenwich, Connecticut
Throughout the 1980’s, Glenn Close’s prestige as a dramatic actor continued to increase. By the end of the decade, she had won a Tony Award and been nominated for five Oscars.

Born to a highly religious physician who operated a charity clinic in the Belgian Congo, Glenn Close spent her early years in Switzerland and Africa. Sent home for high school, she attended a private academy, Rosemary Hall.
Glenn Close poses with her People’s Choice Award for best actress in March, 1988. (AP/Wide World Photos)
(Michael Douglas, her co-star in Fatal Attraction, attended Choate; the two schools would merge in 1974.) During her high school years, Close, who was driven to act, organized a touring repertory theater group. She then attended the College of William and Mary, majoring in anthropology but also studying acting. At William and Mary, Close became a member of the distinguished honor society Phi Beta Kappa. In her youth, she also developed an interest in baseball and became a lifelong fan of the New York Mets, singing the National Anthem at the opening of the 1986 World Series. Close is a second cousin to actor Brooke Shields and a distant relative of Princess Diana.
After completing college, Close sought opportunities to work as an actor. In 1974, she obtained a position with New York’s Phoenix Theater, and her career was launched. Her first Broadway show was the 1974 production of Love for Love. In 1976, she appeared in the musical Rex. It was the musical Barnum (1980), however, that sent her to Hollywood. Director George Roy Hill was taken with her performance and offered her a role in The World According to Garp (1982). Close was featured in seven major films and three television specials during the 1980’s, including The Big Chill (1983), The Natural (1984), and Dangerous Liaisons (1988). In Greystoke: The Legend of Tarzan, Lord of the Apes (1984), she dubbed the lines of Andie MacDowell, and she provided a voice for the animated feature Gandahar (1988; Light Years). On Broadway, in addition to Barnum, she appeared in Tom Stoppard’s The Real Thing (1983), for which she won a Tony Award, and Benefactors (1985). However, it was her performance as the dangerously obsessive Alex Forrest in the thriller Fatal Attraction (1987) that firmly established Close as one of Hollywood’s biggest and most glamorous stars, and by the end of the decade she had been nominated for two Academy Awards for Best Actress and three for Best Supporting Actress. It was not surprising, therefore, that in 1988 she won the People’s Choice Award.

Impact

Close was a versatile actor whose talent served her well both on stage and on screen, but her fame and reputation in the late 1980’s resulted largely from her portrayals of villains Alex Forrest and the marquise de Merteuil in Dangerous Liaisons. Her glamour was thus tempered by a willingness to play unsavory characters that was somewhat unusual for top-tier screen actors. She would continue in the next decade to embrace such radically different roles as Gertrude in Hamlet (1990) and Cruella De Vil in 101 Dalmatians (1996).

Further Reading
Thomson, David. A Biographical Dictionary of Film. 3d ed. New York: Alfred A. Knopf, 1995.

Wilmeth, Don B., and Christopher Bigsby. Post World War II to the 1990’s. Vol. 3 in The Cambridge History of American Theatre. New York: Cambridge University Press, 2006.

August W. Staub

See also
Big Chill, The; Fatal Attraction; Film in the United States; Television; Theater.
■ Closing of the American Mind, The
Identification Best-selling critique of liberal arts education and the American university system
Author Allan Bloom (1930-1992)
Date Published in 1987

Bloom developed a coherent conservative philosophy of higher education and presented it as a sociopolitical criticism of American intellectual culture since World War II.

Allan Bloom asserted in The Closing of the American Mind: How Higher Education Has Failed Democracy (1987) that American education, especially higher education, had abandoned its classical values in the humanities and social sciences. Rather than following its former ideal of the rigorous study of "great books," well-defined curricula, and historically significant Western ideas, higher education—Bloom said—espoused trendy authors, experimental curricula, and dangerous new ideas. It had uncritically elevated the popular, tantalizing, and ignoble above the erudite, sublime, beautiful, and complex. The disinterested search for absolute truth, which Bloom claimed had motivated the Academy since the time of René Descartes and John Locke, had since the 1960's been superseded by the denial of absolute truth, which Bloom associated with postmodernism.

Bloom aligned himself with Socrates, Plato, and Aristotle, whom he saw as serious seekers of truth, and he opposed contemporary academics whom he portrayed as comparing and understanding various points of view without evaluating them objectively. He held that such indiscriminate toleration of other points of view led to a lack of discernment, which rendered the quest for truth impossible. Mounting a wholesale attack on both conservative and leftist philosophers of the twentieth century, Bloom rejected both analytic philosophy and deconstruction, because he believed that they both trivialized the monumental philosophical agenda that had occupied Socrates, Plato, Aristotle, Immanuel Kant, and Georg Wilhelm Friedrich Hegel. The deconstructionist method of Jacques Derrida, he claimed, was the last nail in the coffin of reason.

Bloom traced his own intellectual lineage to the distinctive conservatism of the eighteenth century Enlightenment, which he saw as characterized by absolutism and keen judgment. He decried the rise of multiculturalism and linked it to moral and cultural
relativism, blaming anthropologist Margaret Mead for the former and sociologist Max Weber for the latter. He identified Karl Marx, Sigmund Freud, and Thomas Kuhn as among the sources of relativism. An absolutist in ethics, he condemned the social movements of the 1960's and reproached leftists for making thinkers he saw as right wing, such as Friedrich Nietzsche and Martin Heidegger, speak for the Left. An elitist in aesthetics, Bloom hated rock and roll and subsequent derivative forms of music, which for him were merely sexual. He preferred the subtler emotions of classical music.

Bloom's arguments rang true with many who were dismayed at the continuing evolution of the Academy. His criticisms of education were neither new nor exclusively conservative: Similar criticism had been made when American universities began teaching American literature, rather than an exclusively English curriculum, and it had also been leveled in England against those who had introduced English literature into a previously Greek- and Latin-dominated curriculum.

Impact The Closing of the American Mind catapulted Bloom from being only a fairly well known social philosopher and translator of Plato to occupying a prominent place in the ranks of the conservative intellectuals of the Ronald Reagan era, including William Bennett, Robert Bork, Francis Fukuyama, E. D. Hirsch, and John Silber. Bloom's book was frequently considered alongside Hirsch's best seller, Cultural Literacy, which appeared the same year.

Further Reading
Buckley, William K., and James Seaton, eds. Beyond Cheering and Bashing: New Perspectives on "The Closing of the American Mind." Bowling Green, Ohio: Bowling Green State University Popular Press, 1992.
Graff, Gerald. Beyond the Culture Wars: How Teaching the Conflicts Can Revitalize American Education. New York: W. W. Norton, 1992.
Hirsch, Eric Donald. Cultural Literacy: What Every American Needs to Know. Boston: Houghton Mifflin, 1987.
Stone, Robert L., ed. Essays on "The Closing of the American Mind." Chicago: Chicago Review Press, 1989.
Eric v. d. Luft

See also Bennett, William; Bork, Robert H.; Conservatism in U.S. politics; Education in the United States; Gallaudet University protests; Mainstreaming in education; Multiculturalism in education; Political correctness; Standards and accountability in education.
■ CNN
Identification Twenty-four-hour cable television news channel
Date Debuted on June 1, 1980

CNN was the first twenty-four-hour daily news channel to deliver news to a global audience. By the end of the 1980's, people around the world relied on CNN as a primary news source.

Cable News Network (CNN) was the brainchild of entrepreneur Ted Turner. Turner built a family outdoor-advertising business into a communications empire through the acquisition of television and radio stations. He aligned himself with the fledgling cable industry to transform local Channel 17 in Atlanta into a "superstation" whose broadcasts reached a national audience. He invested in satellite dish technology and occasionally confronted the Federal Communications Commission (FCC) over approval of his acquisitions and innovations. Turner bought Atlanta's professional baseball and basketball teams, as well as the broadcast rights to Atlanta Flames hockey games, to build his catalog of programming and provide twenty-four-hour content. With this background of accomplishments, Turner turned his attention to creating a twenty-four-hour cable news network.

CNN on the Air CNN debuted on June 1, 1980. Despite the loss of the satellite originally scheduled to carry the network, 1.7 million cable subscribers received the signal through the Turner Broadcasting System (TBS) and saw live satellite feeds from around the world. Minutes into the first broadcast, CNN got its first scoop, airing live coverage of President Jimmy Carter's visit to the hospital bedside of wounded civil rights leader Vernon Jordan.

Borrowing elements from twenty-four-hour news radio, CNN's format was the news "wheel." Major news stories repeated throughout the day; new stories were added every so often. Breaking news always took precedence. CNN took the emphasis off the newscaster and placed it on the news itself. The network invested in portable satellites and in widening its cable network. The CNN news organization was a mix of veteran journalists, seasoned news managers, and low-wage college graduates who were infected by Turner's enthusiasm and his determination to make CNN work.

Chicken Noodle News Versus SNC
Broadcast news executives belittled CNN's launch, dubbing it "chicken noodle news" for its unsophisticated production values and tight budgets. CNN recorded $7 million in revenue and $16 million in losses in its first year. In 1981, CNN2 was launched with thirty-minute condensed news segments. Within eighteen months of CNN2's start, Ted Turner, near bankruptcy, fought off the launch of ABC-Westinghouse's competitive cable news channel, the Satellite News Channel (SNC). Turner bought SNC, acquiring its cable slots and adding more than one million viewers to his own network.
Growth and Respect CNN added specialty news segments on business, medicine, entertainment, and politics. In 1981, CNN anchor Bernard Shaw was the first to break the news about the attempted assassination of President Ronald Reagan. In 1982, the fledgling news network fought for and won a place alongside the major network organizations in the White House press pool. In 1986, CNN was the only network providing live coverage of the launch of the space shuttle Challenger when it exploded shortly after liftoff.

CNN continued to add programming and even other channels. The network acquired Crossfire, a political debate show in Washington, D.C. In 1985, it added popular radio talk-show host Larry King to its lineup with a nightly interview show that became a ratings success.

Turner pushed the boundaries of international news coverage. In 1981, he visited Cuba and initiated the first live broadcast from that nation since 1958. The International News Hour covered events in one hundred different nations. In 1985, CNN International was launched as a twenty-four-hour global news service, first in Europe and by 1989 in Africa, Asia, and the Middle East. When the broadcast networks dropped foreign news bureaus to cut costs, CNN swooped in to pick them up. In 1986, CNN cameras covered the aerial bombing of Libya.

In 1984, CNN was operating at a loss of $20 million a year. In 1985, the losses ended, as the company posted $123 million in revenues and $13 million in profits. The financial turnaround of the company was accompanied by growing respect for its journalistic accomplishments. In 1984, 1987, and 1988, CNN received the George Foster Peabody Broadcasting Award for program quality and excellence. In 1987, CNN moved from its original home, an abandoned country club, to new facilities with cutting-edge technology, the CNN News Center. The move signified the presence of CNN as a successful innovator in news production and distribution.

CNN chair Ted Turner emphasizes that television news need not be bad news at a luncheon of the Advertising Club of Cincinnati in January, 1981. (AP/Wide World Photos)

Impact CNN recognized a need in news consumers. Sometimes derided as "crisis news network" because audiences swelled during crisis coverage, CNN united millions of viewers in a shared experience. CNN's coverage raised awareness of national issues,
and its commitment to international coverage exposed viewers to a global perspective. In 1989, the world watched CNN as tanks rolled into Tiananmen Square in China and the Berlin Wall came down. Critics supported CNN's coverage of international news but condemned the lack of editorial process when news was delivered instantaneously. The trend toward instantaneous delivery continued, however, as the evolution of the media landscape was shaped both by CNN's twenty-four-hour format and by its strategies of crisis coverage.

The growth of cable television and of CNN were inextricably linked. Cable provided CNN with the means to reach a nationwide audience, and CNN brought audiences to cable. By 1986, cable was in 48.7 percent of the television households in the United States. Penetration of cable continued to rise, and CNN, carried on basic cable, continued to reach more households. As the first twenty-four-hour cable news channel, CNN provided a model for future competitors: the Consumer News and Business Channel (CNBC), the Microsoft-National Broadcasting Company co-venture MSNBC, and the Fox News Channel.

Further Reading
Auletta, Ken. Media Man: Ted Turner's Improbable Empire. New York: W. W. Norton, 2004. A personal portrait of Ted Turner.
_______. Three Blind Mice: How the TV Networks Lost Their Way. New York: Random House, 1991. Details the factors that led to the precipitous decline of broadcast network viewership. Excellent behind-the-scenes descriptions.
Hack, Richard. Clash of the Titans: How the Unbridled Ambition of Ted Turner and Rupert Murdoch Has Created Global Empires That Control What We Read and Watch. Beverly Hills, Calif.: New Millennium Press, 2003. A thorough examination of two extraordinary men. The book details the origins of CNN and of the FOX Network.
Nancy Meyer

See also Berlin Wall; Cable television; Challenger disaster; China and the United States; Libya bombing; Network anchors; Reagan assassination attempt; Turner, Ted.
■ Cold Sunday
The Event A cold wave disrupts the lives of millions of Americans
Date January 17, 1982
Place The United States from the Rocky Mountains to the Atlantic and Gulf coasts

On Cold Sunday, extreme cold, heavy snowfall, and high winds claimed lives and threatened the economic well-being of the United States.

During the first two weeks of January, 1982, an extensive polar high-pressure system developed over eastern Canada, as jet stream winds in the upper atmosphere shifted unusually far north before dipping southward. Weather services predicted that a vast accumulation of frigid air would move from Canada into the Midwest and the Northeast. Residents of Chicago experienced the coldest day on record on January 10, when thermometers registered −26 degrees Fahrenheit. Winds flowing above the cold air as it passed over the Great Lakes caused an extremely heavy snowfall in New York and Minnesota. The arctic air then rolled into a low-pressure trough extending from Ontario to the Gulf of Mexico, resulting in temperatures far below average throughout that region. Heavy snows and hazardous ice spread through parts of the South, the Ohio Valley, and the Middle Atlantic states.

Temperatures fell even further on January 17, a day that was dubbed "Cold Sunday" as a result of the record-breaking cold. Hardest hit were cities near the Great Lakes; they experienced temperatures ranging from −26 degrees Fahrenheit in Milwaukee to −52 degrees Fahrenheit in northern Minnesota. The day's high temperature in Philadelphia, zero degrees Fahrenheit, proved to be the lowest maximum temperature ever recorded in the Delaware Valley. Winds in Colorado reached hurricane force, gusting up to 137 miles per hour.

During the cold wave, a series of accidents, power outages, and other difficulties brought sections of the country to a near standstill. In Chicago, firefighters battled eight major fires on Cold Sunday, their efforts hampered by frozen hydrants and ice-filled hoses. Furnace fuel oil congealed in storage tanks in the Midwest, even as natural gas consumption peaked in six major eastern cities. High winds and icy roads caused thousands of automobile accidents, and commuters in New York, Boston, and
Philadelphia faced disabled subways and trains on the Monday following Cold Sunday. The death toll rose, as over 280 deaths were attributed to the cold conditions between January 9 and 19. Dozens of victims in unheated homes succumbed to hypothermia (low body temperature) or heart attack. In all, the cold wave of the first two weeks of January set some one hundred low-temperature records.

Impact The cold wave in which Cold Sunday fell cost hundreds of millions of dollars. Snow-removal expenses drained city budgets, and families struggled to pay soaring fuel bills. Grocery prices rose in response to southern crop failures, businesses were forced to operate on shortened hours, and retail sales fell. Economists feared that the subzero temperatures and continued bad weather would increase unemployment and inflation, as well as deepen the recession.
Further Reading
Ludlum, D. M. "Ten Days That Shook the Weather Record Book." Weatherwise 35 (February, 1982): 50.
"The Numbing of America." Time, January 25, 1982, 12-16.
Wagner, A. James. "Weather and Circulation of January 1982: A Stormy Month with Two Record Cold Waves." Monthly Weather Review 110, no. 4 (April, 1982): 310-317.
Margaret A. Koger

See also Agriculture in the United States; Business and the economy in the United States; Inflation in the United States; Natural disasters; Unemployment in the United States.
■ Cold War
Definition Period of tension and competition between two superpowers—the United States and the Soviet Union—lasting from 1945 to 1991
The Cold War defined U.S. foreign policy during the presidency of Ronald Reagan, who famously referred to the Soviet Union as an "evil empire." As a result, the threat of nuclear annihilation haunted the American popular imagination throughout the 1980's. At the end of the decade, however, the Cold War came to an end, as the Soviets instituted liberal reforms and the Berlin Wall was torn down. At the beginning of the 1990's, the so-called evil empire collapsed, and Russia and its former republics and satellite nations embarked on a project of rebuilding and transforming their governments.

The 1980's began only a week after the Soviet Union's invasion of Afghanistan on December 25, 1979. The decade ended only three days after the end of communism in Czechoslovakia on December 29, 1989, which capped an autumn defined by the collapse of communist regimes across Central Europe. In between, three American presidents governed during a decade that witnessed, first, a return of competitive and confrontational politics reminiscent of the worst days of the Cold War and, then, an abrupt about-face toward superpower cooperation, even in highly sensitive areas.

Afghanistan, Carter, and the Cold War By 1980, the Cold War had passed through several distinct stages. Born out of a series of Soviet aggressive maneuvers and U.S. responses to them during the 1945-1947 period, the Cold War had crystallized with the American decision in March of 1947 to make the containment of communism the anchor of its postwar foreign policy. There followed a fifteen-year period of U.S.-Soviet competition and increasingly global, confrontational politics that ended only when the Cuban Missile Crisis in October of 1962 brought the two superpowers to the brink of thermonuclear war.

A mutual desire to avert future confrontations led to a shift in U.S.-Soviet relations in the 1960's. Pre-1962 confrontation politics gave way to a six-year interim period, during which both superpowers sought to minimize the danger of an accidental nuclear war, even while continuing to pursue their competition with one another for international influence. This approach was augmented in 1969 by President Richard M. Nixon and his national security adviser, Henry Kissinger. The Nixon administration made a concerted effort to achieve a détente, or relaxing of tensions, first with the Soviet Union and later with China.

The Soviet invasion of Afghanistan put an official end to the détente era. President Jimmy Carter had criticized the détente policy as too one-sided when campaigning in 1976, on the grounds that Moscow had often exploited the United States' desire for better relations to advance its self-interest. After his election, though, Carter had continued the policy,
although he rarely referred to it by name. In 1980, however, he proclaimed it to be over, announcing that the United States would boycott the upcoming Olympic Games in Moscow in response to the Soviet occupation of Afghanistan and calling upon other nations to join a U.S. grain embargo of the Soviets until they withdrew their forces. Nor did the U.S. response end there. At a time when Ronald Reagan was promising to increase military spending by 7 percent per year during his 1980 presidential campaign against Carter, Congress reacted to Moscow's action by substantially increasing the military budget. The Cold War was thus already getting warmer when Reagan became president in January, 1981.

U.S. president Ronald Reagan talks with Soviet leader Mikhail Gorbachev during formal arrival ceremonies to welcome Gorbachev to the White House in December, 1987. The developing relationship between the two leaders helped bring about the end of the Cold War. (AP/Wide World Photos)

Reagan's First Term As President Reagan's first term began, the Soviet Union was juggling three major problems: A discontented population at home was growing tired of waiting for long-promised improvements in availability of consumer goods; the war in Afghanistan was becoming increasingly open-ended; and the Soviet economy, already strained by domestic woes, was increasingly burdened by the need to subsidize Moscow's client states in Eastern Europe. Reagan, a hawk by instinct, sought to exploit these weaknesses by ratcheting up the costs of Cold War competition on three fronts.

First, Reagan continued the arms buildup in response to renewed Soviet aggressiveness that had begun during Carter's last year, annually augmenting a military budget that had been substantially enlarged before he entered office. His strategy, designed to ensure U.S. military superiority over the Soviet Union, included two particularly provocative elements. On the offensive side, U.S. nuclear missiles were deployed in European sites so near the Soviet
Union that, in the event of a confrontation, they might have to be launched "on warning," rather than in response to a confirmed Soviet attack. Meanwhile, on the defensive side, Reagan proposed committing significant resources to the development of a controversial, satellite-based antimissile defense system called the Strategic Defense Initiative (SDI, known colloquially as "Star Wars").

Second, Reagan chose to fight a war by proxy against the Soviet units in Afghanistan. Using Pakistan as a staging ground, the Reagan administration funneled large amounts of military and economic assistance into the hands of the various anti-Soviet forces in Afghanistan, including foreign Islamic fighters who traveled to Afghanistan to join the insurgency. Among the items supplied were Stinger missiles, which were subsequently credited with defeating the Soviet mission by denying Soviet aircraft the command of the skies upon which Moscow's military strategy depended.

Finally, in order to erode Soviet influence in the developing world, the United States intervened indirectly in several low-intensity conflicts in developing nations. The strategy of rolling back Soviet influence in such areas became known as the Reagan Doctrine, and the principal battleground was the Western Hemisphere, where the United States actively aided the right-wing Contras in their campaign against Nicaragua's leftist government and backed El Salvador's government against leftist insurgents as well. The U.S. support of the anti-Soviet fighters in Afghanistan also fell under this doctrine, as did such other U.S. initiatives as the 1982 deployment of a peacekeeping force in Lebanon, where Soviet ally Syria was attempting to influence the outcome of the ongoing civil war.

"Mr. Gorbachev, Tear Down This Wall!"
On June 12, 1987, President Ronald Reagan gave a speech at the Brandenburg Gate in West Berlin, West Germany, in which he challenged Soviet leader Mikhail Gorbachev to prove his liberalism:

In the 1950's, Khrushchev predicted: "We will bury you." But in the West today, we see a free world that has achieved a level of prosperity and well-being unprecedented in all human history. In the Communist world, we see failure, technological backwardness, declining standards of health, even want of the most basic kind—too little food. Even today, the Soviet Union still cannot feed itself. After these four decades, then, there stands before the entire world one great and inescapable conclusion: Freedom leads to prosperity. Freedom replaces the ancient hatreds among the nations with comity and peace. Freedom is the victor.

And now the Soviets themselves may, in a limited way, be coming to understand the importance of freedom. We hear much from Moscow about a new policy of reform and openness. Some political prisoners have been released. Certain foreign news broadcasts are no longer being jammed. Some economic enterprises have been permitted to operate with greater freedom from state control.

Are these the beginnings of profound changes in the Soviet state? Or are they token gestures, intended to raise false hopes in the West, or to strengthen the Soviet system without changing it? We welcome change and openness; for we believe that freedom and security go together, that the advance of human liberty can only strengthen the cause of world peace. There is one sign the Soviets can make that would be unmistakable, that would advance dramatically the cause of freedom and peace.

General Secretary Gorbachev, if you seek peace, if you seek prosperity for the Soviet Union and Eastern Europe, if you seek liberalization: Come here to this gate! Mr. Gorbachev, open this gate! Mr. Gorbachev, tear down this wall!

Throughout his first term, President Reagan's rhetoric generally matched or exceeded his hawkish policies. The Soviet Union was stigmatized as "the Evil Empire," and assorted other issues, such as state-sponsored terrorism, were often attributed to its machinations. Even the succession to power in 1985 of the reform-minded Mikhail Gorbachev did not initially halt the flow of Cold War rhetoric, with Reagan famously challenging Gorbachev to prove his liberalism in Berlin ("Mr. Gorbachev, tear down this wall"). Consequently, the first meeting of these two heads of state ended in a chilly swap of Cold War shopping lists, and U.S.-Soviet arms reduction talks collapsed when Gorbachev tied Soviet arms reductions to the abandonment of the U.S. SDI project.

Reagan's Second Term
Gradually, meetings between the two world leaders became more cordial, as it became clear that Gorbachev was sincere in his efforts toward reform. The communist leader introduced programs to restructure the Soviet Union's command economy (perestroika) and open public discourse (glasnost). More important, by the end of Reagan's second term in office, U.S.-Soviet relations had moved rapidly from confrontational policies and rhetoric, through a return to détente—including cordial summit meetings in Washington (1987) and Moscow (1988)—to cooperation in that most sensitive of areas, arms control. In December of 1987, the two superpowers negotiated the Intermediate-Range Nuclear Forces (INF) Treaty, an arms-limitation treaty calling for the dismantling of short- and medium-range offensive missiles and the establishment of an international inspection system to police the process. Such a treaty would have been unthinkable only two and one-half years earlier, when Gorbachev came to power at one of the lowest points in U.S.-Soviet relations.
The Bush Presidency and the Soviet Collapse
Reagan’s successor, George H. W. Bush, continued to preside over the winding down of the Cold War, maintaining the relationship of personal trust with Gorbachev that Reagan had developed by the end of his term. In 1991, Bush concluded negotiations that had been begun by Reagan in 1982, when he signed the first Strategic Arms Reduction Treaty (START I), which placed limits on long-range nuclear weapons. He signed a second treaty, START II, in January, 1993, the same month his presidency ended. By then, the Soviet Union had lost its empire in a series of popular uprisings against Eastern European communist regimes during the summer and fall of 1989, and it had itself dissolved following a failed military coup against Gorbachev during the summer of 1991, which unleashed dissident forces in the Soviet Union’s non-Russian republics that Gorbachev was never able to overcome. With the dissolution of the Soviet Union, the United States declared itself the victor in the Cold War.
Impact The fall of the Soviet Union triggered a debate, all too often shaped by partisan considerations,
regarding the impact of Reagan's policies on the collapse of communism there and throughout Eastern Europe. The more laudatory analyses of Reagan's influence hold that, in standing up to Soviet aggressiveness in the 1980's, President Reagan forced Gorbachev to reform at home, setting into motion the series of events that culminated not only in the Soviet Union's withdrawal from Afghanistan in February of 1989 but also in the U.S. victory in the Cold War.

Detractors argue that the Soviet Union was already mortally damaged, largely by self-inflicted wounds, when Reagan assumed the presidency. This argument stresses the Soviet Union's prior unwillingness to decentralize its inefficient command economy during the 1970's, which resulted in the widespread domestic economic dissatisfaction that Gorbachev inherited in the mid-1980's. It also focuses on the Soviet invasion of Afghanistan, which further strained an economy already overtaxed by the Soviet Union's need to subsidize the equally inefficient economic systems of its clients in Eastern Europe. Viewed from this perspective, Reagan is occasionally reduced to being the man who happened to be in the White House when a modernizing leader assumed power in the Kremlin, loosened the reins of control on both the Soviet economy and cultural and political discourse, and engaged in a by-then-unavoidable scaling back of Soviet international adventurism.

The truth probably lies somewhere between these two arguments. While the often infirm old guard continued to rule in the Kremlin during the early 1980's (Leonid Brezhnev until 1982, Yuri Andropov from October, 1982, to February, 1984, and Konstantin Chernenko from February, 1984, to March, 1985), Reagan's revival of an arms race raised the costs of continuing Cold War competition with the United States to levels the Soviet economy could no longer bear. The same arms buildup also added massive deficits to the federal budget in the United States, but the latter had a larger economy and—even in the global recessionary years of the mid-1980's—one better able to withstand the strain in the short term. It is therefore likely that Gorbachev was forced to move faster in liberalizing policies at home than he might otherwise have done and that the Politburo was pressured to go along with these policies. It is equally likely that he was pushed into arms reduction agreements by these economic realities at home as much as by his awareness of the dangers of nuclear confrontation and his growing trust of President Reagan.

At the same time, however, given Gorbachev's vulnerability, the U.S. "victory" over communism came at a cost: Gorbachev's willingness to compromise with liberals at home and with cold warriors in the United States was the reason cited by those who sought to depose him in the summer of 1991. While he survived the coup, he did not survive much longer politically, and by the end of 1991, power in the Kremlin was in Boris Yeltsin's hands. Gorbachev's fall had much to do with Reagan's Cold War rhetoric and policies. It was very difficult for Gorbachev to consolidate his hold on power at home while simultaneously adopting a more dovish position toward the United States, an extremely hawkish opponent. After the Cold War, moreover, the debt accumulated by the United States to end it continued to grow for decades, becoming a seemingly permanent part of the U.S. federal budget.

Further Reading
Brune, Lester H. Chronology of the Cold War, 1917-1992. New York: Routledge, 2006. A lengthy (seven-hundred-page), authoritative, and detailed summary of the Cold War that faithfully takes readers through its final moments during the 1980's and into the implosion of the Soviet Union in 1991-1992.
Cannon, Lou. President Reagan: The Role of a Lifetime. New York: Simon & Schuster, 1991. One of the best accounts of the man widely praised for winning the Cold War, written by his longtime biographer.
Collins, Robert M. Transforming America: Politics and Culture in the Reagan Years. New York: Columbia University Press, 2007. More scholarly than Cannon's work, this volume offers specific chapters on Reagan's relations with the Soviet Union and his role in winning the Cold War.
Hook, Steven W., and John Spanier. American Foreign Policy Since World War II. 16th ed. Washington, D.C.: CQ Press, 2007. A standard short text on the topic, with outstanding chapters on the revival of confrontation politics during the 1980's and the Cold War's conclusion at the decade's end.
LaFeber, Walter. America, Russia, and the Cold War, 1945-2006. Boston: McGraw-Hill, 2006. Widely available work that carefully places the events of the 1980's into the context of the superpower conflict that dominated international affairs for nearly half a century.
Joseph R. Rudolph, Jr.

See also Berlin Wall; Bush, George H. W.; Foreign policy of the United States; Middle East and North America; Military spending; Olympic boycotts; Olympic Games of 1980; Reagan, Ronald; Reagan Doctrine; Reaganomics; Soviet Union and North America.
■ Color Purple, The
Identification Novel
Author Alice Walker (1944-    )
Date Published in 1982

The Color Purple stirred great controversy upon its publication in 1982; it was both hailed and attacked for its characterization of gender roles, its interpretations of sexuality and religion, and its portrayal of the strength and results of the bonds of female friendship.

The controversy generated by The Color Purple (1982) continued throughout the 1980's. Author Alice Walker, a self-described "womanist" (a term meant to oppose the largely white and middle-class associations of "feminist"), defended the book against charges of male-bashing and reverse sexism. These accusations stemmed from Walker's critical portrayal of male characters, particularly the central male protagonist, referred to namelessly in the first chapters of the book as "Mr.____." Walker responded that her accusers failed to read the book through to the end, or to study it carefully enough to see her real message, a variation on universal salvation for men and women of all races and social stations.

The novel's central character, an African American woman named Celie, is sexually abused by her stepfather, Alfonso, and bears two children by him. She is forced to relinquish both. Alfonso then forces Celie to marry Mr.____, while targeting her sister, Nettie, as his next victim. Nettie runs away and ends up in Africa, sending letters to Celie that are intercepted and hidden by Mr.____. Throughout many chapters, Celie and other women in the novel are victimized by men who are subtly portrayed as being victims themselves. Celie writes to God and expects help, but it does not come.
Through writing the letters, however, she gains a means of expression and eventually strength. The arrival of her husband's former wife signals a new life for Celie; the ex-wife, Shug, is a free spirit who shows Celie how to control her own destiny rather than be a passive victim of others. Eventually, the two women form a close spiritual and sexual bond. Mr.____ is left alone and desolate, cursed to live a rootless and loveless existence until he repents for his extreme mental and physical cruelty. Once he repents, his character is referred to as "Albert" and no longer functions as a nameless symbol of oppressive men. Throughout the novel, female friendship is characterized as a means of rising above oppression, patriarchy, racism, and violence. God is seen as a force rather than a being and as such provides real and spiritual comfort.

Writer Alice Walker in 1983, after winning the Pulitzer Prize for The Color Purple. (AP/Wide World Photos)

Impact The multilayered gendered, racial, social, and religious aspects of The Color Purple made it one of the most controversial and widely dissected books of the 1980's. A new dialogue on women's issues—particularly African American women's issues—was brought to public attention, causing a cultural focus on domestic violence, sexism, racism, and same-sex relationships. The book gained an even wider audience in 1985, when it was adapted into a movie by Steven Spielberg.

Further Reading
Bloom, Harold, ed. Alice Walker's "The Color Purple." Philadelphia: Chelsea House, 2000.
Dieke, Ikenna. Critical Essays on Alice Walker. Westport, Conn.: Greenwood Press, 1999.
Light, Alison. Fear of the Happy Ending: "The Color Purple," Reading, and Racism. London: Chelsea House, 1987.
Twyla R. Wells

See also African Americans; Book publishing; Domestic violence; Feminism; Homosexuality and gay rights; Literature in the United States; Multiculturalism in education; Rape; Women's rights.

■ Colorization of black-and-white films
Definition A process to add color to black-and-white, sepia, or monochrome motion-picture images
Colorizing film tediously by hand was possible even for the earliest filmmakers of the 1890's. By the 1980's, however, computers made it possible to add color to entire films far more efficiently. The process was seen as a profitable way to attract television audiences to the many old black-and-white films to which television networks owned broadcast rights. Critics, film historians, and movie purists loudly denounced the practice, however, creating a long-running controversy.

Colorization by computer is a process invented by Wilson Markle. A new film print is struck from the original black-and-white negative, and a videotape copy is then made from that print. Then, using a computer, color is added to each object in each frame of the print. As the original colors of complexions, hair, clothes, furniture, and other visible objects are not always known, the colorists must rely on common sense, aesthetics, and their own judgment. Often research into studio archives produces information or photographs of sets and costumes that allow color choices to be authenticated.
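The procedure just described (isolate each object in a frame, then give its pixels a chosen color while preserving their gray values) can be sketched in a few lines of modern array code. The following Python fragment is purely illustrative rather than a reconstruction of any company's proprietary software; the function name, the pre-traced object masks, and the color palette are all assumptions made for the example.

import numpy as np

def tint_frame(luma, masks, palette):
    """Colorize one black-and-white frame.

    luma    -- 2-D array of floats in [0, 1] (the monochrome frame)
    masks   -- dict of {object name: boolean array marking its pixels},
               standing in for the regions a colorist would trace by hand
    palette -- dict of {object name: (r, g, b) floats in [0, 1]}
    """
    # Start from a neutral gray RGB copy of the frame.
    frame = np.repeat(luma[:, :, np.newaxis], 3, axis=2)
    for name, mask in masks.items():
        tint = np.asarray(palette[name])
        # Multiplying luminance by the tint keeps shadows dark and
        # highlights bright, so the original shading survives.
        frame[mask] = luma[mask][:, np.newaxis] * tint
    return np.clip(frame, 0.0, 1.0)

# A toy 2-by-2 frame in which the left column is masked as "sky."
luma = np.array([[0.8, 0.5],
                 [0.6, 0.2]])
masks = {"sky": np.array([[True, False],
                          [True, False]])}
palette = {"sky": (0.4, 0.6, 1.0)}
print(tint_frame(luma, masks, palette))

Because the gray value multiplies the tint, a colorized frame keeps the contours and shading of the original photography even when the chosen hues are guesses, which is one reason the technique could be applied to films whose true colors were unknown.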
The Pros and Cons of Colorization With the increased popularity of old movies on television in the 1970's and 1980's, it was clear to studio heads that the huge backlog of black-and-white films and television shows could fill many hours of air time, cost very little, and produce healthy profits. The audiences most coveted by television stations, however, were young people who had grown up watching most films in Technicolor. Black-and-white films did not appeal to them. Colorization was clearly the answer. The process had been used successfully to color the black-and-white pictures of the Moon taken during a 1970 Apollo mission. Since that time, the colorization process had been improved by several different companies, each developing slightly different computer technologies, such as Neural Net, pattern recognition, and background compositing, and interactive processes that allowed pixels of similar tones automatically to be given similar colors.

The downside of colorization was its expense and labor-intensiveness. Colorizing a film or an old black-and-white television show was estimated at one time to cost $3,000 per minute of running time. The film or show had to be colored frame by frame. Single objects were digitally tinted in each frame, one at a time, until every object in that frame was colored. One old film or show could therefore cost $300,000 or more (a one-hundred-minute feature at $3,000 per minute comes to $300,000). Still, a colorized film shown on television could generate revenue of at least $500,000, and even more revenue might come from the sale of videocassettes, so colorization seemed a good business plan.

Colorization's High Point Television mogul Ted Turner bought all or parts of the film libraries of the Metro-Goldwyn-Mayer (MGM), Warner Bros., and Radio-Keith-Orpheum (RKO) movie studios. He commissioned Color Systems Technology to begin colorizing more than one hundred of his movies over the next few years to make them more appealing to television viewers. Yankee Doodle Dandy (1942) and Topper (1937) were two of the first black-and-white films redistributed in color. Controversy erupted when Turner said he intended to colorize the iconic Citizen Kane (1941).
Critics, historians, film directors, and fans all decried the plan, calling colorization “cultural vandalism” and film “bastardization.” Turner responded that he had been joking; he had no intention of colorizing Citizen Kane. Actually, the film was still under the control of the Orson Welles estate, whose permission was needed for anyone to tamper with it in any way. Turner did, however, proceed to colorize several other movies, including the venerable Casablanca (1942). Sufficient outcry among movie directors and others in the film industry caused Congress to create the National Film Registry in 1988. This registry was a list of movies, chosen by the Library of Congress at a rate of twenty-five per year, that were deemed to be culturally, historically, or aesthetically significant. The National Film Preservation Act of 1988 made it illegal to distribute or exhibit a colorized version of a black-and-white film included in the registry, unless the film was labeled with a suitable disclaimer. The Controversy
Those who considered film an art form considered colorization an immoral appropriation of the original filmmaker’s conception. They contended that black-and-white films were works of art in a form created by the filmmaker and were not to be altered by anyone else for purely monetary gain. Even those who saw films as collective artworks, rather than realizations of personal vision, asserted that colorization was simply ugly, that the technology was not advanced enough to produce satisfactory results, and that the black-and-white originals were more aesthetically pleasing than their colorized versions. Filmmakers, however, had no legal rights over their films, which in almost all cases belonged to studios and production companies, not to directors. Consequently, the corporation that owned a film could colorize it regardless of its creator’s desires. Film directors especially felt that colorization destroyed the artistic integrity of their blackand-white films. They felt that if studios were allowed to add color, there would be nothing to prevent them from adding different sound tracks, introducing additional scenes, or even reediting the entire film.
Impact As the decade waned, so did the interest in colorized films, especially since, as critics had pointed out, most of them had washed-out colors and overly soft contrasts. The colorized films could obviously not match the high quality either of the original
black-and-white cinematography they replaced or of contemporary films originally shot in color. The cost of colorizing remained high, even with advances in computer technology, and the integration of color into the films was less than satisfactory. As a direct result of the controversy, though, the U.S. government began to maintain an annually growing list of films that were considered to be part of Americans' cultural heritage, increasing both funding for and interest in film preservation and film history.

As television channels began to proliferate and the need for reasonably priced programming increased, colorized television shows seemed more appealing. The cost of colorizing favorite black-and-white television shows was far less than making new shows in color, and there were no residuals to pay to actors, directors, and others who had either passed on or were contractually excluded. In spite of the strong argument in favor of colorizing old shows, McHale's Navy was one of only a very few television shows to be colorized during the 1980's. Though many disagreed about its value, colorization offered a process to reintroduce old black-and-white films and television shows to present and future generations.

Further Reading
Grainge, Paul. "Reclaiming Heritage: Colourization, Culture Wars, and the Politics of Nostalgia." Cultural Studies, October, 1999, 621-638. Discusses the controversy over film colorization's impact on American culture and its regard for early movies.
Mathews, Jack. "Film Directors See Red Over Ted Turner's Movie Tinting." The Los Angeles Times, September 12, 1986, sec. 6, p. 1. A detailed account of the reactions of filmmakers to Ted Turner's decision to colorize black-and-white films he had purchased from large Hollywood studios. Comments from proponents and opponents, including Woody Allen, Billy Wilder, and colorizing company executives, give a balanced discussion of both views.
Sherman, Barry L., and James R. Dominick. "Perception of Colorization." Journalism Quarterly 65 (Winter, 1988): 976-980. Presents research data showing mostly favorable audience responses to colorized films; audiences found them more contemporary than black-and-white films.
Young, James O. "Still More in Defense of Colorization." Journal of Aesthetics and Art Criticism 50 (Summer, 1992): 245-248. Discusses arguments for and against film colorization, supporting the view that it is not morally equivalent to tampering with such instantiated artworks as paintings or sculpture.
Jane L. Ball

See also Computers; Film in the United States; Special effects; Television; Turner, Ted.
■ Comedians
Definition Performers of humorous material on stage and in film, television, and recordings

By the 1980's, comedians were performing live not only in small comedy clubs, nightclubs, and theaters but also in huge sports arenas. Meanwhile, the advent of cable television allowed them to reach even wider audiences.

By the 1980's, comedians' role had become important in the entertainment world; audiences flocked to see them. Comic repertoires had expanded significantly during the 1960's and 1970's. Some comedians' routines continued to comprise relatively uncontroversial tall tales, jokes, and one-liners about mothers-in-law or "walking into a bar," but others encompassed largely untapped topics like race relations, political figures, and sex. It was not lost on impresarios that comedians could attract large paying crowds in huge venues. Popular comedians continued to perform on television and in movies, while the rise of cable television channels such as Home Box Office (HBO) allowed them to reach vast audiences with their routines virtually uncensored.

Comedic Subgenres At first, young comics finding their voices during the second half of the twentieth century tended to follow the style of earlier comedians like Jack Benny. By the 1980's, however, they began varying their material. Nevertheless, they tended to fall into recognizable subgenres.

Observational comedians, for example, talk about their own everyday lives (whether actual or vastly exaggerated). They make fun of normal society, often focusing on and ascribing great importance to life's seemingly trivial minutiae. Observational comedians comment on their perceptions of family, friends, and even strangers. Almost any familiar activity or practice is suitable for their comedy. For example, when
carpool lanes became fairly common on freeways in the 1980's, one comic told a joke about driving with two pet dogs wearing tiny hats so he could use the carpool lane. Delivery plays an important part in observational comedy.

Character comedians assume a persona other than their own on stage. The character adopted by the comedian is often a stereotype easily recognizable to the audience. The humor associated with such performances is often recognition humor, as a skilled comic may simultaneously capture and parody a voice or attitude that audience members have encountered in their lives. At other times, performers will use personas to set up expectations and then violate them, using surprise either to get laughs or to add depth to an act. Whoopi Goldberg demonstrated mastery of this technique in a stand-up routine in which she assumed the persona of a young Valley girl. Her intonation and delivery perfectly mimicked actual Valley girls, but she strayed into serious territory when her persona described
becoming pregnant and terminating her pregnancy. Televised sketch comedy shows also provide venues well-suited to character comedians: Eddie Murphy's success on Saturday Night Live owed much to his skill portraying characters such as a grown-up version of child star Buckwheat, the animated character Gumby, and Mr. Robinson (an inner-city version of children's television host Fred Rogers).

Prop comedians rely on their skill at slapstick and improvisation to interact with objects or costumes in humorous ways. Sometimes considered unsophisticated, these comedians depend on silliness, exaggeration, pratfalls, outlandish outfits, and other tricks considered to be passé by some successful 1980's comedians. Still, Phyllis Diller, using fright wigs, garish garments, and a maniacal laugh, became extremely successful in the 1960's and 1970's, and remained successful in the 1980's. Gallagher, whose trademark was smashing watermelons with a sledgehammer, was perhaps the most successful prop comedian of the 1980's.
Comedian Richard Pryor (left, with chicken) appears on The Tonight Show with host Johnny Carson in October, 1986. (AP/Wide World Photos)
Some 1980's comedians made jokes about dark subjects like death, rape, drugs, war, and terrorism. With the Vietnam War only recently over and many social and political issues on the minds of Americans, comedians saw dark comedy as a way for the nation to face its "demons" and laugh at them. Although late-night talk-show host Johnny Carson was not known for dark comedy, even he managed sometimes to draw on such subjects for a laugh; he got one when he told an audience that hair and fingernails continue to grow for three days after death—but phone calls taper off. Satirists, meanwhile, used news and current events to make light of issues on people's minds. Satire, an age-old technique of writers and performers, appealed to the more cerebral comedians.

Successful 1980's Comedians In 2004, television's Comedy Central channel compiled its list of the one hundred greatest stand-up comedians of all time. Eight of the top twenty comedians from that list performed at their peak in the 1980's. At the top of the list was Richard Pryor. An African American, Pryor told stories illuminating racial issues and customs. These stories were often laced with vulgarities, racial epithets, and other potentially controversial language. He made recordings, appeared on television, and by 1983 solidified his success by starring in successful motion pictures.

Bill Cosby, another African American, also started his career performing in comedy clubs. In 1965, he became the first African American male to star in a television drama when he was cast in I Spy. In the 1980's, Cosby produced and starred in one of the decade's most successful situation comedies (sitcoms), The Cosby Show. His comedy was warm, witty, observational, and narrative; it was never indecent or vulgar. Thus, it ran counter to a dominant trend of the decade, as many comics embraced vulgarity as part of their act.

Roseanne Barr was one of several highly successful female comedians of the decade. She assumed the persona of the typical American working-class housewife, whom she called a "domestic goddess," and appeared on many television shows that spotlighted her wry comments. She soon made her persona into the basis for a sitcom, Roseanne, in which she played housewife Roseanne Connor.

Johnny Carson continued as host of The Tonight Show throughout the 1980's, having established himself as the "king of late night." Carson himself was a
skilled comic, quick-witted and charming, whose stand-up monologues addressed contemporary events and personages. His show was far more important, however, as a showcase for other comedians. Carson became the ultimate gatekeeper of national comedic success in the United States. Any stand-up performer who did well enough on The Tonight Show to be invited to sit on the couch and talk to Carson for a few minutes afterward would find that his or her career had been made, and any performer who desired national success had to secure an invitation to perform on the show.

Impact As the number of television channels increased, there was more airtime to fill but not a surplus of money to pay for new content. Stand-up comedy was extremely cheap, requiring minimal crews, a single-person cast, and no sets or special effects to speak of. Despite that, it was extremely popular with audiences, so it offered broadcasters a great deal of bang for the buck. Even fully produced half-hour sitcoms were significantly less expensive than hour-long dramas. Comedians thus became an ever more sought-after commodity. Though television variety shows lost favor, comedians were employed to star in or host thirty-minute sitcoms, talk shows, specials, and award shows.

Further Reading
Ajaye, Franklin. Comic Insights: The Art of Stand-Up Comedy. Beverly Hills, Calif.: Silman-James Press, 2002. Interviews of comedians such as Roseanne Barr and Jay Leno, as well as comedy club owners, agents, and others, who discuss the business of comedy, comedians' inspirations and motivations, and practical tips for becoming a comedian.
Epstein, Lawrence. The Haunted Smile: The Story of Jewish Comedians in America. New York: Public Affairs Books, 2001. A history of Jewish comedians' impact on American entertainment, including such comics as Andy Kaufman, Richard Belzer, Alan King, and Woody Allen.
Littleton, Darryl J. Black Comedians on Black Comedy: How African Americans Taught Us to Laugh. New York: Applause Theater and Cinema Books, 2006. Covers some history of African American comedy along with biographical information on such 1980's comedians as Eddie Murphy, Damon Wayans, Richard Pryor, and Bill Cosby.
Jane L. Ball

See also Action films; African Americans; Cosby Show, The; Film in the United States; Letterman, David; Martin, Steve; Murphy, Eddie; Murray, Bill; Sitcoms; Talk shows; Television; Williams, Robin.
■ Comic Relief
The Event Televised live comedy fund-raiser for the homeless
Date Aired March 29, 1986

Comic Relief was a fund-raising event designed to aid homeless people in eighteen states and twenty-three cities.

Produced by Bob Zmuda, Comic Relief was based on a similarly titled British show, which aired in the United Kingdom in 1985. The British show, produced by comedy screenwriter Richard Curtis, was itself inspired by an earlier 1985 event, Live Aid, which was a mammoth benefit rock concert organized to collect money to relieve hunger in Ethiopia. Live Aid aired internationally and was an enormous success, spurring Curtis to organize an event of his own to raise money for African relief. He recruited comedians and comedy writers to participate in another televised benefit, modeled after Live Aid but with a comedic rather than a musical format.

The following year, Zmuda enlisted Billy Crystal, Robin Williams, and Whoopi Goldberg to act as emcees for an American version of the event, which was to be a live, three-hour comedy show. The proceeds from the event went to help homeless people in the United States, whose growing numbers were a significant social problem of the 1980's. The premium cable channel Home Box Office (HBO) agreed to air the show live and to record it for future airings. HBO provided free access to the show, so television viewers could watch it regardless of whether they subscribed to the channel.

The first show—which featured a combination of live stand-up comedy acts, films of the homeless, stories of their struggles, and recorded pleas by celebrities for donations—was the first national telethon for homeless relief in the United States. Forty-seven comedians performed in the broadcast, which raised more than $2.5 million. The success of the show motivated its organizers to turn it into an annual event, and a total of thirteen Comic Relief USA shows were organized from 1986 through 1998. All were filmed for rebroadcast, enabling them to raise more money over time.
Impact During the twentieth century, Comic Relief USA raised more than $50 million. The money was used to aid homeless people in the United States and to fund humanitarian aid to people in African nations. The show's success enabled organizers to bring relief to many people in need, and it gave reality to Comic Relief's official slogan, "Where there is laughter, there is hope."

Subsequent Events After a seven-year hiatus, the original American Comic Relief team reunited in 2006 to raise money to aid the victims of Hurricane Katrina.

Further Reading
Gold, Todd, ed. Comic Relief: The Best of Comedy for the Best of Causes. New York: Avon Books, 1996.
Redburn, F. Stevens, and Terry F. Buss. Responding to America's Homeless: Public Policy Alternatives. New York: Praeger, 1986.
Leslie Neilan

See also Africa and the United States; Comedians; Homelessness; Live Aid; Poverty; USA for Africa; Williams, Robin.
■ Comic strips
Definition Sequential narrative cartoon drawings, often published in newspapers or other periodicals

Comic strips in the 1980's continued their postwar decline, as newspapers allotted less space for comics. Fewer strips were published, and those strips that were published had to shrink to fit in the space allotted to them. National comic strips were still able to attract considerable attention, however, and some comic strip creators became minor celebrities.

Comic strips declined in popularity somewhat during the 1980's, but the "funnies page" remained the second-most-viewed page in any given newspaper, after the front page. Comic strips of the 1980's ranged in tone and content from political satire to fantasy to realism. Some generated merchandising opportunities, while others proved controversial.

Political Comic Strips
Political Comic Strips During the conservative-dominated Ronald Reagan era, many political comic strips remained staunchly liberal. Garry Trudeau's Doonesbury was the nation's premier political strip.
Although it lost some of its countercultural edge, shunning the drug humor that had been a mainstay earlier in its run, Doonesbury was still capable of generating controversy. In 1980, during the presidential election campaign, the strip ran a week-long attack on Republican candidate Reagan's intelligence, entitled "The Mysterious World of Reagan's Brain." More than two dozen newspapers refused to publish part or all of the series, claiming that such pointed political commentary had no place in the comics section (as opposed to the political cartoon section of the editorial page). Indeed, some newspapers chose to move Doonesbury to their editorial pages or otherwise to segregate it from their main comics pages. At the other end of the decade, in a sequence beginning in 1989, the character Andy Lippincott, among the earliest gay male characters in mainstream comics, fell ill with AIDS. (He would die in 1990.)

Trudeau set a precedent for star cartoonists when he took a lengthy vacation, or "hiatus," in 1983 and 1984. Trudeau was not the first strip creator to take a vacation, but he did not employ the techniques of earlier cartoonists to fill the space occupied by his strip during his absence. He neither stockpiled strips in advance to run during the hiatus nor hired a ghostwriter to fill in anonymously for him. Instead, his departure was openly acknowledged—earning the scorn of older creators, such as Charles Schulz of Peanuts—and his syndicate printed "reruns" of previously published Doonesbury strips in place of new material. On his return, Trudeau finally took his core cast—the Everyman Michael Doonesbury, the activist Mark Slackmeyer, the football jock B.D., and others—out of college and into the working world.

For a successful strip, Doonesbury attracted surprisingly few imitators. One of the few was Berkeley Breathed's Bloom County, which ran from late 1980 to 1989. Bloom County mixed Doonesbury's political satire and liberal advocacy with child and animal characters reminiscent of humor strips such as Pogo and Peanuts. Breathed followed Trudeau in 1987 as the second strip cartoonist to win a Pulitzer Prize for editorial cartooning.
Apolitical Fantasy Comic Strips Despite the success of Doonesbury and Bloom County, most comic strips stayed well away from politics. Bill Watterson's Calvin and Hobbes, which ran from late 1985 to the last day of 1995, focused on the adventures of a six-year-old boy named Calvin and his stuffed tiger Hobbes. Calvin and Hobbes contrasted a child's imagination—Hobbes is alive when perceived by Calvin, a motionless toy when perceived by others—with mundane reality in a way reminiscent of such earlier twentieth-century works as Winsor McCay's Little Nemo in Slumberland. Watterson combined stunningly creative visual artistry with an uncanny ability to capture in a few well-chosen words and images the essence of childhood logic, winning wide popularity and acclaim. He also attracted attention for his stubborn refusal to license his characters to appear as toys or in other media, despite enormous pressure from his syndicate.

Watterson's refusal to commercialize contrasted starkly with one of the pop culture phenomena of the 1980's, Jim Davis's Garfield, a strip centered on an overweight cat and his dim-witted owner and designed to offend as few people as possible, even at the price of blandness. Garfield, which began its run in 1978, became a marketing juggernaut, as paperback strip compilations bestrode the best-seller lists and the image of the fat orange cat became ubiquitous; it was even parodied in the form of Bloom County's defiantly unlovable Bill the Cat. Garfield was also the subject of several animated television specials, as well as a Saturday-morning cartoon. It was the springboard for Davis to launch another, far less successful strip, U.S. Acres, which ran from 1986 to 1989.

Another of the most successful and distinctive strips of the 1980's was the single-panel comic The Far Side by Gary Larson. Featuring a decidedly idiosyncratic perspective that embraced absurdism and grotesquery in equal measure, the strip often adopted the viewpoints of animals at humans' expense, as in a panel in which polar bears attack an igloo and one exclaims, "I just love these things! Crunchy on the outside and a chewy center." Larson, too, registered his disapproval of Garfield when he drew a panel featuring a python with a large lump in its stomach—lying directly behind Garfield's water dish.

Realistic Comic Strips More realistic strips competed with those featuring imaginative children and talking animals. Continuity-heavy "soap opera" strips such as Rex Morgan, M.D. and humor stalwarts such as Blondie and Beetle Bailey continued to run as they had for decades, while attracting little attention outside their fan bases. One of the most popular strips of the 1980's, Cathy, by Cathy Guisewite, had debuted in 1976 as a demographically targeted strip originally based on Guisewite's own life. Nearly every Cathy strip revolved around the main character, a single professional woman, and her ordeals involving weight, career, and boyfriends.

Joining Guisewite as another of the few successful women on the comics page was Canadian Lynn Johnston, creator of For Better or For Worse. For Better or For Worse, with a large cast of characters revolving around Elly and John Patterson and their children, first appeared in 1979. It was particularly notable for characters that aged in real time and for making fewer concessions to the American audience than did most foreign strips. Like Guisewite, Johnston drew on her own life history and family for her material. In 1985, she became the first woman and the first Canadian to win the Reuben Award for Outstanding Cartoonist of the Year.

The late 1980's also saw a surge in comic strips devoted to African Americans, traditionally underrepresented on the comics pages of mainstream newspapers. Ray Billingsley's Curtis made its first appearance in 1988, to be followed the next year by Stephen Bentley's Herb and Jamal and in 1990 by Robb Armstrong's JumpStart.
Weekly Comics Most newspaper comics appeared daily, with a larger strip on Sundays. Weekly periodicals such as New York's Village Voice provided an outlet for comic strips whose subject matter was too controversial or whose formats were too experimental for daily newspapers and the dominant syndicates. Comics appearing in the alternative media included Alison Bechdel's lesbian epic Dykes to Watch Out For, which began in 1983, and Ben Katchor's surreal Julius Knipl, Real Estate Photographer, which began in 1986. In 1983, film director David Lynch began publishing one of the oddest of the alternative strips, The Angriest Dog in the World. The strip featured exactly the same images every week; only the speech bubbles changed. The alternative market provided greater creative control than daily newspaper syndication but little money.
Impact Comic strips were a widely circulated cultural product during the 1980’s, providing a common topic of conversation that ranged across ethnic and demographic groups. Although the comics page was slow to change, it reflected such 1980’s developments as the mainstreaming of the counterculture and the increased cultural presence of working women, gay people, and African Americans.
Further Reading
Nordling, Lee. Your Career in the Comics. Kansas City, Mo.: Andrews and McMeel, 1995. An exhaustive study of the newspaper comics business, geared for the aspiring cartoonist.
Walker, Brian. The Comics: Since 1945. New York: Harry N. Abrams, 2002. A standard history of the medium since World War II that includes many reproductions. The author is both a comics scholar and a comic strip creator himself, part of the team that produces Beetle Bailey and Hi and Lois.
Watterson, Bill. Calvin and Hobbes: Sunday Pages, 1985-1995. Kansas City, Mo.: Andrews McMeel, 2001. This catalog from an exhibit at the Ohio State University Cartoon Research Library contains reflections by Watterson on his inspirations and the process and business of making newspaper comic strips.
William E. Burns
See also
Bloom County; Journalism.
■ Compact discs (CDs)
Definition Digitally encoded, laser-read discs for storing music and information
Manufacturer Sony and Philips
Date Introduced in 1982

CDs replaced vinyl records as the primary medium of music storage and distribution to consumers. Their popularity helped revitalize the recording industry, and their versatility as a storage medium resulted in the discs being adapted to store computer data and applications as well as music.

When the first commercially available compact discs (CDs) arrived on record stores' shelves in 1982, the music industry was experiencing one of its increasingly frequent sales slumps. While there was always a handful of million-selling albums or singles, the industry as a whole seemed to be stagnating. Simultaneous, for instance, with the compact disc's debut was Warner Bros. Records' headline-making overhaul of its artist roster, in which the company dropped Arlo Guthrie, Van Morrison, and other highly regarded but relatively low-selling performers in an effort to increase its financial viability. MTV, which would soon revolutionize and revitalize the industry, had only just been launched, and neither hip-hop nor any other new musical genre had emerged to capture the public imagination the way disco had in the 1970's. CDs altered this situation: They were marketed as being so superior to records in audio quality that they made audiophiles dream of the potential impeccability of digital audio recordings, and of purchasing such recordings to replace their imperfect analog vinyl records.

Slow Initial Adoption Following as they did the failure of other similarly heralded formats to catch on with the public, CDs were initially greeted with skepticism. They could, after all, be played only on then-expensive CD players, and consumers who had previously invested in the audio-visual equipment necessary to play quadraphonic vinyl records, eight-track cassettes, or laserdisc films were understandably hesitant to take another financial plunge into a pool that might soon dry up. Compounding their caution were the initially limited number of titles available in the CD format and the even more limited number of fully digital recordings. The industry invented a code, placed on each CD's jewel case, to indicate whether the recording was completely digital (DDD), digitally remastered from an analog original (ADD), or simply a digitized version of an analog recording (AAD). Music aficionados were well aware that an AAD recording on a CD would be of no higher quality than a vinyl recording and might well be worse.

Classical music was the genre whose enthusiasts provided the initial impetus behind the push for CD technology, in part because the ideal live classical experience is wholly acoustic, whereas most rock performances are expected to be amplified and otherwise filtered through electronic systems; fidelity in tone is thus of more concern to the average classical fan than to the average rock fan. One legend even ascribes the CD's original seventy-four-minute length to Sony vice-president Norio Ohga's desire for the format to accommodate Beethoven's Ninth Symphony. As a result, classical titles outnumbered the titles available in rock or other popular styles, and classical labels commissioned new, fully digital recordings of the most famous works.
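The disc's storage requirements follow directly from the format's sampling parameters (44,100 sixteen-bit samples per second in each of two stereo channels). The short calculation below is an illustrative sketch of that arithmetic, not a figure taken from period sources:

```python
# Raw data rate and capacity implied by the compact disc audio format:
# 44,100 samples per second, 16 bits (2 bytes) per sample, 2 channels.
SAMPLE_RATE = 44_100
BYTES_PER_SAMPLE = 2
CHANNELS = 2
MINUTES = 74

bytes_per_second = SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS  # 176,400
total_bytes = bytes_per_second * 60 * MINUTES                 # 783,216,000

print(f"{bytes_per_second:,} bytes per second of raw audio")
print(f"about {total_bytes / 1_000_000:.0f} million bytes in {MINUTES} minutes")
```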
Controversy over Quality In the absence of firsthand consumer experience with CDs, rumors began to circulate that exaggerated the format's virtues.
During the 1980’s, compact discs became the primary means of distributing recorded music. (PhotoDisc)
Chief among these virtues was the compact disc's alleged imperviousness to physical damage. Consumers were erroneously assured that no amount of surface scratching would impair playback and that a compact disc would play perfectly even if smeared with peanut butter. Rather than counter such assertions, advocates of vinyl argued that the best analog recording and playback equipment yielded fuller, "warmer" sound than digital recording was capable of achieving. (Because digital recording translates sound into a binary language rather than simply capturing it, all such recordings necessarily filter or modify some aspect or portion of the original sound, whereas analog recording is capable in principle of capturing a complete, unaltered sound.) Vinyl's advocates asserted that the putative superiority of that medium made such flaws as surface noise and occasional skipping worth enduring.

Eventually, however, neither the susceptibility of compact discs to scratching nor the potential audio superiority of vinyl mattered. The equipment necessary even to approach the level of sound fidelity vinyl's advocates claimed possible was prohibitively expensive. Moreover, the average listener lacked the aural training to detect the imperfections inherent to digital recording, whereas the flaws inherent to records were much easier to notice. CDs, meanwhile, were not impervious, but they were less susceptible to damage than were records, which was all that mattered to a consumer choosing between the two media.
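The dispute between the two camps turned on the sampling step itself, in which a continuous waveform is measured at discrete instants and each measurement is rounded to one of 65,536 possible sixteen-bit values. The sketch below, using an arbitrary 440 Hz test tone, is a simplified illustration of that conversion:

```python
# A minimal sketch of analog-to-digital conversion: a pure tone is
# sampled at discrete times, and each sample is rounded ("quantized")
# to the nearest 16-bit level. The rounding is the unavoidable
# modification of the signal that vinyl's advocates pointed to.
import math

SAMPLE_RATE = 44_100   # CD-quality sampling rate, in samples per second
FREQ = 440.0           # illustrative test tone (A above middle C), in Hz
MAX_16BIT = 32_767     # largest positive 16-bit sample value

def quantize(t: float) -> int:
    """Sample the tone at time t and round to the nearest 16-bit level."""
    analog = math.sin(2 * math.pi * FREQ * t)  # idealized analog signal
    return round(analog * MAX_16BIT)           # discrete digital value

# The first few samples of the digitized tone:
for n in range(5):
    t = n / SAMPLE_RATE
    analog = math.sin(2 * math.pi * FREQ * t)
    print(f"t={t:.6f}s  analog={analog:+.6f}  stored={quantize(t):+d}")
```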
CDs Gain Popularity As CDs became more popular and more titles were issued in CD format, discs began to take up more shelf space in music stores. Of necessity, the amount of space available for records and audiocassettes decreased. By the end of the 1980's, references to the "death of vinyl" had become common, with smaller stores phasing out vinyl altogether. For a time, cassettes, which had surpassed vinyl in sales before the rise of CDs, became the highest-selling prerecorded music medium. Unlike vinyl, however, the cassette yielded noticeably inferior sound, a deficiency that, when coupled with the cassette's fragility and short lifespan, made it ripe for displacement as well. It became clear that CDs were preferred over vinyl by the majority of consumers and that the new medium was not destined to be a passing fad. Many owners of large record collections began to sell their vinyl albums to used-record stores and to replace them with new digital editions.

Record companies, meanwhile, capitalized on this trend in various ways. They released digitized analog recordings of albums right away, then released digitally remastered versions of the same album a few years later, often with "bonus" material or previously unreleased tracks. Thus, fans often purchased multiple versions of the same album. The companies also exploited the size of the new format: Realizing that large amounts of music could be condensed into a relatively small space, they began to release comprehensive boxed sets of multiple CDs. Such compilations rejuvenated interest in older artists (Bob Dylan, the Rolling Stones, David Bowie) and cemented the popularity of more youthful ones. Bruce Springsteen's three-disc box Live/1975-85 spent seven weeks at the top of Billboard's album chart, despite its relatively high price. This trend would gather momentum in the 1990's, eventually leading to the availability of exhaustive boxed sets of practically every major performer in every conceivable genre.
Impact The increased storage capacity and arguably superior audio reproduction of compact discs inspired musicians of every genre to explore the possibilities unique to the digital age, while the discs’ durability, portability, and relative affordability made music more accessible and user-friendly than ever before. In later decades, the rise of digital audio technology would be as important to producers as to consumers, as the low cost of creating recordings with personal computers and distributing them on CDs would transform the nature of the recording industry. Further Reading
Baert, Luc, Luc Theunissen, and Guido Vergult, eds. Digital Audio and Compact Disc Technology. Burlington, Mass.: Butterworth-Heinemann, 1995. Detailed technical introduction to the principles of digital audio technology from the Sony Service Center in Europe.
Coleman, Mark. Playback. Cambridge, Mass.: Da Capo, 2005. Ambitious attempt not only to trace the history of recorded music from Thomas Edison's cylinders to Internet-friendly MP3 files but also to do so by explaining the influence of recorded music on society and vice versa.
Evens, Aden. Sound Ideas: Music, Machines, and Experience. Minneapolis: University of Minnesota Press, 2005. A scholarly and technical examination of the artistic implications of the compact disc's replacement of vinyl as the predominant medium of both making and experiencing recorded music.
Gronow, Pekka, and Ilpo Saunio. An International History of the Recording Industry. Translated by Christopher Moseley. New York: Continuum International, 1999. Traces the development of the recording industry, focusing on its key innovations and its most influential artists and record companies.
Maes, Jan, and Marc Vercammen, eds. Digital Audio Technology: A Guide to CD, MiniDisc, SACD, DVD(A), MP3, and DAT. Burlington, Mass.: Elsevier Science and Technology, 2001. A technical and historical examination of the state of digital audio technology written by two of Sony Europe's technical-support experts.
Arsenio Orteza
See also
Classical music; Computers; Consumerism; Music; Science and technology.
■ Computers Definition
Electronic devices that receive, process, store, and output data based on programmed instructions
In the 1980's, microcomputers ceased to be mere hobbyists' toys and became a significant part of personal, business, and scientific computing. Rapid advances in small computer hardware and software resulted in the development of numerous high-quality microcomputers. In particular, the introduction of the IBM PC in 1981 made the microcomputer acceptable for business applications, and much of the success of the microcomputer in the 1980's can be attributed to the success of the IBM PC and its clones.

The desire to solve computationally intensive problems led to many important developments in supercomputer hardware in the 1980's. Cray, Control Data Corporation, Intel, and Thinking Machines Corporation all produced new supercomputers during this period, making significant improvements in central processing units (CPUs), memory, disk drives, and buses. While supercomputer software development was not as impressive as hardware development, software developers made advances in multiprocessor operating systems and were able to optimize compilers. Many of the advances made first in supercomputers in the 1980's, such as vector-processing CPUs, would appear in mainframes and microcomputers in the 1990's.

The development of mainframe hardware and software continued at a steady pace during the 1980's, but for the most part it did not keep up with the advances in microcomputers and supercomputers. One exception was the development of relational database software. Oracle 2 (there never was an Oracle 1) was released in 1979, and DB2 was released by International Business Machines (IBM) in 1983. Both products steadily increased their share of the database market. The importance of managing data became a central theme of computing in the 1980's, setting the stage for the modern concept of a data warehouse.

In the 1980's, both local and wide area networking made great strides. The advances in Transmission Control Protocol/Internet Protocol (TCP/IP) in the 1980's set the stage for the explosion of the Internet in the 1990's. The 1980's also marked significant advances in the development of operating systems, including DOS, UNIX, and Windows. Interestingly, although little attention had been paid to computer security before the 1980's, the first major online attacks appeared during this time, resulting in initial efforts to render computers secure and protect users' privacy.

Microcomputers The first microcomputer, the MITS Altair, was sold in 1975. Many hobbyists used these computers in the late 1970's, and both hobbyists and home users purchased Apple Computer's Apple II after its release in 1977. In the 1980's, many new and innovative hardware products were released. Disk storage devices such as the 3.5-inch floppy disk drive, hard disk drive, and CD-ROM drive were introduced and improved during the 1980's. Hayes introduced its standard-setting Smartmodem for microcomputers in 1981, and the first commercially successful desktop laser printer was sold in 1984 by Hewlett-Packard (HP). Although first introduced in 1963, the mouse was popularized in the mid-1980's. Intel introduced the increasingly powerful 80286, 80386, and 80486 chips. These chips and similar high-performance CPU and memory chips manufactured by other companies allowed the computer industry to produce many inexpensive microcomputers.

Some of the individual microcomputers first introduced in the 1980's include the IBM PC in 1981, IBM PC clones in 1982, the IBM XT in 1983, and the IBM AT in 1984. The first portable computer, the Osborne I, was introduced in 1981. The Apple Macintosh was released in 1984, and Dell Computer Corporation was founded in the same year. Growth in the personal computer industry was explosive in the 1980's, going from 300,000 units sold in 1981 to more than 3 million units sold in 1982. Despite this growth, not all microcomputers were successful. The Apple III was released in 1980 and had no market penetration. The Apple Lisa, released in 1983, was very expensive and also had poor sales. The IBM PS/2, released in 1987, stressed a proprietary architecture that was not widely adopted.

Software development for microcomputers in the 1980's was an equally explosive field. The DOS microcomputer operating system, introduced by Microsoft in 1981, was simple but powerful. Several windowed operating systems, with easy-to-use graphical user interfaces (GUIs), were developed in the 1980's, based on the GUI developed at the Xerox Palo Alto Research Center (Xerox PARC) in the 1970's.
One of the first serious business applications for microcomputers was the spreadsheet program, the first of which was VisiCalc, released in 1979. Microsoft marketed the superior spreadsheet MultiPlan in 1982, which in turn was greatly improved upon by Lotus Development Corporation's Lotus 1-2-3 in 1983. Microsoft's successor to MultiPlan, Excel, debuted on the Macintosh in 1985 and came to Windows in 1987. Brian Reid created the Scribe document preparation system in 1980; the most popular presentation system, PowerPoint, was introduced in 1987 by Bob Gaskins; and a well-liked communications program, ProComm, was created in 1986 by Datastorm Technologies. Microsoft Office for the Mac appeared in 1989, and a version for Windows appeared a year later. Many graphics programs were developed for the Mac, including the Aldus PageMaker desktop publishing system in 1985, and the Adobe PostScript page-description language was created in 1984.

Supercomputers After introducing the Cray-1 in 1976, Cray Research developed the Cray X-MP in 1982 by interconnecting multiple Cray-1-class processors with a high-speed switch. While the X-MP was very fast, it was the Cray-2, released in 1985, that held most of the performance records for a supercomputer until the 1990's. Seymour Cray started work on his ill-fated gallium-arsenide-based Cray-3 in the late 1980's. Only one Cray-3 was produced, and while it was a powerful supercomputer, it could not match the price-to-performance ratios of parallel processors like the Connection Machine and the Hypercube. In 1983, Danny Hillis created the Connection Machine, a computer with thousands of simple processors, and in 1985 Intel manufactured its iPSC Hypercube, a computer with hundreds of standard microprocessors connected with a high-speed backplane.
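The speed of these machines rested on data parallelism, that is, one operation applied to many data elements at once, whether through the Cray machines' vector pipelines or the Connection Machine's thousands of processors. The sketch below contrasts the scalar and vector styles, using NumPy array arithmetic as a loose modern stand-in for vector hardware; the arrays are arbitrary example data, not benchmarks:

```python
# Scalar versus vector (data-parallel) style, illustrated with the
# classic SAXPY operation (alpha * x + y over whole arrays).
import numpy as np

a = np.linspace(0.0, 1.0, 1_000_000)
b = np.linspace(1.0, 2.0, 1_000_000)

def saxpy_scalar(alpha, x, y):
    """Scalar style: one multiply-add per loop iteration."""
    result = np.empty_like(x)
    for i in range(len(x)):
        result[i] = alpha * x[i] + y[i]
    return result

def saxpy_vector(alpha, x, y):
    """Vector style: one expression applied to entire arrays at once,
    the idiom vector machines were built to exploit."""
    return alpha * x + y

# Both styles compute the same result; only the execution model differs.
assert np.allclose(saxpy_scalar(2.0, a[:10], b[:10]),
                   saxpy_vector(2.0, a[:10], b[:10]))
```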
Networking and the Internet Three major network protocols were developed during the 1980's. IBM's Systems Network Architecture (SNA) was developed from 1974 to 1989. One implementation of the protocol, Virtual Telecommunications Access Method (VTAM), was used to tie IBM mainframes together. TCP/IP was defined by 1975. It was tested and developed over the next eight years, and on January 1, 1983, it became the principal protocol of ARPANET, the Department of Defense (DoD) network that eventually grew into the Internet. TCP/IP's popularity was confirmed in 1986, when the fledgling National Science Foundation Network (NSFNet) adopted the protocol as well. DECnet, developed between 1975 and 1992, offered an alternative to TCP/IP. During the 1980's, adoption of these and other networking standards made interconnecting computers from different vendors a reality.

Local area networks (LANs) were defined with the IEEE 802 standard, created by the Institute of Electrical and Electronics Engineers in February, 1980. The Open Systems Interconnection (OSI) Reference Model was introduced in Europe and later adopted in the United States as the best general model for wide area networking. In 1985, the Internet Architecture Board held workshops on TCP/IP for the computer industry that greatly increased the use of this protocol. The support of the DoD and the Internet Architecture Board made TCP/IP a de facto network standard.
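The socket interface that grew up around TCP/IP remains the way programs on different machines exchange data. The following sketch is a present-day Python illustration of a simple TCP client/server exchange, not period software; the address and messages are arbitrary examples:

```python
# A minimal TCP/IP exchange: a server thread accepts one connection
# and echoes a greeting back to the client.
import socket
import threading

HOST, PORT = "127.0.0.1", 9090  # arbitrary local example address

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, PORT))
server.listen(1)  # listening before any client tries to connect

def serve_once() -> None:
    """Accept a single connection and reply over the same connection."""
    conn, _addr = server.accept()
    with conn:
        data = conn.recv(1024)           # bytes sent by the client
        conn.sendall(b"hello, " + data)  # greeting sent back

worker = threading.Thread(target=serve_once)
worker.start()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect((HOST, PORT))
    client.sendall(b"NSFNet")
    print(client.recv(1024).decode())    # prints: hello, NSFNet

worker.join()
server.close()
```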
In 1983, rather than name a Man of the Year, Time magazine named the computer the Machine of the Year, recognizing its effects upon all aspects of 1980’s culture. (Courtesy, Time, Inc.)
Connecting the computers of the world together with TCP/IP provided the backbone that created what came to be called the Internet. Tim Berners-Lee created a hypertext language to support the sharing of textual information among researchers at CERN in 1980. In 1989, he decided to connect hypertext to the Internet, thereby creating the World Wide Web.

Operating Systems In addition to the advances in microcomputer operating systems, highlighted by the development of DOS by Microsoft, there were advances in UNIX and the several types of windowed GUI operating systems in the 1980's. The UNIX operating system was first developed in 1969 at American Telephone and Telegraph (AT&T). Bill Joy, one of the cofounders of Sun Microsystems, helped develop an upgrade to the original AT&T UNIX, Berkeley Software Distribution (BSD) UNIX, while working at the University of California at Berkeley. Joy also led the development of SunOS in 1982, Sun's operating system until the 1990's. Work began on the Mach kernel at Carnegie Mellon University in 1985, and a prototype was released in 1987. It influenced later versions of UNIX, as well as Windows NT. In 1988, the Open Software Foundation began development of OSF UNIX, which appeared as a commercial UNIX through a number of vendors in the 1990's.

At the start of the 1980's, computers had limited GUIs. The Xerox Alto, a research machine developed at Xerox PARC during the 1970's, had pioneered the windowed operating system with a friendly GUI, and in 1981 the Xerox Star became the first commercial workstation with a windowed operating system. In 1984, MIT released its X Window System GUI for UNIX, and while it was not very successful commercially, it influenced many later systems. Apple introduced the ill-fated Lisa, a microcomputer with much of the functionality of the Star, in 1983, but it was unsuccessful because of its high cost. Apple then introduced the Macintosh in 1984, and Microsoft helped make it a success with the development of some innovative applications like Word. Microsoft announced the first version of its Windows operating system in 1983 and released Windows 1 in 1985. Windows 1 was limited to a set of fixed, tiled windows that could not overlap; Windows 2 added overlapping windows and other features in 1987.
Databases While database software existed from the earliest days of computers, the 1980's marked a period of exceptional growth for such software. IBM introduced its hierarchical Information Management System (IMS) in 1968. During the 1980's, this was the most important mainframe database management system in use. The dBASE nonrelational database was developed during the early 1980's and became the most popular microcomputer database of the period.

Edgar Codd had defined the relational technique for database management in 1970 while at IBM, but it did not receive much attention until the early 1980's. During the 1980's, several significant relational database management systems appeared, including IBM's DB2 (1983) and Oracle's V3 (1982). RBASE (1982) was the first relational database program created for microcomputers. Microsoft began work on its own desktop relational database management system in the late 1980's, and after a long development period it released Access in 1992. The 1980's also marked the beginnings of object-oriented databases, with several research prototypes being developed. By the end of the decade, it was apparent to many that a data warehouse was necessary to manage all the spreadsheet, database, and image data being generated by the government, scientists, and industry.
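In Codd's relational model, data live in tables and are retrieved declaratively: a query names the rows wanted, and the engine decides how to find them. The sketch below illustrates the idea with Python's built-in SQLite engine, a much later embedded relational database; the table and rows are invented examples:

```python
# A relational table queried declaratively with SQL.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE employee (name TEXT, dept TEXT, hired INTEGER)")
con.executemany(
    "INSERT INTO employee VALUES (?, ?, ?)",
    [("Ada", "Research", 1984),
     ("Grace", "Systems", 1981),
     ("Edgar", "Research", 1987)],
)

# The query states *what* rows are wanted, not *how* to locate them.
for name, hired in con.execute(
    "SELECT name, hired FROM employee WHERE dept = 'Research' ORDER BY hired"
):
    print(name, hired)   # prints: Ada 1984, then Edgar 1987
```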
Computer Security Security of the first computers was provided by a combination of physical security and password authentication. This method worked reasonably well until the 1980's, when the introduction of microcomputers and networking greatly increased the number and types of possible attacks. The first microcomputer virus, Elk Cloner, attacked some Apple IIs in 1982. The first IBM PC virus was a boot sector virus called Brain, created in 1986 in Pakistan by the Alvi brothers. In 1987, Fred Cohen defined the term "computer virus" and wrote a paper about antivirus software. More types of computer attacks appeared in the 1980's, and they led to the first antivirus software company, Certus, founded by Peter Tippett in 1991 and later sold to Symantec.
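One early defensive technique was integrity checking: record a checksum of every file believed to be clean, then flag any file whose checksum later changes, since a virus must alter a file to infect it. The sketch below illustrates the idea with a modern hash function chosen for convenience; period tools used far simpler checksums:

```python
# Integrity checking in miniature: snapshot checksums of clean files,
# then report any file whose contents no longer match the snapshot.
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """Return a hex digest of the file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def baseline(paths: list[Path]) -> dict[Path, str]:
    """Record checksums of files believed to be clean."""
    return {p: checksum(p) for p in paths}

def changed_files(snapshot: dict[Path, str]) -> list[Path]:
    """List files whose contents have changed since the baseline."""
    return [p for p, digest in snapshot.items() if checksum(p) != digest]
```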
Impact The advances in computer networking in the 1980's laid the foundation for the Internet and the World Wide Web in the 1990's. The advances in microcomputer and supercomputer hardware in the 1980's led to an increase in low-cost computer hardware and the inexpensive personal, business, and scientific computers in use today. The development of easy-to-use windowed operating systems in the 1980's laid the foundation for the development of Windows 3.1 and Office by Microsoft in 1992, significantly affecting the way individuals interact with computers. The release of several relational databases led to an improved way of handling data, and further research done in the 1980's marked the beginnings of the concept of a data warehouse.

Further Reading
Campbell-Kelly, Martin, and William Aspray. Computer: A History of the Information Machine. New York: Basic Books, 1996. Short but engrossing history of computers.
Hiltzik, Michael. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: HarperCollins, 1999. Thorough coverage of the developments at the Xerox PARC labs.
Ralston, Anthony, Edwin D. Reilly, and David Hemmendinger, eds. Encyclopedia of Computer Science. 4th ed. Hoboken, N.J.: Wiley, 2003. One of the standard reference works in its field. Very accurate articles cover all areas related to computers, including many aspects of computer history in the 1980's.
Rojas, Raul, ed. Encyclopedia of Computers and Computer History. Chicago: Fitzroy Dearborn, 2001. Contains more than six hundred articles about computers, including those made during the 1980's, from scholars in computer science and the history of science.
Wurster, Christian. The Computer: An Illustrated History. Los Angeles: Taschen America, 2002. A history of computers, interfaces, and computer design that includes pictures of nearly every computer ever made.
George M. Whitson III
See also
Apple Computer; Information age; Microsoft.
■ Conch Republic Definition
Fictional country created as a publicity stunt
Place Key West, Florida

On April 23, 1982, the city council of Key West, Florida, staged a fictional secession from the United States and created the Conch Republic. While this was mostly just a stunt, it built tourism for the area and helped gain a measure of governmental acknowledgment for an area of the country that was largely ignored.
In 1982, the U.S. Border Patrol was looking for ways to control the flow of illegal drugs and illegal immigrants into the United States through Florida. It decided to set up roadblocks on the main road to and from Key West, Florida, and to search every car that went through. This strategy had an unintended consequence: It interrupted the flow of tourists to the island city. Jets began to fly tourists to Key West as a result of the roadblocks, but the roadblocks themselves still represented a problem for the city. The city council of Key West asked the Border Patrol to remove the roadblocks, petitioned the federal government, and sued in court to have them removed. None of these tactics succeeded.

Finally, the council, in conjunction with Mayor Dennis Wardlow, declared the city's independence. The new "nation" took its name from a common term for the inhabitants of Key West, Conchs, and the Conch Republic was born. Wardlow became the prime minister of the republic. He adopted a strategy that was a variation on the plot of Leonard Wibberley's novel The Mouse That Roared (1955): He declared war on the United States, immediately surrendered, then applied for foreign aid from the U.S. government as a conquered enemy nation.

As could be expected, these actions created a great deal of publicity for Key West. There was a great outpouring of support for the city's people, and this support caused the government to rethink its roadblocks. They were soon removed, giving Key West what it had originally sought.

Impact The staged secession of Key West brought the island a great deal of free publicity, increasing the popularity of the area as a tourist destination. It also caused the government to pay closer attention to the effects of its actions on Key West's inhabitants. Thus, despite its apparent absurdity, the stratagem proved successful.

Further Reading
Gannon, Michael. Florida: A Short History. Gainesville: University Press of Florida, 1993.
King, Gregory. The Conch That Roared. Lexington, Ky.: Weston & Wright, 1997.
Michael S. Frawley
See also
Immigration to the United States; Slang and slogans.
■ Confederacy of Dunces, A
Identification Prize-winning novel
Author John Kennedy Toole (1939-1969)
Date Published in 1980
The reading public's imagination was captured as much by the tragic story behind A Confederacy of Dunces as it was by the novel itself. The New Orleans-set story was published in 1980, eleven years after its author—despairing of its publication—committed suicide.

A Confederacy of Dunces was written in the 1960's by John Kennedy Toole. The book was originally accepted by the New York publisher Simon & Schuster, but Toole's editor insisted on changes he refused to make; subsequently, the company dropped the book. Toole's suicide in 1969 was largely considered to be a result of his plunge into a deep depression after the rejection of a novel he was convinced was his masterpiece.

The novel was noted for its protagonist, the roistering Ignatius J. Reilly, an intellectual of massive appetites and numerous eccentricities and phobias who was to some extent autobiographical. Somewhat sheltered by his mother and required to seek employment only at the age of thirty, Reilly finds work at a pants factory and operating a hot dog cart, but in the main the narrative tracks him as he is drawn into a series of wild and fantastic adventures featuring various idiosyncratic characters and situations in the lively French Quarter of New Orleans. While maintaining an intellectual disdain for modernity and purporting to adopt a renunciatory medieval philosophy as his guide, Reilly paradoxically revels in the anarchic and diverse world of New Orleans in the 1960's, participating in that city's freewheeling counterculture.

When Toole's tenacious mother, Thelma, pressed the novelist Walker Percy to read her son's manuscript, he became its enthusiastic champion, writing an appreciative foreword to the 1980 edition. On the basis of the story behind the novel and the novel's own colorful, inventive depiction of the denizens of New Orleans in the 1960's, A Confederacy of Dunces quickly became both a literary and a commercial success. Ahead of its time in the 1960's but in tune with the baby-boom generation that began to exert its cultural power in the 1980's, Toole's magnum opus was posthumously awarded the Pulitzer Prize in fiction in 1981.
Impact The publisher that initially rejected A Confederacy of Dunces was widely perceived in 1980 to have been close-minded and retrograde. As a result, the novel became a cultural cause célèbre, and the tragedy of the author's suicide became a literary legend, even to the point of overshadowing the novel itself. Despite this competition from the story of the author's life, the novel's appealing depiction of a carnival of New Orleans outsiders, eccentrics, and sexual renegades procured it a place in the canon of Southern literature. New Orleans especially took the novel to its heart: A bronze statue of Ignatius J. Reilly was placed under the clock at the Chateau Sonesta Hotel on Canal Street, New Orleans.
Fletcher, Joel L. Ken and Thelma: The Story of "A Confederacy of Dunces." Gretna, La.: Pelican, 2005.
Nevils, Rene Pol, and Deborah George Handy. Ignatius Rising: The Life of John Kennedy Toole. Baton Rouge: Louisiana State University Press, 2001.
Margaret Boe Birns
See also
Book publishing; Literature in the United States.
■ Congress, U.S. Definition
The bicameral legislative branch of the U.S. government consisting of the House of Representatives and the Senate
During most of the 1980's, the two houses of Congress were controlled by different political parties. While the House of Representatives maintained a Democratic majority throughout the 1980's, the Senate had a Republican majority from 1981 until 1987. Divided party control of Congress during most of the decade and the resumption of Democratic control of the Senate in 1987 were significant influences on both legislative activity and executive-legislative relations. Such major issues as tax cuts, defense spending increases, deficit reduction, nuclear arms control, judicial appointments, and the Iran-Contra affair were dominated by partisan politics. By the late 1980's, the Democratic Congresses and Republican presidents had become nearly incapable of reaching effective compromises or cooperating in major policy decisions, especially concerning the rapidly growing federal deficit. This situation gave rise to political gridlock, as the government found it extremely difficult to accomplish anything at all.
Reagan's First-Term Budget and Tax Cuts In the 1980 presidential election, Republican presidential nominee Ronald Reagan easily defeated Democratic incumbent Jimmy Carter. Riding on Reagan's coattails in the national elections, the Republicans gained thirty-four House seats and twelve Senate seats. Consequently, the Democratic majority in the House of Representatives was reduced, divided, and weakened, while the Republicans won control of the Senate for the first time since 1952. Democratic representative Tip O'Neill of Massachusetts remained Speaker of the House, and Republican senator Howard Baker of Tennessee became the majority leader.

During its 1981-1982 session, Congress, especially the House, was primarily concerned with Reagan's first budget. It emphasized major reductions in taxes, domestic spending, and economic regulation, as well as increased defense spending. Despite their losses in the 1980 elections, the Democrats retained a comfortable majority in the House of Representatives. Furthermore, Tip O'Neill was a liberal northern Democrat determined to protect his party's domestic-policy legacy rooted in Franklin D. Roosevelt's New Deal and Lyndon B. Johnson's Great Society. O'Neill was also skeptical of Reagan's assurance that a combination of tax and domestic spending cuts, deregulation, and higher defense spending would reduce inflation and unemployment, stimulate widespread prosperity, and eventually result in a balanced budget. Reagan's staff, however, circumvented O'Neill by lobbying and negotiating with moderate and conservative southern Democrats in the House, known as Boll Weevils. Realizing that Reagan had carried most of their districts by wide margins and was most popular among Southern white voters, the Boll Weevils cooperated and compromised with Reagan and House Republicans. The Boll Weevils agreed to vote for Reagan's tax cuts, defense spending increases, and some domestic spending cuts if Reagan protected agricultural programs from major cuts. Reagan agreed, and he likewise moderated other domestic spending cuts in order to secure the votes of moderate northeastern and midwestern Republicans in Congress, known as Gypsy Moths. In particular, Gypsy Moths opposed major cuts in environmental, cultural, and educational programs.

After Congress passed Reagan's compromise budget bill, its next major legislative task was the Economic Recovery Tax Act of 1981, commonly known as the Kemp-Roth Act. Cosponsored by two Republicans, Representative Jack Kemp of New York and Senator William Roth of Delaware, and supported by Reagan, this act sharply reduced both income tax rates and the number of tax brackets, while indexing tax brackets to inflation. Reagan signed the bill into law on August 13, 1981. The Kemp-Roth tax cut was a product of supply-side economics, the economic theory that influenced the Reagan administration and most Republicans in Congress. Advocates of supply-side economics claimed that major tax cuts would stimulate greater savings, investments, and economic growth, thereby benefiting most Americans and eventually yielding higher tax revenues that would reduce or eliminate budget deficits. By the late 1980's, critics of supply-side economics blamed its tax policies for the widening economic inequality among Americans, as well as the nation's increasing budget deficits. The national debt more than tripled over the course of the decade, as the government borrowed to cover its annual deficits and to service its existing debt, growing the debt further in the process. This vicious cycle, critics asserted, threatened future economic growth.

The Social Security Controversy President Reagan and the Republicans in Congress were largely successful in decreasing the nation's tax burden, but they were less successful in making substantial cuts in social welfare benefits (also known as entitlements), especially Social Security for retired and disabled Americans. Wanting state governments to assume more domestic responsibilities and rely less on federal aid, Reagan suggested that the federal government would assume complete responsibility to finance and regulate Medicaid for the poor if the states assumed full responsibility for the Aid to Families with Dependent Children (AFDC) and food stamp programs. Congress rejected this proposal. Reagan, however, achieved his objectives of gradually eliminating the General Revenue Sharing (GRS) program, a major federal aid program for cities and states, and sharply reducing federal spending on public housing and urban renewal.
The Reagan administration and Republicans in Congress experienced a severe legislative defeat in 1981 and suffered long-term political damage when they attempted to reduce Social Security benefits for future retirees and the disabled. Led by Tip O'Neill, Representative Claude Pepper of Florida, and Senator Daniel P. Moynihan of New York, Democrats in Congress stridently denounced this proposal as a betrayal of American senior citizens. Concerned that the controversy over this Social Security proposal endangered the rest of Reagan's policy agenda, Senate Majority Leader Howard Baker quickly persuaded Reagan to withdraw his proposal. For the rest of the 1980's, Republicans avoided proposing any major changes in Social Security benefits because they feared that Democrats could exploit such proposals as a campaign issue.

Nevertheless, the lingering controversy over Social Security and the effects of the 1981-1982 recession, especially high unemployment, contributed to Democratic victories in the 1982 congressional midterm elections. The Democrats gained twenty-six seats in the House of Representatives, while the Republicans gained one seat in the Senate. With a larger, more unified majority, House Democrats became more assertive and more effective in rejecting additional reductions in domestic spending and taxes, as well as in forcing the Republicans to accept smaller increases in defense spending.

As the economy steadily improved in 1983 and 1984, so did Reagan's approval ratings and Republican prospects in the 1984 elections. The Democratic Party nominated former vice president Walter Mondale as their candidate for president and Representative Geraldine Ferraro of New York for vice president. Mondale's major campaign promises were to raise taxes in order to reduce the deficit and to negotiate a bilateral freeze on nuclear weapon production with the Soviet Union. Mondale won only his home state of Minnesota, while Reagan carried every other state and nearly 59 percent of the popular vote. In the wake of Reagan's landslide reelection, the Republicans gained sixteen House seats and lost two Senate seats. Republican senator Bob Dole of Kansas succeeded the retiring Howard Baker as Senate majority leader. With these election results, Reagan and the Republicans in Congress were confident that the Democrats would be more cooperative in the legislative process.

Jim Wright, the new Speaker of the U.S. House of Representatives, swears in the members of the One Hundredth Congress during opening ceremonies on January 6, 1987. (AP/Wide World Photos)
Congress and Reagan's Second Term The two major domestic policy issues for Congress during Reagan's second term were deficit reduction and tax reform. Most Democrats in Congress reluctantly and ambivalently voted for passage of the Balanced Budget and Emergency Deficit Control Act of 1985 (also known as the Gramm-Rudman-Hollings bill). Supported by Reagan and sponsored by Republican senators Phil Gramm of Texas and Warren Rudman of New Hampshire and Democratic senator Ernest Hollings of South Carolina, this law planned to eliminate the deficit by 1991 through automatic, comprehensive, annual budget cuts. In 1986, the Supreme Court struck down the automatic provisions of the law as unconstitutional. High budget deficits remained the top domestic policy issue for Congress during the 1980's and early 1990's.

By 1986, Congress developed a broad, bipartisan consensus to reform and simplify income and corporate taxes by again reducing the number of tax brackets and reducing or eliminating certain deductions. Beyond insisting that the net effect of such reform must not be to raise taxes, Reagan generally deferred to Congress to negotiate and determine the details of a tax reform bill. Bipartisan leadership in achieving passage of the Tax Reform Act of 1986 was personified by Democratic representative Dan Rostenkowski of Illinois, chairman of the powerful House Committee on Ways and Means, and Republican senator Bob Packwood of Oregon, chairman of the Senate Committee on Finance.

In the 1986 congressional elections, the Democrats gained three House seats and eight Senate seats. These results enabled the party to win control of the Senate and elect Senator Robert Byrd of West Virginia as Senate majority leader. Tip O'Neill retired, and Democratic representative Jim Wright of Texas became Speaker of the House. The leadership of both houses of Congress, but especially the Senate, became more combative and confrontational in relationships with President Reagan and with the Republican minorities. In particular, the Democratic Senate conducted an extensive investigation of the Iran-Contra affair and rejected Reagan's nomination of Robert H. Bork to the Supreme Court in 1987. The Senate investigation eventually revealed that Reagan knew about and approved of the National Security Council's illegal sale of weapons to Iran and the use of the funds from those sales to aid the Contra rebels in Nicaragua.

Meanwhile, Speaker Wright imposed stricter party discipline on Democrats and was more assertive in his relationship with Reagan than O'Neill had been. Pressured by Republican representative Newt Gingrich of Georgia, the House investigated Wright for ethical violations. Wright resigned from Congress in 1989 and was replaced as Speaker by Democratic representative Thomas Foley of Washington.

Congress and George H. W. Bush in 1989 Although incumbent Republican vice president George H. W. Bush easily defeated Michael Dukakis, the Democratic governor of Massachusetts, in the 1988 presidential election, the Democrats increased their majorities in Congress. They gained two seats in the House and one seat in the Senate. Democratic senator George Mitchell of Maine became Senate majority leader. During Bush's first year as president, the Senate rejected his nomination of John Tower to be secretary of defense. There were also prolonged conflicts between Bush and Congress over deficit reduction, taxes, entitlement spending, and defense spending.
Congress and Foreign Policy The election results and public opinion polls of 1980 indicated that Americans wanted a more assertive, military-oriented foreign policy in response to the Iranian hostage crisis, the Soviet invasion of Afghanistan, the Communist Party's repression of the Solidarity movement in Poland, and the threat of growing communist influence in Central America, especially Nicaragua. Consequently, Republicans and most Southern Democrats in Congress supported President Reagan's policies of higher defense spending; aid to anticommunist guerrillas; installing new nuclear missiles in North Atlantic Treaty Organization (NATO) countries, especially West Germany; developing the Rapid Deployment Force (RDF) to project American military power in the Persian Gulf region; invading Grenada; and refusing to make major concessions to the Soviet Union in nuclear arms negotiations. As a result, U.S. military resources grew substantially during the early 1980's. Congress, however, pressured Reagan to withdraw American troops from Lebanon following the terrorist bombing of a barracks in Beirut in 1983 and adopted the Boland Amendment in 1984 to prohibit the provision of American military aid to the Nicaraguan Contras.

After Mikhail Gorbachev became the Soviet leader in 1985, more Americans and members of Congress began to favor reductions in defense spending and a more conciliatory policy toward the Soviet Union. Reagan and Gorbachev negotiated and signed the Intermediate-Range Nuclear Forces (INF) Treaty in 1987, and the Senate ratified it in 1988. The INF Treaty was used as the basis for achieving a more comprehensive nuclear arms reduction agreement with the Soviet Union known as the Strategic Arms Reduction Treaties (START I and START II). In 1989, Congress generally supported President George H. W. Bush's continuation of the START negotiations, as communist governments collapsed across Eastern Europe and the Cold War drew to a close. It also supported Bush's invasion of Panama to remove dictator Manuel Noriega.

Impact From 1981 until 1987, the U.S. Congress was divided between a Democratic House of Representatives and a Republican Senate. The Democratic House compelled the Reagan administration to moderate its efforts to reduce taxes, domestic spending, and economic regulation, while the Republican Senate enabled the White House to appoint many conservative judges and delay serious treaty negotiations with the Soviet Union on nuclear arms control. After the Democrats won control of both houses of Congress in 1986, the policy-making relationship between Congress and Presidents Reagan and George H. W. Bush became more combative and less cooperative.

Further Reading
Baker, Ross K. House and Senate. New York: W. W. Norton, 1989. Study of the differences between the U.S. House of Representatives and the Senate that includes details of Congress during the 1980's.
Davidson, Roger H., and Walter J. Oleszek. Congress and Its Members. Washington, D.C.: CQ Press, 1990. Study of Congress that includes legislative issues and leaders from the 1980's.
Derbyshire, Ian. Politics in the United States from Carter to Bush. New York: Chambers, 1990. Broad survey of American politics from 1976 to 1989.
Johnson, Robert David. Congress and the Cold War. New York: Cambridge University Press, 2006. Comprehensive study of congressional influence on U.S. military and diplomatic policy during the Cold War. Bibliographic references and index.
Smith, Hedrick. The Power Game: How Washington Works. New York: Ballantine Books, 1988. Detailed analysis of politics between Congress and the president during the 1980's.
Sean J. Savage
See also
Beirut bombings; Bork, Robert H.; Bush, George H. W.; Conservatism in U.S. politics; Elections in the United States, midterm; Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988; Foreign policy of the United States; Iran-Contra affair; Liberalism in U.S. politics; O’Neill, Tip; Reagan, Ronald; Reagan Doctrine; Reagan Revolution; Reaganomics; Social Security reform; Tax Reform Act of 1986; Tower Commission; Wright, Jim.
■ Congressional page sex scandal The Event
Scandal involving members of the U.S. House of Representatives who had sexual relations with under-age assistants
Date July, 1983
Place Washington, D.C.

The scandal was the first of many to surface in American politics during the 1980's, exacerbating post-Watergate skepticism regarding the integrity of elected officials.

On July 14, 1983, the House Committee on Ethics recommended that two representatives, Illinois Republican Dan Crane and Massachusetts Democrat Gerry Studds, be reprimanded for engaging in sexual relationships with congressional pages. Pages, a select group of administrative assistants working for Congress, are under eighteen years of age. The committee accused Crane of engaging in a relationship with a seventeen-year-old female page in 1980 and Studds of a similar relationship with a seventeen-year-old male page in 1973. Because the age of consent in the District of Columbia was sixteen, the relationships were not illegal, but they raised serious ethical concerns regarding potential abuses of power and authority.

Reprimand was the mildest possible punishment the House could impose on its members. Dissatisfied with the Ethics Committee's recommendation, Republican representative Newt Gingrich of Georgia introduced a motion to expel Crane and Studds from the House altogether. As a compromise between these two extremes, the House voted 420 to 3 to issue a censure, a formal condemnation of the representatives' conduct. Crane subsequently acknowledged having a consensual relationship with a page and issued a tearful apology to his fellow representatives. Studds admitted to exercising poor judgment by engaging in sex with a subordinate but insisted that his relationship with the page was consensual, legal, and private. The page later appeared publicly with Studds in support of his stance.

Crane won the Republican nomination for his congressional seat in 1984 but lost the general election and returned to Illinois to resume practice as a dentist. Studds won reelection to the House in 1984 and was subsequently reelected to six more terms. Gingrich, having developed a reputation as an enforcer of ethical conduct in the House, later brought ethics charges against House Speaker Jim Wright, eventually forcing Wright to resign his speakership in 1989.

Impact The Congressional page sex scandal marked the first time that the U.S. Congress had voted to censure a member for sexual misconduct. The bipartisan scandal intensified the public distrust of government that pervaded American politics following the Watergate scandal, and it prompted a lengthy exchange of allegations and investigations of misconduct by public officials from both major political parties that continued to resonate throughout American politics into the twenty-first century. The scandal also led to reforms of the congressional page program, including the establishment of a minimum age of sixteen for pages.

Subsequent Events Studds, the first openly gay member of Congress, remained in the House of Representatives until his retirement in 1997, and in 2004 he married his longtime male companion in a legal ceremony in Massachusetts. He died in October, 2006.
Conservatism in U.S. politics
■
247
ber, 2006. Gingrich was elected Speaker of the House in 1995 and was subsequently accused of various ethical violations of his own, which contributed to his resignation from Congress in 1999. Further Reading
Hilton, Stanley G., and Anne-Renee Testa. Glass Houses: Shocking Profiles of Congressional Sex Scandals and Other Unofficial Misconduct. New York: St. Martin’s Paperbacks, 1998.
Tolchin, Martin, and Susan J. Tolchin. Glass Houses: Congressional Ethics and the Politics of Venom. Boulder, Colo.: Westview Press, 2003.

Michael H. Burchett

See also Abscam; Congress, U.S.; Elections in the United States, 1984; Iran-Contra affair; Scandals.
■ Conservatism in U.S. politics
Definition: A political ideology that tends to support tradition, authority, established institutions, states’ rights, liberal individualism, and limiting the political and fiscal power of the federal government
Some twenty-five years in the making, the conservative movement achieved political success in the 1980’s under President Ronald Reagan. It effectively attacked the welfare liberalism of the 1960’s and, in foreign affairs, pursued a more aggressive policy against communism.

Conservatism in the 1980’s can be properly understood only by placing it within the context of the revolt against so-called Big Government liberalism that began in the mid-1950’s under the leadership of William F. Buckley and his journal, National Review. Buckley’s “fusionist” collaboration of conservatives and free market libertarians foreshadowed the coalition President Ronald Reagan would assemble in the 1980’s. Also crucial was the growth of the New Right, a populist movement that had its origins in the Barry Goldwater presidential campaign of 1963-1964 and that also stressed free market values combined with opposition to the expansion of the federal government. In the 1970’s, the rise of the Religious Right, closely allied to the New Right and shaped by such organizations as the Moral Majority, also contributed to the growth of conservative dominance within the Republican Party. Additionally, the 1970’s
saw the emergence of neoconservatism, a movement of disaffected liberals who believed that the expansion of the welfare state promoted by liberalism had gone too far. While not opposed in principle to a powerful central government, they believed that excessive social engineering had begun to erode the traditional self-reliance of the American people.

The Reagan Revolution
All these groups found common ground in the so-called Reagan Revolution of the early 1980’s, in part because Reagan’s own political philosophy combined elements of social conservatism, populism, and free market, classical liberalism. Virtually all of these conservatives also supported a more aggressive policy of opposition to the global spread of communism, as embodied in the Soviet Union. Two decades of grassroots organizing had produced a groundswell of popular support among the American working and middle classes for what became known as Reaganism. Indeed, the most salient feature of conservatism in the 1980’s was the shift of millions of traditionally Democratic voters to the ranks of the Republican Party, either by reregistering or simply by allegiance at the polls. The term “Reagan Democrat” was coined to describe Democratic Party members who voted for Reagan in 1980 and 1984. Many of these converts were blue-collar workers and socially conservative Catholics who believed that the Democratic Party no longer spoke to their needs and aspirations. The conservative movement was especially effective in convincing these voters that the Republican Party was the party of the American Dream and of what it represented as traditional moral values. The Republicans promised lower taxes to promote free enterprise, fiscal responsibility, a return to so-called family values, opposition to abortion on demand, and a foreign policy that would no longer seek simply to contain the Soviet threat but to roll it back. Among the conservative groups that gathered under the Reagan tent, the New Right and the neoconservatives were perhaps the greatest beneficiaries. During his first two years in office, Reagan successfully promoted legislation designed radically to reduce welfare spending, to cut tax rates, and to reduce government regulation of business. Historians are still debating the long-term effects of much of this legislation, but during the early 1980’s it appeared to most Americans that Reaganism was indeed
a revolution and that the expansion of the welfare state had been stopped, if not reversed. In foreign policy, the neoconservatives, many of whom were appointed to key positions in the Departments of Defense and State, exerted growing influence over the Reagan agenda, especially in building support for covert anticommunist operations in countries such as Nicaragua and Afghanistan.

Discontent in the Ranks
As Reagan entered his second term, it became apparent that serious fractures were developing in the conservative coalition. Most disaffected were the traditionalist, or Old Right, conservatives, for whom Reagan’s attempts to check the progress of big government did not go far enough and for whom the foreign policy crafted by the neoconservatives was little more than a misguided new imperialism. This wing of the movement, sometimes characterized as “paleoconservative,” favored radical decentralization of the federal government and a return to states’ rights, restrictions on immigration, economic protections for American workers, and a non-interventionist policy abroad. Philosophically speaking, the paleoconservatives were opposed to the neoconservative vision of America as a nation whose identity was fundamentally defined by the ideals of the Declaration of Independence. For the paleoconservatives, the identity of the nation was not merely an abstract set of ideas but was profoundly rooted in an Anglo-Saxon political tradition that reached back to the Magna Carta. They also insisted that the United States was rooted in local custom and tradition, a common language and literature, and a constitution that would have been inconceivable without the British common law tradition that preceded it. True conservatism could flourish, in this view, only if power were returned to the states and local communities. As for foreign policy, in the paleoconservative view, American-style democracy was not a set of portable ideals that could be transplanted around the world.

Impact
Despite some dissension within its ranks, the conservative coalition of the 1980’s remained largely in place. It was still early enough in the shift to conservatism in U.S. politics that the multifaceted conservative movement believed it had more in common than not: The incompatibilities between social conservatism, fiscal conservatism, and political conservatism that would later become important
were largely unacknowledged during the decade. As a result, Vice President George H. W. Bush was able to win election to the presidency in 1988. Whether as a result of conservative anti-tax policy or not, the American economy had rebounded during the mid-1980’s and was still experiencing rapid growth: Bush was able to capitalize upon this advantage during his campaign and, more important, to cast himself as a Reagan populist. In fact, Bush was a centrist and did little to advance the conservative agenda. Nevertheless, the Reaganite coalition had brought new prestige to conservative ideas and was responsible for a decidedly conservative shift in the American electorate. As a result, Bill Clinton, Bush’s Democratic successor, was able to get elected largely because he adopted much of the conservative agenda, in both domestic and foreign policy. Owing to the influence of 1980’s conservatism, welfare liberalism and extensive federal regulation of the economy were largely discredited.

Further Reading
Ehrman, John. The Rise of Neoconservatism. New Haven, Conn.: Yale University Press, 1995. A useful and generally unbiased discussion of the rise of the neoconservative movement from its origins in the 1970’s to its rise to power in the 1980’s.
Gottfried, Paul, and Thomas Fleming. The Conservative Movement. Boston: Twayne, 1988. Discusses the emergence of post-World War II American conservatism from a paleoconservative perspective.
Micklethwait, John, and Adrian Wooldridge. The Right Nation: Conservative Power in America. London: Penguin Press, 2004. Examines American conservatism from a British perspective; especially good on the 1980’s era.

Jack Trotter

See also
Bush, George H. W.; Cold War; Elections in the United States, midterm; Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988; Foreign policy of the United States; Liberalism in U.S. politics; Reagan, Ronald; Reagan Democrats; Reagan Doctrine; Reagan Revolution; Reaganomics; Tax Reform Act of 1986; Welfare.
■ Consumerism
Definition: A preoccupation with the purchase of consumer goods and the ideologies that support or endorse that preoccupation
After the social unrest of the 1960’s and the energy shortages of the 1970’s, adult Americans longed for more settled and more affluent times. When the economy improved in the 1980’s, those lucky enough to benefit embarked on a program of conspicuous consumption that came for many to define the decade.

The extravagant inaugural festivities accompanying Ronald Reagan’s 1981 assumption of the U.S. presidency were in retrospect a hallmark of the decade ahead. The 1980’s heralded the return of formality and ostentation in American society, as well as in dress, in keeping with the Reagans’ social style. High school proms, elaborate weddings in formal settings, coming-out parties, charity balls, and private black-tie dinners proliferated, with women dressing for these events to appear extravagant and lavish. Nancy Reagan’s elegance and Princess Diana’s love of fine fashion were important influences. The predilection for things “natural,” which prevailed in the previous decade, expanded to include the most expensive natural materials: cashmere was preferred to wool, linen was chosen over cotton, and silk clothing was worn everywhere. Ornamentation was the rule of the day, with cabbage roses, animal prints, polka dots, tassels, beads, chains, ribbons, scarves, shawls, and patterned stockings all being consumed and displayed prominently. Binge buying and credit became a way of life, and high-end labels were snapped up. The novelist Tom Wolfe coined the term “the splurge generation” to describe the baby boomers, who, with their children, were avid consumers.

A Culture of Consumption
Since more women entered the workforce in the 1980’s than in any other decade, there was more money available to double-earning families to spend, as well as a greater demand for professional clothes. A group of American designers—Donna Karan, Ralph Lauren, and Liz Claiborne—offered women padded shoulders and broad lapels to express their new commercial power. Although television was available for home consumption by the 1950’s, in the 1980’s the clothes worn on programs greatly influenced fashion. A new
type of program, the prime-time soap opera, included Dallas and Dynasty—shows that featured the wealthy and extravagant lifestyles of two families of oilmen and cattle ranchers. These shows influenced not only fashion but home interiors as well. Women and men found it difficult to redecorate their homes without an interior designer, and people entered that profession in record numbers. Auctions of famous artworks reached record prices. By 1987, Van Gogh’s Sunflowers sold for $39.9 million and his Irises for $53.9 million. The Museum of Modern Art in New York began renovations that would double its size, and cities like San Antonio built multimillion-dollar museums. In music, pop, rock, country, and especially rap and hip-hop became popular, as music videos, especially those broadcast on cable channel MTV, exerted an enormous influence on the development and marketing of new music. The digital compact disc (CD) changed the entire industry and made fortunes for music companies. A study conducted by the University of California, Los Angeles, and the American Council on Education in 1980 found that those entering college were more interested in status, power, and money than enrolling students had been in the previous fifteen years. The Dow Jones Industrial Average tripled in seven years and quickly bounced back from the 1987 stock market crash, driving a student preference for business management as the most popular major.

New Markets
Social attitudes toward minority groups in the United States remained complex during the 1980’s, but overt racism became less socially acceptable. As people of color began to be taken more seriously, companies began to see minority communities as potential new markets for their products. Thus, concepts of multiculturalism began to influence advertising. Although the advertising agencies explained this trend as a desire to include everyone, it constituted the first recognition that many minorities had achieved middle-class lifestyles and had begun to subscribe to the same consumerist values as the rest of the American middle class. New uses for technology developed rapidly in the 1980’s, and the term “consumer electronics” came into use to describe an exploding sector of technology that included personal computers, electronic games, stereo equipment, handheld mobile phones, and many data storage technologies such as compact discs.
Although the popularity of video games started in the late 1970’s, video-game technology developed during the 1980’s kept the market hot. Personal computers became popular in households as well as at work, and consumers snapped up Sony Walkmans and videocassette recorders (VCRs). Apple’s Macintosh computer was introduced in 1984 and became commercially successful, as did other computers of the decade including the IBM PC, Atari ST, and Commodore 64. Microsoft introduced the early versions of the Windows operating system, which dominated the market for several decades following the 1980’s. The film industry boomed in the 1980’s, but it became more focused, as Hollywood competed with home entertainment technologies by concentrating on producing a limited number of mass-market blockbusters rather than a wider variety of modestly successful films that appealed to more specific audiences. However, the Sundance Institute opened in 1981 to promote independent filmmakers, and the first Sundance Film Festival was held in 1986, spawning a national craze for film festivals that provided venues and opportunities for new directors who could not compete directly with Hollywood blockbusters. Special effects in movies advanced in sophistication as computer technology developed, a trend that decisively shaped Hollywood’s subsequent output. Film consumers who loved videocassettes frequented the new video rental outlets that became national chains. In 1981, VCR sales rose 72 percent in twelve months. Science-fiction films surged in popularity, best exemplified by Steven Spielberg’s E.T.: The Extra-Terrestrial (1982), which broke records for gross receipts and became the biggest earner of the decade. Another science-fiction filmmaker, George Lucas, had reaped incredible profits by exploiting the possibilities of film merchandising, creating an extensive line of toys based on his Star Wars trilogy (1977-1983). The rest of Hollywood quickly responded to Lucas’s success, creating tie-in merchandise in association with films whenever possible, especially merchandise aimed at youngsters.

Impact
Although other decades—especially the 1950’s—were known in the United States for their commodity consumption, the 1980’s was one of the first to be marked not merely by consumption but by unabashed consumerism. Baby boomers became known for their self-obsession and demand for instant gratification, and the most common venue for both qualities was the marketplace. Commodities had been marketed throughout the twentieth century as standing for particular lifestyles, but in the 1980’s, the purchase of commodities itself became a popular lifestyle. Plays such as Other People’s Money (1988) and movies such as Wall Street (1987) commented pointedly on a culture of greed, while the Reagan administration and its conservative allies trumpeted the benefits to the economy of a middle class freely spending its increased disposable income.

Further Reading
Glickman, Lawrence B., ed. Consumer Society in American History: A Reader. Ithaca, N.Y.: Cornell University Press, 1999. Comprehensive anthology of focused essays covering the specific periods and issues in the history of U.S. consumer society. Bibliography and index.
Hurley, Andrew. Diners, Bowling Alleys, and Trailer Parks: Chasing the American Dream in the Postwar Consumer Culture. New York: Basic Books, 2001. A study of the marketing of middle-class lifestyles to working-class Americans; focuses in part on the racial and gendered aspects of postwar consumer culture.
Nye, David E., and Carl Pederson, eds. Consumption and American Culture. Amsterdam: VU University Press, 1991. Collection of essays discussing the function of consumption and consumerism in American culture; published immediately after the 1980’s and focused particularly on American history from the perspective of the events of that decade.
Strasser, Susan. Satisfaction Guaranteed: The Making of the American Mass Market. New ed. Washington, D.C.: Smithsonian Institution, 2004. History of consumer habits, markets, and advertising in the United States, focusing especially on the creation of brand names and brand loyalty, as well as the constant drive of the market to create and exploit new desires.

Sheila Golburgh Johnson

See also Beattie, Ann; Bonfire of the Vanities, The; Business and the economy in the United States; Environmental movement; Fads; Family Ties; Power dressing; Reagan, Nancy; Reagan, Ronald; Reagan Revolution; Reaganomics; Wall Street.
■ Cosby Show, The
Identification: Television comedy series
Producers: Marcy Carsey (1944), Tom Werner (1950), and Bill Cosby (1937)
Date: Aired from September 20, 1984, to April 30, 1992

The Cosby Show portrayed the daily lives of an upper-middle-class African American family. The show revitalized the television sitcom, rejuvenated the NBC network and Thursday night television, and created debate regarding its lack of discussion of racism.

The initial idea to create The Cosby Show came after comedian Bill Cosby presented a monologue on child rearing on The Tonight Show. While the monologue was well received, the National Broadcasting Company (NBC) hesitated to add the show to its program schedule. The program was then offered to, and rejected by, the American Broadcasting Company (ABC). The networks’ executives were reluctant to accept the show for a variety of reasons. They feared that the situation comedy was a dying genre and that American viewers would in any case not be interested in a show with a completely African American cast. Cosby’s prior career in television had met with mixed results, so he could not be counted on to produce a hit. At the eleventh hour, however, just in time to be added to the fall lineup, NBC decided to order a pilot and five additional episodes.

Story Line
On September 20, 1984, the first thirty-minute episode of The Cosby Show aired. The show, taped before a live audience, centered on the Huxtable family of 10 Stigwood Avenue, a brownstone in Brooklyn, New York. Its situations generally involved immediate household members: Heathcliff “Cliff” Huxtable, played by Cosby, an obstetrician/gynecologist; Clair, played by Phylicia Rashad, a corporate attorney; and their children Sondra (Sabrina Le Beauf), a college student and later a parent; Denise (Lisa Bonet), the renegade; Theo (Malcolm-Jamal Warner), the academically struggling only son; and little girls Vanessa (Tempestt Bledsoe) and Rudy (Keshia Knight Pulliam). Later episodes also included stepgranddaughter Olivia (Raven-Symoné) and her father, Lt. Martin Kendall (Joseph C. Phillips); son-in-law Elvin (Geoffrey Owens) and twins Winnie and Nelson; and
grandparents Russell (Earl Hyman) and Anna (Clarice Taylor) Huxtable. Special guests included Stevie Wonder, Placido Domingo, Sammy Davis, Jr., and many other celebrities. The composition of the family, its socioeconomic status, the setting of the story, and many of the show’s themes paralleled the real lives of Cosby’s own family members. The artwork, literature, and music in the background of the show reflected the family’s African American culture, but story lines generally focused on experiences that could be portrayed as universal. Central topics centered on interfamilial relationships, school, and dating. The concerned and caring Huxtable parents were portrayed raising well-bred children and tackling everyday family issues. They taught lessons about fiscal responsibility using Monopoly money, held a funeral for a dead goldfish, and dealt with Theo’s pierced ears and poor grades in school, as well as the danger that the older children would never leave home. Each situation was represented with a gentle humor that appealed to audiences of the 1980’s. Cliff’s relationships with his son, daughters, wife, and parents engaged viewers. Their teasing, testing, back-and-forth sparring, and obvious affection allowed Cosby and the other performers effectively to blend wit, comic timing, and acting. The show’s humor was enhanced by believable characters with distinct personalities who sometimes made mistakes. While Cliff tried to be a perfect parent and spouse, one who was above reproach, he, too, was reminded that he was young once and that he was fallible.

The cast of The Cosby Show poses during the 1984-1985 television season. Back row, from left: Tempestt Bledsoe, Malcolm-Jamal Warner, and Phylicia Rashad. Front row: Lisa Bonet, Keshia Knight Pulliam, Bill Cosby, and Sabrina Le Beauf. (Hulton Archive/Getty Images)
Critical Reaction
The show was extremely popular, but it was not without its critics. Some people censured the show for its failure to depict racial tensions between African Americans and whites and its avoidance of other aspects of the African American struggle, including poverty and AIDS, as well as overt racism. Others felt that even the choice of characters portrayed the world through rose-colored glasses, because there were few affluent, double-income, professional African American families in the early 1980’s. While this criticism existed, however, proponents strongly supported the show for its positive depiction of an African American family. Advocates saw the benefits of portraying positive African American parental role models and a stable family in which the mother and father could banter and play with their children without ceding their authority. The writers and actors of The Cosby Show produced 201 episodes, aired over eight seasons, that were nominated for multiple awards and won many of them. Their awards included six Emmys, three Golden Globes, several Image Awards from the National Association for the Advancement of Colored People (NAACP), four Young Artist Awards, and a Peabody Award.

Impact
The Cosby Show, featuring an entirely African American cast, succeeded without slapstick clownery in becoming one of the most popular network television shows in history. The immediate popularity of the show astounded television executives, who had feared the imminent demise of the family sitcom. The Cosby Show revived interest in the genre, and it was later credited with leading the way for NBC’s later successful sitcoms, notably Frasier, Seinfeld, and Friends. The Cosby Show quickly reached
the top of the Nielsen ratings: In its first year, it rocketed to third place, and for the next five seasons, 1985-1990, it was the number one program on television. Its large viewing audience helped NBC dominate the other networks, particularly on Thursday nights, which became home to the network’s most popular prime time lineup. In 1987, The Cosby Show’s producers created a spin-off show, A Different World, featuring Denise as a student at Hillman, a fictional African American college. Bonet’s Denise remained a series regular for only one season, but the show ran for another five seasons after her departure. A Different World was more willing, and even eager, to address the sorts of social issues that The Cosby Show often eschewed. Meanwhile, after The Cosby Show ended its eight-year run, it was sold into syndication, where reruns earned NBC millions of dollars per episode.

Further Reading
Bogle, Donald. “The Cosby Show.” Blacks in American Films and Television: An Encyclopedia. New York: Garland, 1988. Good overview of the television show and its impact.
Dyson, Michael Eric. Is Bill Cosby Right? Or Has the Black Middle Class Lost Its Mind? New York: Basic Civitas Books, 2005. Examines and questions Bill Cosby’s views on African Americans and his support of color-blind politics.
Fuller, Linda. The Cosby Show: Audiences, Impact, and Implications. Westport, Conn.: Greenwood Press, 1992. A lengthy study of the show and its impact.
Hunt, Darnell. “Cosby Show, The.” In Encyclopedia of Television, edited by Horace Newcomb. 2d ed. New York: Fitzroy Dearborn, 2004. Summarizes the show and gives many further readings.
Inniss, Leslie B., and Joe R. Feagin. “The Cosby Show: The View from the Black Middle Class.” Journal of Black Studies 25, no. 6 (July, 1995): 692-711. Reviews the positive and negative African American responses to The Cosby Show.
Merritt, Bishetta, and Carolyn A. Stroman. “Black Family Imagery and Interactions on Television.” Journal of Black Studies 23, no. 4 (June, 1993): 492-499. Reviews African American life on television in 1985-1986.

Cynthia J. W. Svoboda

See also African Americans; Comedians; Facts of Life, The; Family Ties; Sitcoms; Television.
■ Cosmos
Identification: Educational television series
Producer and host: Carl Sagan (1934-1996)
Date: Aired from September 29, 1980, to December 21, 1980

Cosmos stimulated a resurgence in popular science programming for American public and commercial television, as well as making astronomer Carl Sagan a nationally recognizable personality.

On September 29, 1980, the Public Broadcasting Service (PBS) aired “The Shores of the Cosmic Ocean,” the first episode of a thirteen-part television series about the universe designed and hosted by noted astronomer and science educator Carl Sagan. Entitled Cosmos, the series was notable for the sheer range of topics it addressed, an innovative use of high-quality graphics—drawn both from historical sources and from contemporary illustrators—music carefully chosen to enhance the viewing experience, and the respectful approach taken to both the intelligence of the viewing audience and the value of sentient life, whether on Earth or elsewhere. It was also the most expensive project undertaken by public television networks up to that time. The basic subject of the series was the evolution of life on Earth and the place of humans within the larger structures of the universe. These topics were presented alongside the history of scientific discovery and the biographies of significant scientists from classical times to the twentieth century. Sagan thus interwove the larger history of scientific inquiry with the personal histories of the men and women responsible for advancing that inquiry. Cosmos was the most widely viewed documentary limited series, and it received awards from the American Council for Better Broadcasts and the Academy of American Films and Family Television, as well as three Emmy Awards. The contents of the series were assembled and issued as a book, which quickly became a nonfiction best seller and was reprinted in 1983 and 1985. Popular periodicals such as Time magazine recognized Sagan’s work by placing him on the cover of its October 20, 1980, issue and dubbing him “the cosmic explainer,” while others reproduced selections from the series’ artwork, notably the view of the Milky Way galaxy seen from above. The reactions of Sagan’s colleagues in astronomy and allied sciences were more mixed. The show
debuted at a time when the public image and reputation of the sciences in the United States were beginning to recover from serious criticisms leveled against them in the 1960’s and 1970’s. Technology had been blamed, with some justice, for causing or worsening numerous problems of the planetary environment. The reputation of science, as Sagan described it in an interview with Rolling Stone magazine, was that of a subject that “sounds as if it were the last thing in the world that any reasonable person would want to know about . . . .” Sagan, however, viewed humans as “a way for the cosmos to know itself.” He contended that science was not only fun but also an essential and comprehensible element of a changing global civilization. He thus offered a refreshing and thoughtful perspective on the place of science in American culture that proved immensely popular and durable.

Impact
The massive popularity of Cosmos demonstrated that the American public was receptive to quality science education programs. It laid the foundation for both public television and documentary and scientific cable channels to capitalize on this potential audience, which they did by developing many other educational science shows during the remainder of the 1980’s.

Further Reading
Cott, Jonathan. “The Cosmos.” Rolling Stone, January 8, 1981, 43, 45-46, 48, 50-51.
Head, Tom, ed. Conversations with Carl Sagan. Jackson: University Press of Mississippi, 2006.
Sagan, Carl. Cosmos. New York: Random House, 1980.

Robert B. Ridinger

See also Astronomy; Science and technology; Space exploration; Television.
■ Costner, Kevin
Identification: American actor and filmmaker
Born: January 18, 1955; Lynwood, California

In a handful of popular 1980’s movies, actor Kevin Costner gained the attention and adulation of movie fans, who saw in him an attractive new face and an attitude of grace, decency, and danger.

Until the mid-1980’s, Kevin Costner was virtually unknown in Hollywood, where some dubbed him “the
face on the cutting-room floor.” He had appeared in a few low-budget movies, a television commercial for Apple’s Lisa desktop computer, and a few box-office disappointments. In 1987, however, he starred in both The Untouchables and No Way Out, and his career took off. Previously, Costner had filmed the low-budget Sizzle Beach, U.S.A., which was made in 1974 but not released until 1986, after the actor had been featured in other films. His first lines as a mainstream actor were in 1982’s Frances, starring Jessica Lange. However, Costner’s relationship with director Graeme Clifford became strained, and his scenes were eliminated. Similarly, Costner was part of the ensemble cast in 1983’s The Big Chill (his character’s funeral brings a group of friends together), but his scenes were considered largely irrelevant to the main plot and cut. His only appearance in the finished film was a shot of his wrists as his character’s corpse was dressed. Costner’s bit parts in One from the Heart (1982) and Table for Five (1983) also were removed prior to those films’ theatrical releases. He did have a bit part in Night Shift (1982) that made it into the final film. His performances in the public-television drama Testament (1983) and the films Fandango and American Flyers (both released in 1985) also survived. None of them were hits, nor was Silverado (1985). However, in the latter film, directed by Costner’s friend and Big Chill director Lawrence Kasdan, Costner’s supporting performance as a fun-loving cowboy caught the attention of filmgoers, filmmakers, and critics. Costner reportedly turned down the lead role in 1983’s War Games and a supporting part in 1986’s Platoon, but he accepted an offer to star as Eliot Ness in Brian De Palma’s adaptation of The Untouchables, and the hit film was a breakthrough for him, as he was compared with Golden Age leading men such as Gary Cooper and James Stewart. Costner followed that project with the suspense yarn No Way Out, costarring with Sean Young, Gene Hackman, and Will Patton. He then starred in two baseball movies, Bull Durham (1988) and Field of Dreams (1989), broadening his appeal. “Men like him, women love him,” wrote Time magazine movie critic Richard Corliss in 1989. By the end of the decade, Costner was bankable enough to begin pre-production work and to scout locations for Dances with Wolves (1990), his directorial debut. The film would earn him a Best Actor Oscar nomination, as well as the Academy Award for Best Director, cementing his place in Hollywood.
Kevin Costner, right, with Sean Young in a scene from No Way Out. (AP/Wide World Photos)
Impact
Following years of near-anonymity and his late-1980’s breakthrough, Kevin Costner made it acceptable for cinema’s leading men to be wholesome and even old-fashioned.
Further Reading

Caddies, Kelvin. Kevin Costner: Prince of Hollywood. London: Plexus, 1992.
Foote, Jennifer. “Hollywood’s Maverick Hero.” Newsweek 113, no. 17 (April 24, 1989): 72-73.
Keith, Todd. Kevin Costner: The Unauthorized Biography. London: ikonprint, 1991.
Worrell, Denise. “Hollywood Rediscovers Romance.” Time 130, no. 10 (September 7, 1987): 72-73.

Bill Knight

See also Action films; Big Chill, The; Film in the United States.
■ Country music
Definition: Popular music genre derived from the folk and regional music of the American South, as well as that of cowboys in the West
Country music became big business in the 1980’s, reaching a larger mass audience, but this move to the mainstream led to a backlash among recording artists who sought to preserve a more traditional sound.

With the popularity of country-influenced sound tracks to films such as Urban Cowboy (1980) and Nine to Five (1980), Nashville moved from a relative backwater to the mainstream of the entertainment industry. However, by mid-decade a movement labeled new traditionalism sought to reinvigorate the traditional country music sound by wedding classic country songs and performance styles with state-of-the-art recording and marketing techniques.
Billboard Country Music Hit Singles of the 1980’s

1980
“Tennessee River,” “Why Lady Why” (Alabama)
“My Heroes Have Always Been Cowboys,” “On the Road Again” (Willie Nelson)
“Could I Have This Dance?” (Anne Murray)
“He Stopped Loving Her Today” (George Jones)
“Cowboys and Clowns,” “Misery Loves Company,” “My Heart” (Ronnie Milsap)
“Bar Room Buddies” (Merle Haggard)

1981
“Feels So Right,” “Love in the First Degree,” “Old Flame” (Alabama)
“Fool-Hearted Memory” (George Strait)
“Angel Flying Too Close to the Ground” (Willie Nelson)
“Blessed Are the Believers” (Anne Murray)
“Still Doin’ Time” (George Jones)
“9 to 5,” “But You Know I Love You” (Dolly Parton)
“Elvira” (Oak Ridge Boys)

1982
“Close Enough to Perfect,” “Mountain Music,” “Take Me Down” (Alabama)
“If You Think You Want a Stranger (There’s One Coming Home)” (George Strait)
“Always on My Mind,” “Just to Satisfy You” (Willie Nelson)
“Yesterday’s Wine” (George Jones and Merle Haggard)
“Do I Ever Cross Your Mind?” (Dolly Parton)
“Any Day Now,” “He Got You” (Ronnie Milsap)
“Break It to Me Gently” (Juice Newton)

1983
“Dixieland Delight,” “Lady Down on Love,” “The Closer You Get” (Alabama)
“Can’t Even Get the Blues,” “You’re the First Time I’ve Thought About Leaving” (Reba McEntire)
“Amarillo by Morning,” “You Look So Good in Love,” “A Fire You Can’t Put Out” (George Strait)
“Pancho and Lefty” (Willie Nelson and Merle Haggard)
“A Little Good News” (Anne Murray)
“Islands in the Stream” (Dolly Parton)
“Stranger in My House” (Ronnie Milsap)

1984
“(There’s a) Fire in the Night,” “If You’re Gonna Play in Texas (You Gotta Have a Fiddle in the Band),” “Roll On (Eighteen Wheeler),” “When We Make Love” (Alabama)
“How Blue” (Reba McEntire)
“Mama He’s Crazy,” “Why Not Me” (The Judds)
“Does Fort Worth Ever Cross Your Mind?,” “Let’s Fall to Pieces Together,” “Right or Wrong” (George Strait)
“Just Another Woman in Love,” “Nobody Loves Me Like You Do” (Anne Murray)
“Still Losin’ You” (Ronnie Milsap)

1985
“Can’t Keep a Good Man Down,” “Forty-Hour Week (for a Livin’),” “There’s No Way” (Alabama)
“Somebody Should Leave” (Reba McEntire)
“Have Mercy,” “Love Is Alive,” “A Girl’s Night Out” (The Judds)
“The Chair” (George Strait)
“On the Other Hand” (Randy Travis)
“Forgiving You Was Easy” (Willie Nelson)
“Lost in the Fifties Tonight (In the Still of the Night),” “She Keeps the Home Fires Burnin’” (Ronnie Milsap)
“I Don’t Know Why You Don’t Want Me” (Rosanne Cash)
“Highwayman” (Waylon Jennings, Willie Nelson, Johnny Cash, and Kris Kristofferson)

1986
“She and I,” “Touch Me When We’re Dancing” (Alabama)
“Little Rock,” “What Am I Gonna Do About You?,” “Whoever’s in New England” (Reba McEntire)
“Cry Myself to Sleep,” “Grandpa (Tell Me ’bout the Good Old Days),” “Rockin’ with the Rhythm of the Rain” (The Judds)
“It Ain’t Cool to Be Crazy About You,” “Nobody in His Right Mind Would’ve Left Her” (George Strait)
“Diggin’ up Bones” (Randy Travis)
“Guitar Town” (Steve Earle)
“Guitars, Cadillacs,” “Honky Tonk Man” (Dwight Yoakam)
“Now and Forever (You and Me)” (Anne Murray)
“In Love” (Ronnie Milsap)

1987
“Face to Face” (Alabama)
“Last One to Know,” “One Promise Too Late” (Reba McEntire)
“I Know Where I’m Going,” “Maybe Your Baby’s Got the Blues” (The Judds)
“All My Ex’s Live in Texas,” “Am I Blue,” “Ocean Front Property” (George Strait)
“Forever and Ever, Amen,” “I Won’t Need You Anymore (Always and Forever),” “Too Long Gone” (Randy Travis)
“’80’s Ladies,” “Do Ya’” (K. T. Oslin)
“Snap Your Fingers” (Ronnie Milsap)

1988
“Fallin’ Again,” “Song of the South” (Alabama)
“I Know How He Feels,” “Love Will Find Its Way to You,” “New Fool at an Old Game” (Reba McEntire)
“Change of Heart,” “Give a Little Love,” “Turn It Loose” (The Judds)
“Baby Blue,” “Famous Last Words of a Fool,” “If You Ain’t Lovin’ (You Ain’t Livin’)” (George Strait)
“Deeper than the Holler,” “Honky Tonk Moon,” “I Told You So” (Randy Travis)
“I Sang Dixie,” “Streets of Bakersfield” (Dwight Yoakam)
“Hold Me,” “I’ll Always Come Back” (K. T. Oslin)
“Button off My Shirt” (Ronnie Milsap)
“Runaway Train” (Rosanne Cash)

1989
“High Cotton,” “If I Had You,” “Southern Star” (Alabama)
“Cathy’s Clown” (Reba McEntire)
“Let Me Tell You About Love” (The Judds)
“Ace in the Hole,” “Baby’s Gotten Good at Goodbye” (George Strait)
“Is It Still Over?,” “It’s Just a Matter of Time” (Randy Travis)
New Voices, New Forces
Alabama, a country-rock group that owed more to Southern rock than to Willie Nelson, became the first group to capitalize on the exposure of country music to a national audience. In the 1980’s alone, the group scored twenty-eight number one singles on the Billboard Hot Country Singles chart (including seven in 1984-1985). Alabama paved the way for a Nashville-based country music industry that was as corporate and as skillful at exploiting marketing techniques to maximize exposure and sales as was the mainstream rock industry. The group’s popularity coincided with the development and marketing of the first networks of country radio stations (which in the 1990’s would become the national conglomerates such as Clear Channel Communications). Reba McEntire was another artist who began her career in the 1970’s but became a preeminent female country voice in the 1980’s. McEntire sold in excess of 20 million records and later branched into popular film and television work. The daughter of a professional rodeo contestant, McEntire helped generate the template for combining traditional country material with the latest recording methods. Like Alabama, she was quick to consolidate the various aspects of her career—including recording, marketing, and touring. Like the mother-daughter duo the Judds, these “new country” stars also exploited the emerging music video format, which would expand the genre’s audience by reaching cable television viewers but would also continue to warp the traditional sound of country music. Taking a cue from pop rock-oriented MTV, cable stations such as Country Music Television (CMT) and the Nashville Network (TNN) began airing country music videos in 1983. While country stalwarts like
George Jones and Tanya Tucker tried to revive their flagging careers with music videos, McEntire and the Judds used the format to catapult themselves from successful country artists into major media stars.

New Traditionalists
Perhaps the maverick of the new country stars of the 1980’s was George Strait. Donning a broad-brimmed cowboy hat, Strait played straight-up honky-tonk with a strong dose of Western swing. Classics such as “Amarillo by Morning” (1982) and “All My Ex’s Live in Texas” (1987) appealed to country fans turned off by the increasingly high-tech image and glamour of the new country music. To a lesser degree, Randy Travis emulated George Strait’s success. Deemed “too country” by Nashville labels in the early 1980’s, Travis eventually found success with songs such as “On the Other Hand” (1985) and “Diggin’ Up Bones” (1986) that hearkened back to the older country music defined by “three chords and a story.” Steve Earle’s Guitar Town (1986) became a rallying cry for the new traditionalist sound, as did the Bakersfield honky-tonk sound of Dwight Yoakam, who was also considered “too country” in the early 1980’s. Yoakam’s Guitars, Cadillacs, Etc., Etc. (1986) became a hit based largely on Yoakam’s faithful remake of the Johnny Horton classic “Honky-Tonk Man” (1956). However, the attempt to breathe new life into the old sound soon found its limit within the country music establishment. Earle’s Copperhead Road (1988), with its anti-Vietnam War title track, was considered too radical for Nashville and was released as a rock album out of Los Angeles. Also testing the limits of country music were genre-bending performers such as Lyle Lovett and k. d. lang, as well
as “cow-punk” bands such as Jason and the Nashville Scorchers, whose lead singer’s style owed as much to Johnny Rotten as to Johnny Cash.

Impact
Country music became both more mainstream and more diverse during the 1980’s. Film and music videos brought country music to a national audience, as did the development of country radio conglomerates. While this initially led to a hybridization of country with rock and pop music, it also provided the platform for movements reacting against the more mainstream, less distinctive sound of new country. These movements included new traditionalism and alternative country, the latter of which featured uncompromising, independent artists who reveled in country music’s history of quirky regionalism and who were not afraid to graft punk rock, rockabilly, bluegrass, and blues onto traditional country in order to maintain and reinvent that regionalism.

Further Reading
Cusic, Don. Reba: Country Music’s Queen. New York: St. Martin’s Press, 1991. Biography of quintessential country music performer and 1980’s success story Reba McEntire. While Cusic relies on McEntire’s own autobiography, he places her achievement in the context of the transformation of the country music industry during her rise to fame.
Fenster, Mark. “Country Music Video.” Popular Music 7, no. 3 (October, 1988): 285-302. An analysis of the development of the country music video in the 1980’s, highlighting the crossover appeal of Reba McEntire and Hank Williams, Jr., as well as the use of iconic country images by the new traditionalists.
Fox, Pamela. “Recycled ‘Trash’: Gender and Authenticity in Country Music Autobiography.” American Quarterly 50, no. 2 (1998): 234-266. Survey of popular autobiographies of female country music stars, including Reba McEntire, Dolly Parton, and Naomi Judd. Analyzes the performance identities created by women in country music and considers their negotiation of maternal and sexual imagery.
Goodman, David. Modern Twang: An Alternative Country Music Guide and Directory. Nashville: Dowling, 1999. A comprehensive guide to the alternative country subgenre that includes entries on forerunners such as Gram Parsons, as well as
the major figures from the 1980’s: Jason and the Nashville Scorchers, Dwight Yoakam, Steve Earle, and others.
Valentine, Gill. “Creating Transgressive Space: The Music of k. d. lang.” Transactions of the Institute of British Geographers, n.s. 20, no. 4 (1995): 474-485. Examination of lang’s subversion of traditional country music themes and audience to create lesbian community through her recordings and live performances.

Luke Powers

See also
Farm Aid; Mellencamp, John Cougar; MTV; Music; Music videos.
■ Crack epidemic
Definition: Invention, proliferation, and media coverage of crack cocaine
Crack cocaine is highly addictive and cheaper than powder cocaine, and its use quickly became prevalent among America’s urban poor. Media portrayals of its proliferation among minority populations of inner cities created increased concern over illicit drugs and led to more punitive drug laws, while increasing people’s fear of urban crime.

In the late 1980’s, the media focused significant attention on a new drug, crack cocaine. Crack is a modified version of freebase cocaine formed by adding baking soda or ammonia to powder cocaine, then heating it. It is typically smoked by users. As a smokable form of the drug, crack is absorbed more effectively and more quickly by the body than is snorted cocaine hydrochloride. Thus, it is effectively both stronger and cheaper than powder cocaine, even though, molecule for molecule, the two forms cost the same. When it was first introduced, a rock of crack could be purchased for as little as $2.50 in some areas. Because of an oversupply of cocaine in the 1980’s, dealers decided to convert powder cocaine to crack to expand their markets to poorer individuals who could not afford the more expensive powder cocaine. Crack users who were interviewed in the 1980’s said that one puff of crack could lead to instant addiction. The media exaggerated the prevalence of the drug, as well as the consequences of crack use. They portrayed crack addiction as an epidemic in America’s inner cities. This portrayal led to drastic
changes in U.S. drug policy and a renewed focus on fighting a “War on Drugs.” The Anti-Drug Abuse Act of 1986 specifically allocated money to reduce the supply of cocaine entering the United States. By the end of the 1980’s, drug abuse was considered to be one of the most important problems facing the United States. It was more prominent in urban centers, and there was a significant surge in crack use in New York, Los Angeles, and Houston. The ravages of the drug in U.S. cities took on added significance in the late 1980’s, when Senator John Kerry held hearings into the connections between the Iran-Contra affair and the importation of cocaine into the United States. Kerry’s hearings established that the federal government had, either knowingly or unknowingly, facilitated the importation of thousands of pounds of cocaine into the country and had paid hundreds of thousands of dollars to known drug traffickers to transport humanitarian assistance to the Contras. For the last two years of the 1980’s, national media attention concerning the crack epidemic focused on women, primarily pregnant crack users. Part of the crack epidemic involved the term “crack babies,” the children born to crack-addicted mothers, who themselves became addicted to the drug while still in the womb. Crack babies were portrayed as future violent criminals who, upon birth, exhibited a number of developmental problems, including low birth weight, hyperactivity, poor concentration, neurological problems, birth defects, brain damage, and cocaine withdrawal symptoms. Another outcome of the crack epidemic was an increase in turf wars among rival gangs in urban areas where crack was introduced. With the addition of the lucrative new drug to the market, individuals and gangs fought for the right to sell crack, and violence increased.

Impact
The crack epidemic was itself a widely discussed issue, but it also brought into focus several larger issues central to the 1980’s. Fictional and nonfictional media portrayals prevalent during the decade ensured that crack cocaine was associated in the public mind with desperate, impoverished racial minorities committing violent crimes, while powder cocaine was seen as the drug of rich white stockbrokers and movie stars. While neither stereotype was completely accurate, the news media were criticized for their tendency to focus on the form of cocaine associated with inner-city crime rather than the form abused by
affluent white Americans. Crack addicts were also at the center of the 1980’s debate over whether the proper response to drug addiction was treatment or punishment. These criticisms from the Left were dismissed by conservatives, who accused critics of simply being “weak on crime.” Liberals, for their part, were quick to seize on the revelations of the Iran-Contra affair, which seemed to provide evidence that people employed by the Central Intelligence Agency were also involved in drug running.

Further Reading
Humphries, Drew. Crack Mothers: Pregnancy, Drugs, and the Media. Columbus: Ohio State University Press, 1999.
Reinarman, Craig, and Harry G. Levine, eds. Crack in America: Demon Drugs and Social Justice. Los Angeles: University of California Press, 1997.
Williams, Terry. Crackhouse: Notes from the End of the Line. New York: Penguin Books, 1992.

Sheryl L. Van Horne

See also Conservatism in U.S. politics; Crime; Gangs; Iran-Contra affair; Journalism; Liberalism in U.S. politics.
■ Craft, Christine
Identification: Television news anchor whose sex-discrimination lawsuit received national attention
Born: 1944; Canton, Ohio

Craft made broadcast history when she won a $500,000 verdict against a Kansas City TV station for sex discrimination. Her lawsuit challenged the different standards by which male and female on-air broadcast news anchors were judged in the U.S. media industries, and the jury verdict, though eventually overturned on appeal, was considered a victory for women’s rights.

On January 5, 1981, Christine Craft made her debut as co-anchor on the evening news on KMBC-TV in Kansas City, Missouri. Concerned about her appearance, the station wanted Craft to have a makeup and clothing makeover, but she refused to change. She criticized the station’s policy on appearance for anchors and reporters as being based on stereotyped characterizations of women. Nine months after her debut, the station removed
Craft as anchor and reassigned her as a general-assignment reporter. The station defended its decision based on negative viewer response to Craft’s performance. Craft resigned and sued Metromedia, the owners of KMBC-TV until 1982, for sex discrimination and fraud, arguing that she had been demoted because the audience viewed her as “too old, too ugly, and not deferential to men.” In 1983, a jury awarded Craft $500,000, but the judge overturned the award as excessive, because he believed the jury had been affected by extensive media coverage of the case. The judge also ruled that Craft had not provided sufficient evidence to establish that she had experienced sex discrimination: Removing an anchor because of poor ratings was standard procedure for television stations, whose ratings were critical in determining their “prestige and profits.” The judge’s written decision also included his finding that the station’s news director never said the much-publicized phrase that Craft
was “too old, too unattractive, and not deferential enough to men.” The judge ordered a new trial to determine if Metromedia had falsely promised Craft that she would not have to make substantial changes in her appearance when it hired her. The second trial took place in a different venue, Joplin, Missouri, in an attempt to find an unbiased jury. In 1984, the second jury sided with Craft and awarded her $325,000. Both sides appealed the decision. Metromedia appealed to have the $325,000 award thrown out. Craft appealed, because she believed that the trial judge ignored evidence of sex discrimination, and she wanted the original $500,000 award reinstated. In 1985, a federal appeals court overturned Craft’s $325,000 award, ruling that there was insufficient evidence to prove that Metromedia was guilty of either fraud or sex discrimination. In 1986, Craft appealed to the U.S. Supreme Court, but the Court refused to hear her case.
Christine Craft poses on a news set at KRBK-TV in Sacramento, California, in 1986. The station hired her after she sued former employer KMBC-TV in Kansas City, Missouri. (AP/Wide World Photos)
Impact
Although Christine Craft ultimately lost her sex-discrimination lawsuit against Metromedia, her legal battle brought greater public attention to the issue. It may have helped many other female journalists who wanted to be judged by performance rather than appearance.

Further Reading
Beasley, Maurine H., and Sheila J. Gibbons. Taking Their Place: A Documentary History of Women and Journalism. Washington, D.C.: American University Press, 1993.
Craft, Christine. Christine Craft: An Anchorwoman’s Story. Santa Barbara, Calif.: Capra Press, 1986.
_______. Too Old, Too Ugly, Not Deferential to Men. Rocklin, Calif.: Prima, 1988.

Eddith A. Dashiell

See also
Age discrimination; Journalism; Sexual harassment; Television; Women in the workforce; Women’s rights.
■ Crime
Definition: Transgressions of local, state, or federal law
During the 1980’s, crime in the United States and Canada, and the manner in which policy makers managed it, followed divergent paths. U.S. legislators attempted to prevent future crimes by dealing harshly with criminals, while Canadians attempted to rehabilitate criminals and turn them into productive members of society.

During the 1980’s, crime rates were on the rise in the United States. State and federal governments responded to the ensuing public outcry by passing tougher laws. Meanwhile, Canadian governments favored a gentler approach, proposing policies of restoration and rehabilitation rather than retribution. Certain demographic groups, such as young people and aboriginal peoples, were singled out for particular rehabilitation efforts, in an effort better to integrate those groups into Canadian society as a whole.

Crime in the United States
By the early 1980’s, crime rates in the United States had risen to alarming levels, causing criminologists and other experts to warn that the country could descend into a devastating state of chaos. However, the 1980’s was also a pivotal decade for crime: Crime rates for serious
crimes—a category including murder, drug trafficking, aggravated assault, rape, and burglary—leveled off during the decade, and by its end they had even begun to decline. During the same period, however, rates of incarceration increased dramatically. As a result, beginning in the 1980’s, the overall U.S. prison population increased significantly. The stricter laws being passed by state and federal legislatures included many requiring longer sentences, as well as mandatory sentences for some offenses and lower rates of parole. These laws took more criminals off the streets and kept them behind bars longer. In federal prisons alone, the number of inmates increased by more than 140 percent between 1980 and 1989, going from twenty-four thousand inmates to more than fifty-eight thousand inmates. These new laws were part of a concerted shift in public policy away from rehabilitation and toward incarceration simply as a way to remove convicted criminals from the public sphere, so they could not repeat their offenses. They were passed during a decade when public concerns about crime rates motivated many politicians to claim that they were “tough on crime.” President Ronald Reagan, for example, mounted a Get Tough on Crime campaign in 1986 that resulted in the passage of mandatory minimum sentences for drug trafficking. The law limited the discretionary power of judges, mandating, for instance, a sentence of no less than ten years for convicted first-time cocaine traffickers. As the decade progressed, Federal Bureau of Investigation (FBI) records indicated that the rate at which serious crimes were being reported had started to decline. Whether as a direct effect of the new laws or not, by decade’s end there had been a net 20 percent decrease in violent crimes, a decrease for which “tough on crime” programs received the credit. Some disturbing trends emerged as a result of the 1980’s increase in incarceration. African Americans came to represent a disproportionately large percentage of the prison population, especially the population serving time for drug-related offenses. Critics of the public policies instituted in the 1980’s tended to focus on this fact. African Americans abused drugs at approximately the same rate as did white Americans, but an African American drug user was much more likely to be arrested than was a white drug user. Moreover, an African American who was arrested for a drug-related offense was much more likely to be convicted than was a white
■ United States Crime Rates Per 100,000 Inhabitants, 1980-1989

Year                  1980         1981         1982         1983         1984         1985         1986         1987         1988         1989
Population     225,349,264  229,146,000  231,534,000  233,981,000  236,158,000  238,740,000  240,132,887  243,400,000  245,807,000  248,239,000
All crimes         5,950.0      5,858.2      5,603.7      5,175.0      5,031.3      5,207.1      5,480.4      5,550.0      5,664.2      5,741.0
Violent              596.6        594.3        571.1        537.7        539.2        556.6        620.1        609.7        637.2        663.1
Property           5,353.3      5,263.8      5,032.5      4,637.3      4,492.1      4,650.5      4,881.8      4,940.3      5,027.1      5,077.9
Murder                10.2          9.8          9.1          8.3          7.9          8.0          8.6          8.3          8.4          8.7
Forcible rape         36.8         36.0         34.0         33.7         35.7         37.1         38.1         37.4         37.6         38.1
Robbery              251.1        258.7        238.9        216.5        205.4        208.5        226.0        212.7        220.9        233.0
Aggravated assault   298.5        289.7        289.1        279.2        290.2        302.9        347.4        351.3        370.2        383.4
Burglary           1,684.1      1,649.5      1,488.8      1,337.7      1,263.7      1,287.3      1,349.8      1,329.6      1,309.2      1,276.3
Larceny/theft      3,167.0      3,139.7      3,084.9      2,869.0      2,791.3      2,901.2      3,022.1      3,081.3      3,134.9      3,171.3
Vehicle theft        502.2        474.7        458.9        430.8        437.1        462.0        509.8        529.5        582.9        630.4

Source: Federal Bureau of Investigation Uniform Crime Reports and the Disaster Center.
These trends were largely the result of a national drug policy that focused far more on inner-city crimes than on similar crimes taking place in the suburbs.

Some other trends in crime in the United States during the 1980's were as follows: Young African American males were more likely than not to be victims of crime overall, and young people in general, regardless of race, experienced high rates of crime. Men committed crimes at higher rates than did women, while women, particularly older white females, were more likely than men to be victims. Urban areas were made safer over the course of the decade; however, it remained the case that cities experienced higher crime rates than did rural and suburban areas.

Crime in Canada
Canadian crime statistics trended largely in the opposite direction of U.S. statistics during the decade, as criminal activity rose steadily. When compared to Europeans, Canadians were more likely to be victims of serious crimes, including assault, sexual offenses, burglary, and robbery. These statistics alarmed experts; as a result, the 1980’s became a decade as pivotal in Canadian criminal and legal history as it was in U.S. history. One alarming trend was that a large percentage of crimes went unreported to policing agencies such
as the Royal Canadian Mounted Police (RCMP). For instance, only 10 percent of sexual assaults, 32 percent of other assaults, and 50 percent of property-related crimes were reported. Meanwhile, of sexual assault victims (primarily women), 31 percent were attacked by someone they knew, such as a relative, a neighbor, or an acquaintance. Experts pointed to the likelihood that crimes such as sexual offenses were not reported to the police precisely because victims knew their attackers. By the end of the decade, both provincial and federal governments had begun to spend tremendous amounts of money to combat crime and to enact statutes and crime-reduction policies that would help Canadians feel safer.

During the 1980's, Canada began to incorporate a philosophy known as "restorative justice" into its public policy. Restorative justice seeks to restore the relationships harmed by a crime and to heal the damages caused by that crime—damages to victims, to communities, and to the offenders themselves. In addition, preventive measures were adopted to keep the most likely offenders from committing crimes, and efforts were made to rehabilitate criminals, restoring them to an acceptable role in society.

Young people were particularly targeted for these preventive and rehabilitative efforts, because criminal activity among youthful Canadians had increased in the previous decade. In 1984, therefore, Parliament
passed the Young Offenders Act, which instituted different treatment for criminals between the ages of twelve and eighteen. Prior to the act's passage, young offenders in Canada were treated just as adults were and were subjected to the same penalties. The Young Offenders Act instituted a policy of fashioning age-appropriate techniques of rehabilitation.

The decision to tailor correctional techniques to accomplish restorative, rather than punitive, aims exemplified the philosophy guiding Canadian criminal justice during the 1980's and beyond: The approach to crime prevention in Canada increasingly combined traditional focuses on rehabilitation and retribution with ideas reflecting the principles of restoration.

This shift in Canadian justice was influenced by a steady increase in Canada's prison population. Unlike the United States, where the prison population grew as a result of longer sentences, Canada's prisons were crowded by an increase in the number of offenders. The nation sought to address this problem in part by instituting a restorative philosophy that asserted that offenders need not be isolated from the rest of society, either for justice to be achieved or for the members of society to feel safe. Restorative justice was thus well suited to the need to reduce crime while simultaneously reducing the prison population.

These needs were also addressed by reforms in sentencing practices. In 1987, the Canadian Sentencing Commission produced an influential report outlining judicial processes in need of reform and focusing particularly on the sentencing process. The commission emphasized that sentencing policies represented a statement to society, asserting that lawbreakers would be held accountable. Thus, a strong sentencing policy would simultaneously reassure citizens that they were safe and deter potential offenders from committing crimes. Reforms were enacted to achieve these goals, but they also incorporated restorative principles, which sought to ensure that the particulars of a given case would be scrutinized so that the punishment would always fit the crime.

The prisons themselves had to respond to emerging trends during the 1980's as well. For example, an increasing proportion of the prison population was human immunodeficiency virus (HIV) positive, so health care facilities capable of treating prisoners with acquired immunodeficiency syndrome (AIDS) had to be constructed. Meanwhile, in a trend similar
to the increasing overrepresentation of African Americans in the U.S. prison population, Canadian aboriginal peoples made up a disproportionate percentage of their nation's inmates. Like African Americans, aboriginal people were no more likely to commit crimes than were other demographic groups, but they were significantly more likely to be arrested, convicted, and sentenced for the crimes they did commit. Canadian policy makers, however, were more sensitive to this situation than were their American counterparts. They sought throughout the decade to reverse the increase in aboriginal prisoners, albeit with little success.

Impact
Each nation’s approach to crime had strengths and weaknesses. The United States succeeded in reducing recidivism, but only by creating the third-largest prison population per capita in the world, after the Soviet Union and apartheid-era South Africa. (With the fall of those two governments in the 1990’s, the United States would come to have the highest per capita prison population in the world.) The nation effectively reduced the symptom, without addressing the underlying causes of crime—causes such as poverty and racism that were arguably exacerbated by the American punitive approach to justice. Canada, for its part, attempted a more holistic approach to crime that sought to take account of race and class, among other factors. This approach had its own pitfalls, however, as, for example, some men accused of violence against women sought to justify their behavior on the grounds that it was appropriate to their particular subculture. It also tended to conflate cultural identity with racial identity in ways that proved counterproductive, encouraging stereotypes that portrayed some races as more likely to commit crimes than others.
Further Reading
Anand, Sanjeev. "The Sentencing of Aboriginal Offenders, Continued Confusion and Persisting Problems: A Comment on the Decision." The Canadian Journal of Criminology 42 (July, 2000): 412-419. Looks at aboriginal peoples in the Canadian criminal justice system.

Cayley, David. The Expanding Prison: The Crisis in Crime and Punishment and the Search for Alternatives. Toronto: House of Anansi Press, 1998. Critical examination of traditionalist approaches to prison reform.
Doob, Anthony. "Transforming the Punishment Environment: Understanding the Public Views of What Should Be Accomplished at Punishment." The Canadian Journal of Criminology 42 (July, 2000): 323-340. Analysis of public perceptions of preventive and restorative policy initiatives.

Levitt, Steven D. "The Effect of Prison Population Size on Crime Rates: Evidence from Prison Overcrowding Litigation." Quarterly Journal of Economics, May, 1996, 319-351. Evaluates statistics on prison overcrowding in the United States to determine its effects.

Phillips, Llad. "The Criminal Justice System: Its Technology and Inefficiencies." Journal of Legal Studies 10 (June, 1981): 363-380. A scathing look at the flaws of the justice system in the United States.

Ramcharan, Subhas, Willem de Lint, and Thomas Fleming. The Canadian Criminal Justice System. Toronto: Prentice Hall, 2001. Explores all elements of Canada's justice processes, from the courts to the police. Highlights flaws by suggesting that many processes, such as a reliance on precedent and procedural justice, are outmoded.

Roach, Kent. "Changing Punishment at the Turn of the Century." The Canadian Journal of Criminology 42 (July, 2000): 249-280. Appraisal of Canadian progress toward instituting an effectively restorative and rehabilitative system of justice.

Roberts, Julian V. "Racism and the Collection of Statistics Relating to Race and Ethnicity." In Crimes of Colour: Racialization and the Criminal Justice System in Canada, edited by Wendy Chan and Kiran Mirchandani. Peterborough, Ont.: Broadview Press, 2002. This chapter evaluates the roles that race plays in the justice system.

Roberts, Julian V., and Andrew Von Hirsch. "Sentencing Trends and Sentencing Disparity." In Making Sense of Sentencing, edited by Julian V. Roberts and David P. Cole. Toronto: University of Toronto Press, 1999. Analyzes trends in sentencing in the wake of reforms to determine whether those reforms have been effective.

Verdun-Jones, Simon N., and Curt T. Griffiths. Canadian Criminal Justice. 2d ed. Toronto: Harcourt Brace, 1994. Broad overview of the system of justice in Canada, analyzing the positives and negatives of its implementation, as well as the abstract philosophical considerations that underlie it.

Esmorie J. Miller
See also
Aboriginal rights in Canada; Abscam; African Americans; Atlanta child murders; Central Park jogger case; Domestic violence; Rape; Reagan, Ronald.
■ Cruise, Tom

Identification American actor
Born July 3, 1962; Syracuse, New York
A talented and extremely charismatic movie actor, Cruise became a superstar in the 1980's.

Born Thomas Cruise Mapother IV, Tom Cruise discovered his love of acting during his senior year in high school, when he was cast in the school production of the musical Guys and Dolls (pr. 1950). Following graduation and with no interest in attending college, Cruise moved to New York City to pursue a professional acting career. In 1981, the young actor was hired to play a small role in the movie Endless Love, starring Brooke Shields and Martin Hewitt. That same year, he also earned a role in Taps, starring Timothy Hutton. Two years later, Cruise starred in the comedy Losin' It (1983) and also played a supporting role in Francis Ford Coppola's The Outsiders (1983).

Cruise's big break came later that same year, when he took on the leading role in another teen film, Risky Business (1983). Enormously popular at the box office, the movie earned more than $56 million in its first eleven weeks in theaters and launched Cruise on the road to superstardom. In 1983, he also starred in All the Right Moves, which, although not as financially successful as his other movies, nonetheless earned the young actor critical acclaim for his portrayal of the movie's main character, Stef Djordevic.

In 1986, Cruise starred in the tremendously successful film Top Gun, further elevating the young actor to superstar status. The movie, about an American fighter pilot, became that year's highest-grossing film at the box office. Also in 1986, Cruise starred in The Color of Money with Paul Newman, a sequel to Newman's earlier The Hustler (1961). In 1988, Cruise starred in the vastly successful movie Rain Man with Dustin Hoffman. That film received eight Academy Award nominations and won four awards, including Best Picture and Best Actor for Hoffman. Cruise ended the decade starring in Oliver Stone's Born on the Fourth of July (1989). The critically acclaimed film earned Cruise his first Academy Award nomination for Best Actor and also served to highlight his talents as a serious actor.

Tom Cruise receives a star on the Hollywood Walk of Fame in 1986. (AP/Wide World Photos)

Impact After paying his dues early in the decade, Tom Cruise exploded onto the screen during the mid-1980's. As a hardworking young actor, he took on roles in twelve different movies during the decade. Many of these films became huge box-office successes and built for him an enormous fan base. In addition to making him a superstar, Cruise's acting also earned him a place in American cinematic and cultural history. His performance in Risky Business, especially the scene in which his character dances and lip-syncs to Bob Seger's "Old Time Rock and Roll," became ensconced in American popular culture and memories of the 1980's.

Further Reading
Johnstone, Iain. Tom Cruise. London: Hodder & Stoughton, 2007.

Prince, Stephen. American Cinema of the 1980's: Themes and Variations. Piscataway, N.J.: Rutgers University Press, 2007.

Sellers, Robert. Tom Cruise: A Biography. London: Robert Hale, 1997.

Bernadette Zbicki Heiney

See also
Academy Awards; Film in the United States; Hoffman, Dustin; Teen films.
■ Cyberpunk literature

Definition Science-fiction subgenre dealing with computer-dominated future societies

Cyberpunk responded to the development of computer networks by imagining worlds in which they were pervasive and defined human relationships. Works in the genre anticipated the development of the Internet, the World Wide Web, and virtual-reality gaming.
The principal texts that gave rise to and exemplified cyberpunk literature were a series of stories by William Gibson begun with "Johnny Mnemonic" (1981) and culminating with the novel Neuromancer (1984); the earlier stories were collected in Burning Chrome (1986). The characters in Gibson's stories are able to project themselves into the virtual "cyberspace" contained in a worldwide computer network by "jacking in" through their personal computers (PCs). The countercultural values and slick, picaresque story lines of Gibson's work inspired a cyberpunk movement, named by Gardner R. Dozois, loudly advertised by Bruce Sterling's fanzine Cheap Truth, and widely popularized by Sterling's best-selling anthology Mirrorshades (1986).

Other key texts of cyberpunk fiction included Sterling's Islands in the Net (1988) and his Shaper/Mechanist series, which was launched in 1982 and culminated in the novel Schismatrix (1985); Rudy Rucker's Software (1982) and Wetware (1988); Gibson's Count Zero (1986) and Mona Lisa Overdrive (1988); Michael Swanwick's Vacuum Flowers (1987); and some of the short stories in John Shirley's Heatseeker (1988) and Pat Cadigan's Patterns (1989).
Virtual spaces contained within computers had previously been explored in a number of sciencefiction novels from the mid-1960’s onward, and many aspects of Gibson’s scenario had been anticipated by Vernor Vinge’s True Names (1981), but Gibson’s notion of cyberspace acquired an iconic significance. (An alternative term used in Neuromancer but not original to it—“the matrix”—subsequently acquired
a similar charisma by virtue of its use in cinema.) Neuromancer was published shortly after the National Science Foundation's academically oriented network CSNET (founded in 1980) was connected to the Defense Department's ARPANET with the aid of the Transmission-Control Protocol/Internet Protocol (TCP/IP) and just before the establishment of the NSFNet in 1985 laid down the backbone of the Internet, which would become the host of all the sites forming the World Wide Web.

■ Selected Cyberpunk Fiction of the 1980's

Author                    Title                                  Type of Book            Date Published
Greg Bear                 Blood Music                            Novel                   1985
David Jay Brown           Brainchild                             Novel                   1988
Pat Cadigan               Mindplayers                            Novel                   1987
                          Patterns                               Short stories           1989
William Gibson            Neuromancer                            Novel                   1984
                          Count Zero                             Novel                   1986
                          Burning Chrome                         Short stories           1986
                          Mona Lisa Overdrive                    Novel                   1988
Marc Laidlaw              Dad's Nuke                             Novel                   1985
Rudy Rucker               Software                               Novel                   1982
                          Wetware                                Novel                   1988
Lewis Shiner              Frontera                               Novel                   1984
John Shirley              City Come a-Walkin'                    Novel                   1980
                          Three-Ring Psychus                     Novel                   1980
                          Eclipse                                Novel                   1985
                          Eclipse Penumbra                       Novel                   1988
                          Heatseeker                             Short stories           1988
Bruce Sterling            The Artificial Kid                     Novel                   1980
                          Schismatrix                            Novel                   1985
                          Islands in the Net                     Novel                   1988
                          Crystal Express                        Short stories           1989
Bruce Sterling, editor    Mirrorshades: A Cyberspace Anthology   Short-story collection  1986
Michael Swanwick          Vacuum Flowers                         Novel                   1987
Vernor Vinge              True Names . . . and Other Dangers     Short stories           1982
                          The Peace War                          Novel                   1984
                          Marooned in Realtime                   Novel                   1986
                          Threats . . . and Other Promises       Short stories           1989

Neuromancer's timeliness enabled it to capture the imagination of the engineers and users developing such systems, who were already forming the nucleus of a new "cyberculture." Many of the enterprising young hobbyists recruited by the companies blossoming in California's Silicon Valley were enthusiastic to conceive of themselves as ultra-cool innovative nonconformists; cyberpunk literature gave them a label to apply to this ideal and a definitive set of hero myths—or, more accurately, antihero myths. In cyberpunk literature, cyberspace became a new frontier to replace the Wild West, one whose particular type of lawlessness would work to the advantage of nerds instead of gunslingers. Such fiction offered a new kind of escapist imagery in which the obsolete mythology of the seemingly abortive space age was replaced by a nascent mythology involving the use of technology to achieve a transcendent breakthrough to freedom from the burdens of the flesh. The "uploading" of minds from the brain's "wetware" to a much vaster and more durable silicon matrix quickly became the holy grail of a "posthumanist" or "transhumanist" movement founded in the late 1980's by such propagandists as F. M. Esfandiary (also known as FM-2030) and Max More
(Max T. O'Connor). Meanwhile, cyberpunk fiction often described artificial intelligences (AIs) that were native to cyberspace and that strove with disembodied humans for control of the virtual universe.

Gibson called cyberspace a "consensual hallucination," but the notion acquired a greater authority as computer software became better able to produce visible models of three-dimensional space incorporating sophisticated "virtual realities." The progress of this kind of software and the hardware for displaying its results was most conspicuously seen during the 1980's in the development of video games played on PCs and on coin-operated arcade machines.

As a manifest movement, cyberpunk did not outlast the 1980's—Bruce Sterling boasted that the term was "obsolete before it was coined"—but the label found continued life beyond its early enthusiasts. Marketing departments continued to apply the term to books that did not necessarily warrant it, and postmodernist critics adopted it as well to discuss important trends in American culture, especially the burgeoning confusion of the meaning of the word "real." Moreover, the subgenre inspired several other related subgenres, notably steampunk—in which stories are often set in an alternate, technologically advanced version of Victorian England.

Impact Cyberpunk's iconic motifs were so closely pursued by actual developments in computer technology that they soon lost their capacity to inspire awe. However, the fantasy of a physically or spiritually accessible cyberspace, a virtual reality to which humans could travel, remained a powerful part of the cultural imaginary. While cyberpunk fiction was not solely responsible for this idea, it did shape the popular understanding of the atmosphere and nature of cyberspace worlds. Thus, even as it became outdated, cyberpunk continued to influence the development of both science fiction and popular ideas about technology, especially about interactions between technology and humans.
Further Reading
Butler, Andrew M. The Pocket Essential Cyberpunk. Harpenden, Middlesex, England: Pocket Essentials, 2000. Succinct overview of the literature and its authors.

Cavallaro, Dani. Cyberpunk and Cyberculture: Science Fiction and the Work of William Gibson. London: Athlone Press, 2000. A rather slapdash retrospective analysis of the movement and Gibson's central role within it.

Featherstone, Mike, and Roger Burrows, eds. Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment. Thousand Oaks, Calif.: Sage, 1995. Collection of fourteen essays—first featured in a special issue of the journal Body and Society—discussing the cultural impact of cyberpunk ideas and imagery.

Heuser, Sabine. Virtual Geographies: Cyberpunk at the Intersection of the Postmodern and Science Fiction. Amsterdam: Rodopi, 2002. A more careful retrospective study than Cavallaro's, viewing cyberpunk as a quintessentially postmodern phenomenon; includes a timeline of cyberpunk history and a good bibliography.

McCaffery, Larry, ed. Storming the Reality Studio: A Casebook of Cyberpunk and Postmodern Science Fiction. Durham, N.C.: Duke University Press, 1991. A showcase anthology whose compendium of literary examples casts a wider net than usual; its twenty items of miscellaneous nonfiction include Timothy Leary's "The Cyberpunk: The Individual as Reality Pilot."

Sterling, Bruce. Preface to Mirrorshades: The Cyberpunk Anthology. New York: Arbor House, 1986. The combative "manifesto" of the cyberpunk movement, employed as a preface to the subgenre's primary showcase; also reprinted in McCaffery's anthology.

Brian Stableford

See also Book publishing; Computers; Gibson, William; Literature in the United States; Video games and arcades; Virtual reality.
D

■ Dallas

Identification Prime-time television soap opera
Date Aired April 2, 1978, to May 3, 1991
Dallas was the first prime-time soap opera aired in the United States. A groundbreaking venture, it took the traditionally housewife-targeted serial format and attempted to broaden its appeal for prime-time audiences by incorporating more explicit sexuality and nominally male-oriented themes. The concept for Dallas, developed by writer David Jacobs, was originally based on a situation reminiscent of William Shakespeare’s Romeo and Juliet (pr. c. 1595-1596) but set in contemporary Texas: The married couple Bobby Ewing and Pam Barnes would be caught in the middle of a conflict between their two warring families. The Columbia Broadcasting System (CBS) had commissioned a script, determined a one-hour pilot would not “show well,” and ordered five episodes. Dallas was thus initially planned as a miniseries, yet Jacobs hoped it would be popular enough for the network to turn it into a regular weekly show. To that end, actors were hired who were not “big names” and would therefore be available if the show continued. Patrick Duffy, playing Bobby Ewing, was the biggest star hired. Linda Gray and Victoria Principal beat out the competition for their roles, Sue Ellen Shepard Ewing and Pamela Barnes Ewing, respectively. The producers first discussed the part of J. R. Ewing with Robert Foxworth, but they loved Larry Hagman’s enthusiasm for the character and his “wicked” little smile. The Show Premieres The miniseries premiered on Sunday, April 2, 1978, and it aired on consecutive Sunday nights through the end of April. Critics panned the show, but viewers responded to the focus on family dynamics mixed with greed, glamour, intrigue, and romance. By the final episode, Dallas ranked in the top ten of the week’s most watched shows, and CBS ordered thirteen more scripts for the fall.
Dallas became one of the most successful television dramas in history and attracted a worldwide audience. The audience relished the conflict between the "good" brother Bobby and the "evil" brother J. R. They admired Miss Ellie, who, like a traditional mother, was the moral center for the Ewing family, and sympathized with the sorrows of Sue Ellen. However, it was the actions of villain J. R. that kept most audience members watching. Hagman stated, "My character is the evil focal point of the show." A power-hungry and amoral oil executive, J. R. was obsessed with Ewing Oil, a company founded by his father and namesake. Any threat either to the company itself or to his control of it brought out the worst in J. R., who became the man people loved to hate.

Cliffhangers Dallas was best known for its season-ending cliffhangers, which caused fans to spend the summer hiatus in suspense, wondering how interrupted plotlines would be resolved. The most famous of these cliffhangers gave rise to a catchphrase that became part of American culture: "Who shot J. R.?" Because of his actions, J. R. had accumulated a number of enemies in the first two years of the show. On March 21, 1980, in the second season's final episode, J. R., working late in his office at Ewing Oil, was shot twice by an unknown assailant. This became arguably the greatest cliffhanger of all time.

Worldwide that summer, everyone seemed to be asking "Who shot J. R.?" The answer was a closely held secret. Hagman was the only cast member who knew, and he was offered $250,000 by a consortium of European newspapers for the answer. Even the queen mother of England asked Hagman to tell, but he would not do so. On November 21, 1980, 300 million people in fifty-seven countries tuned in to discover who shot J. R. Streets emptied as people gathered in front of television sets. In Turkey, the parliament recessed so that representatives could find out who had attempted to murder J. R. (The answer was Kristen, his sister-in-law and mistress.)
Even after the show's overall ratings began to slip, cliffhanger episodes in the spring and their resolution in the fall were able temporarily to boost the program back into the top ten. Another famous cliffhanger occurred in the May 16, 1986, season finale, when Pam Ewing woke up and found her husband Bobby—who had been killed at the end of the previous season and buried at the beginning of the 1985-1986 season—standing in the shower. He said "Good morning," and the season ended, leaving viewers to wonder how Bobby could be alive, or if Duffy was even playing the same character. Explanations for the actor's return were easy to surmise: Bobby had been killed off, because Duffy had wanted to leave the show. However, in the following season, Dallas's ratings had fallen, as the central conflict between the two brothers had been replaced by situations involving international intrigue, which viewers found less interesting. Meanwhile, Duffy's attempt at a career outside of Dallas had not been particularly successful.

Hagman's personal appeal and the producers' offer of a huge salary increase convinced Duffy to return. The question remained, however: How could Duffy's character, Bobby, plausibly be brought back to life? The writers developed three different solutions to the problem. To mislead the cast, the crew, and reporters, three alternative resolutions were filmed. In a notorious and later parodied turn of events, the chosen solution was that the entire previous season, one year's worth of episodes, had been Pam's bad dream. Characters that had been introduced were discarded, situations were erased, dead characters were brought back to life, and the series picked up again from the end of the 1984-1985 season.

The cast of Dallas poses in this publicity shot from 1980. Clockwise from left: Patrick Duffy, Victoria Principal, Jim Davis, Larry Hagman, Linda Gray, Charlene Tilton, and Barbara Bel Geddes. (Hulton Archive/Getty Images)

Impact Dallas became the most popular television program in the world and an icon of popular culture. More than ninety countries reported empty streets during the hour Dallas aired. Viewers became involved and were often consumed by characters and their activities. People wore T-shirts displaying J. R.'s face, some stating "I hate J. R.," others, "I love J. R." Strangers found a common language, discussing episodes of Dallas. Scholars based articles and dissertations on Dallas; it became one of the most studied texts in the history of television.

In many ways, the show echoed the excesses of the 1980's, not only through its location in Texas, but also in its focus on the rich. Some described the show as a caricature of the 1980's decadence with its focus on conspicuous consumption, greed, and J. R.'s credo: "It's not what you get that matters; it's what you can get away with." As the luster of the era faded, so did the popularity of the show. As Hagman noted, "Dallas died with the Reagan era."

Subsequent Events After 356 episodes, Dallas ended on May 3, 1991, in a two-hour conclusion that showed J. R. losing everything and contemplating suicide. Two made-for-television movies, Dallas: J. R. Returns (1996) and Dallas: The War of the Ewings (1998), followed.

Further Reading

Geraghty, Christine. Women and Soap Opera: A Study of Prime Time Soaps. Cambridge, Mass.: B. Blackwell, 1991. Discusses the patriarchal elements of the
program, particularly the influence of J. R. and his relationships with women.

Hagman, Larry. "Hats Off to Ten Years of Dallas." People Weekly, April 4, 1988, 98ff. Hagman reflects on his years playing J. R., as well as on other characters and certain episodes of the show.

Kalter, Suzy. The Complete Book of "Dallas": Behind the Scenes at the World's Favorite Television Program. New York: Abrams, 1986. Includes background material concerning the series, plot summaries from the initial miniseries through the 1985-1986 season, and many photographs.

Liebes, Tamar, and Elihu Katz. The Export of Meaning: Cross-Cultural Readings of "Dallas." 2d ed. Cambridge, England: Polity Press, 1993. Examination of the reception of Dallas in countries other than the United States and of the meanings attached to American culture when it circulates beyond U.S. borders.

Marcia B. Dinneen

See also
Business and the economy in the United States; Dynasty; Reagan, Ronald; Soap operas; Television; Wall Street.
■ Dance, popular Definition
Forms of dance made popular by films, music, and other cultural elements
During the 1980’s, rapidly changing urban culture led to new forms of popular dance, as the creations of that culture spread throughout American society. Visual media aided significantly in the dissemination of these new types of dance. During the 1980’s, several of the most popular mainstream dances originated on the margins of society, including break dancing, slam dancing, the lambada, and vogue dancing. As a result of the decline of disco, funk, and rock and the changing urban culture within the United States, by 1980 hip-hop had gained recognition as a music genre and had begun its entrance into mainstream American culture. The middle of the decade saw a flourishing of the first hip-hop artists to achieve financial success. Hip-hop culture originated among the African American and Hispanic youth of New York City, and it encompassed music, dance, and fashion. The most prominent hip-hop dance was break dancing, which was
■
271
often learned and performed in everyday spaces instead of dance studios or schools. Indeed, break dancing was used as a nonviolent means of settling gangs’ territorial disputes. A particularly acrobatic form of dance, it often involved touching the ground with one’s head or back. Dancers performed on public streets wearing clothes made of slick materials to enable sliding and hooded jackets or bandanas to perform head spins and windmills. They often used large sheets of cardboard in lieu of dance floors and danced to music played on a nearby boombox. While break dancing was performed to recorded music, slam dancing was usually performed to live music. The new dance form, also referred to as moshing, reportedly began in Los Angeles in the early 1980’s. Slam dancers jumped aggressively and slammed into one another in a mosh pit to the beat of heavy metal or punk music. Dancing to different forms of music would result in different methods of moshing. For instance, moshing to hardcore punk music was faster and to an extent more strategized or arranged, while moshing to the more popular metal music was less choreographed and often took place in a much larger pit. Criticized as violent and dangerous, the dance was also recognized positively for its influence in helping dancers form communal associations. Skank, another form of slam dancing, also emerged during the 1980’s and was performed to ska music, which originated in Jamaica. In addition, the 1980’s saw the emergence of the lambada, which originated in Brazil, evolving from such other dances as the forró, sayas, maxixe, and carimbó. The lambada became associated with the idea of “dirty dancing,” because partners swayed quickly with their hips in close contact. In 1987, the film Dirty Dancing, starring Patrick Swayze and Jennifer Grey, greatly enhanced the popularity of the lambada. Influence of Visual Media Motion pictures and the new format of music videos both greatly facilitated the dissemination of the new 1980’s dance forms, first throughout America and then around the rest of the world. In 1981, cable television channel MTV began broadcasting music videos twenty-four hours a day. By the mid-1980’s, music videos played a central role in popular music marketing, and they came to reach a sizable, mass audience. When videos featuring break dancing, slam dancing, and the lambada were broadcast, millions of viewers encoun-
272
■
The Eighties in America
tered these dances for the first time. Performers such as Michael Jackson, the "King of Pop," greatly advanced the art of dance in the 1980's by popularizing break dancing styles in music videos. Jackson was particularly known for the innovative choreography of his music videos. His video for "Thriller"—the title track of the best-selling album of all time, with worldwide sales exceeding 104 million—was fourteen minutes long and contained a remarkable dance sequence. In 1983, while performing "Billie Jean" at the "Motown Twenty-Five: Yesterday, Today, Forever" concert, Jackson debuted the "moonwalk," which came to be regarded as his signature move. Teens instantly began to emulate his break dancing moves.

Madonna, a professionally trained dancer, also used the music video form and her strong ability as a dancer to advance her career. Credit is given to her for popularizing voguing, a dance form developed in New York's underground drag-house subculture, also known as ball culture. Many of the dancers Madonna employed on her tours were gay men who were familiar with ball culture and helped introduce her to it. Voguing was a form of competition through dance that utilized the struts and poses of fashion models.

In addition, several films featuring dance achieved popularity during the decade. These included Fame (1980), Wild Style (1982), Flashdance (1983), Footloose (1984), and Dirty Dancing (1987). The dance styles portrayed in these films both arose out of and fed back into popular dance movements of the 1980's.

Impact Even though many trends and fads of the 1980's later fell out of favor, the decade's popular dances continued to develop, changed over time, and greatly influenced popular culture into the twenty-first century. These trends had themselves developed largely through the mainstream American appropriation of dances from other cultures, such as the lambada, or dances created by marginalized subcultures, such as break dancing and voguing. Notably, both of the latter forms originated as mediums of competition rather than simple performance. Slam dancing also arose as a subcultural means of expression, a form of dance that could be assayed only by those willing to brave a mosh pit. It too gained a bit more mainstream acceptance, as a toned-down version of slam dancing spread to many concert venues and other performance spaces.
Further Reading
Desmond, Jane. Meaning in Motion: New Cultural Studies of Dance. Durham, N.C.: Duke University Press, 1997. Features a variety of essays by dance experts that explore the cultural significance of dance. Includes dances pertinent to the 1980's, such as the lambada and hip-hop forms.

Deyhle, Donna. "Break Dancing and Breaking Out: Anglos, Utes, and Navajos in a Border Reservation High School." Anthropology and Education Quarterly 17 (June, 1986): 111-127. Scholarly but accessible article that sheds light on how various groups of high school children utilize break dancing to create group identity and achieve success.

Driver, Ian. A Century of Dance. London: Cooper Square Press, 2001. Each of the ten chapters considers a decade of dance. The final chapter, "Street Style," features hip-hop and break dancing, Madonna, and moonwalking. Heavily illustrated.

George, Nelson. Hip-Hop America. New York: Penguin Books, 2005. While this book tells the story of rap both as art form and as cultural and economic force, it also deals with the beginnings and development of break dancing in the early 1980's.

M. Casey Diana

See also
Boom boxes; Break dancing; Cable television; Fads; Film in the United States; Flashdance; Heavy metal; Hip-hop and rap; Jackson, Michael; Leg warmers; Madonna; MTV; Music; Music videos; Rock and Roll Hall of Fame; Teen films; Television; World music.
■ Davies, Robertson Identification Canadian novelist Born August 28, 1913; Thamesville, Ontario Died December 2, 1995; Toronto, Ontario
Davies’ novels deployed magic, myth, and the supernatural to free his work from what he saw as a spiritually repressive modernity. Robertson Davies had been writing for many decades before the 1980’s, but that decade saw him achieve international renown. Although his Deptford Trilogy— Fifth Business (1970), The Manticore (1972), and World of Wonders (1975)—was published in the 1970’s, it only gained a large international readership in the
The Eighties in America
Davies, Robertson
■
273
following decade. These novels appealed to readers because of their ingenious, old-fashioned storytelling, but also because of Davies’ serious treatment of magic, mysticism, and the psychological theories of Carl Jung. Davies’ use of Jungian thought was in some quarters considered retrogressive and politically incorrect; his popularity may therefore suggest the growing conservatism associated with the 1980’s, as well as a continuation of the interest in alternative spiritualities associated with the previous decade. Davies’ first book of the 1980’s, The Rebel Angels (1981), received wide acclaim and began a new trilogy known as the Cornish Trilogy. Set in a college of the University of Toronto that resembled Davies’ own Massey College, the novel featured a host of professors, mainly professors of reliRobertson Davies in 1984. (Library and Archives Canada) gion, all with eccentric names and traits. In this novel, Father Simon Darcourt and Maria Theotoky (the surname is a version of the gels, and guardian spirits. With his flowing white Greek Orthodox term for the Virgin Mary) investibeard and magisterial cane, Davies was in great degate the late and mysterious Francis Cornish, a wellmand as a lecturer in the 1980’s, fascinating even known art connoisseur. What’s Bred in the Bone those who had not read his books with his dry wit, er(1985), the second book of the trilogy, was a prequel udition, and theatrical personality, which was a curithat related Cornish’s earlier life and the charming ous blend of medieval necromancer and Victorian chicanery by which he made his way in the world. sage. The concluding volume, The Lyre of Orpheus (1988), portrayed Cornish’s legacy in local theater, thereby Impact Davies’ popularity in the 1980’s spoke to a referencing Davies’ own significant experience as strong interest in spirituality and mysticism. Apprean actor and playwright in Canada. The latter two ciation of his celebration of eccentricity, originality, books of the trilogy also received prominent critical and individuality and fascination with his emphasis notice. What’s Bred in the Bone was nominated for the on the supernatural seemed to correspond with a 1986 Booker Prize, Britain’s most famous literary decrease in secular progressive politics and an inaward. crease in conservatism, although many social conDavies’ exuberant persona and the strange charservatives ironically disapproved of such mystical litacters and Gothic doings in his novels helped reinerature. vent Canadian literary culture as far more lively and unconventional than it had been previously. It was Further Reading Davies’ belief that his novels would also help reintroGrant, Judith Skelton. Robertson Davies: Man of Myth. duce to all of North America an acceptance of magic Toronto: Viking Press, 1994. and mysticism, including the reality of demons, anLa Bossière, Camille R., and Linda M. Morra. Robert-
274
■
Day After, The
son Davies: A Mingling of Contrarieties. Ottawa, Ont.: University of Ottawa Press, 2001.

Little, Dave. Catching the Wind in a Net: The Religious Vision of Robertson Davies. Toronto: ECW Press, 2006.

Margaret Boe Birns

See also
Literature in Canada; Literature in the United States; Richler, Mordecai.
■ Day After, The Identification Made-for-television movie Date Aired on November 20, 1983
The Day After graphically depicted the horrors of nuclear war and, at least implicitly, criticized the mutual assured destruction theory upon which the United States’ security policy depended in the early 1980’s. It was a widely seen and discussed example of a common subgenre in 1980’s American culture, a culture increasingly dominated by fear of nuclear holocaust. The apocalyptic danger of nuclear war had long been a subject of films and television shows before The Day After was first broadcast to approximately 100 million American viewers in 1983. In that year, the United States was again in the midst of an arms race with the Soviet Union, increasing public anxiety and making the broadcast seem particularly relevant. Earlier treatments, moreover, had tended to be low budget (for example, the 1963 Twilight Zone episode “The Old Man in the Cave”) or had focused on the events leading up to such a war, rather than the aftermath. The major exception to this, the powerful 1959 presentation of the end of the world in On the Beach, had treated the topic almost clinically, featuring the unscathed but doomed Australians waiting for the radioactive cloud to reach and kill them too. The Day After, by contrast, starkly depicted the ugly brutality of nuclear destruction, although it also—as its producers acknowledged—understated the case, in part by portraying the United States being hit by significantly fewer missiles than would actually occur in a Soviet first strike. Still, the film painted a grisly enough picture a third of the way into its duration, when the pastoral life around Lawrence, Kansas, was irrevocably destroyed by nuclear missiles targeting
The Eighties in America
the missile silos located there. By then, the film’s producers had already taken their audience through the vocabulary of the age: “launch on warning” warfare, stage two alerts, and other prevailing concepts of nuclear warfare. Most interesting, the American Broadcasting Company (ABC) followed the film by airing a candid discussion of the dangers of the era, featuring leading public figures, scientists, and commentators. Over time, its message spread. By 1987, The Day After had been shown in more than forty countries abroad, including Mikhail Gorbachev’s liberalizing Soviet Union. Impact The airing and subsequent widespread distribution of the film was highly successful in achieving one of its aims: It brought about widespread national and international discussion of the possible consequences of nuclear warfare. Ironically, however, that discussion was not entirely of a pacifist nature. Its scenario of Soviet aggression as the cause of nuclear annihilation also bolstered President Ronald Reagan’s argument at the time that the United States could not afford to rely on arms parity with the Soviet Union but needed demonstrable arms superiority for the mutual assured destruction (MAD) system to succeed. The arms reduction agreements and accompanying discussions by Soviet and American leaders of means of averting catastrophe that the film’s producers had deemed necessary came only later, when the Soviet Union itself began to implode peacefully near the end of the 1980’s and disintegrated into its constituent parts in the early years of the following decade. Further Reading
Gregg, Robert W. International Relations in Film. Boulder, Colo.: Lynne Rienner, 1998. Jordan, Chris. Movies and the Reagan Presidency: Success and Ethics. Westport: Conn.: Praeger, 2003. Lipschutz, Ronnie D. Cold War Fantasies: Film, Fiction, and Foreign Policy. Lanham, Md.: Rowman & Littlefield, 2001. Strada, Michael, and Harold Trope. Friend or Foe? Russians in American Film and Foreign Policy. Lanham, Md.: Scarecrow Press, 1997. Joseph R. Rudolph, Jr. See also Cold War; Film in the United States; Reagan, Ronald; Science-fiction films; Television; Terminator, The.
The Eighties in America
■ Decker, Mary

Identification American middle-distance runner
Born August 4, 1958; Bunnvale, New Jersey
The only American runner—male or female—to hold the U.S. records in all middle-distance running events simultaneously, Decker dominated women's track in the 1980's.

Exploding into the international track-and-field scene in her early teens and quickly setting several world records, "Little Mary Decker" by the early 1980's had secured her place as one of the United States' most charismatic and successful track-and-field figures, known for her tiny frame (she was barely ninety pounds) and her signature pigtails, as
At the 1984 Summer Olympics, Mary Decker (right) is passed by Zola Budd in the 3,000-meter race shortly before the two collided. Decker was unable to finish the race. (AP/Wide World Photos)
well as for her fierce competitiveness. Decker set six world records in 1982 alone, and the following year she won both the 1,500-meter and the 3,000-meter races at the 1983 world track championships in Helsinki, an unprecedented achievement for which she was named Sports Illustrated's Sportsperson of the Year, as well as receiving both the Amateur Athletic Union (AAU) James E. Sullivan Award for outstanding amateur athlete and the Jesse Owens Track and Field Award. She was the first woman thus honored.

Expected to win handily at the 1984 Summer Olympics in front of an enthusiastic American crowd in Los Angeles, Decker collided with the South African-born runner Zola Budd in the home stretch of the grueling 3,000-meter run and tumbled into the infield, injuring her hip. She could not finish the race. Although she blamed Budd for crowding her in a clumsy attempt to pass her, race officials did not penalize Budd. The heartbreaking photo of the stricken Decker, in tears, helplessly watching the other runners pass by became one of the defining images of that Olympics.

Decker was far from finished, however. Indeed, 1985 would be her most accomplished year, an undefeated season in which she won a dozen 1-mile and 3,000-meter races, along the way setting two world records, including a 4:16.7 outdoor mile. Married in 1985 to discus thrower Richard Slaney, Decker did not compete in 1986 while she had a baby. She continued to train afterward, but injuries hampered her at the 1988 Olympic Games and frustrated her attempt to qualify in 1992.

Impact Despite her moment of widest celebrity coming from her fall in the Los Angeles Olympics and her subsequent very public demand for Budd's disqualification, which many saw as whiney, unprofessional, and further evidence of the decade's generation of athletes being spoiled—and despite her failure to win any Olympic medals in her long career—Mary Decker is recognized by track-and-field aficionados for her unparalleled dominance of the sport. Her competitive record, including several world records that survived into the twenty-first century, marked her as one of the all-time great runners. Her career was also remarkable for its longevity: Decker began competing at the age of eleven and—despite career-threatening injuries and more than twenty surgical procedures, family obligations, and the distractions of the media—she continued to
compete for a good two decades in a field in which success is often measured by one's performance in only one or two competitive seasons.

Further Reading
Heywood, Leslie, and Shari Dworkin. Built to Win: The Female Athlete as Cultural Icon. Minneapolis: University of Minnesota Press, 2004.

Smith, Lissa. Nike Is a Goddess: The History of Women in Sports. Boston: Atlantic Monthly Press, 1995.

Tricard, Louise Mead. American Women's Track and Field, 1981-2000: A History. Jefferson, N.C.: McFarland, 2007.

Joseph Dewey
■ Deconstructivist architecture Definition
An architectural style that juxtaposes different structural and design elements in seemingly random ways
Deconstructivist architecture symbolized both the complexity of contemporary life and a postmodern aesthetic that was the result of unprecedented American wealth and power in the 1980’s. The deconstructivist movement in architecture, also known as deconstructivism or deconstruction, is an evolution in postmodern architecture that began in the 1980’s. “Deconstructivism” is a term that seems counterintuitive when applied to building, and examples of this style often seem equally counterintuitive in terms of their organization and construction. Deconstructivist architecture often looks as though it has been exploded, cobbled together with random bits and pieces from a number of different buildings. It can seem intentionally designed to be abstract rather than functional. In the 1980’s, deconstructivist architects drew on theories from other disciplines, such as philosophy, literature, and cultural studies, to develop an approach to buildings that reflected the fragmented, pluralistic, and global nature of everyday life. Deconstructivist architecture attempted to illustrate poststructuralist ideas about diffusion, discontinuity, fragmentation, and context. Many of the foundational ideas of deconstructivist architecture reflect the work
The Eighties in America
of literary scholars of the 1980’s and the French philosopher Jacques Derrida, who founded a literary and philosophical movement known as deconstruction that saw that complexity was an organizing principle of human experience. Deconstructivism is most concerned, then, with questions of meaning and how people make order in their world. New Shapes Deconstructivist buildings have several distinctive physical features, including generally non-rectilinear foundations; unusually curved or distorted facades; and unpredictable, almost chaotic structures. These elements reflect deconstructivist architects’ interest in experimenting with ideas about the nature of the “skins,” or facades, of buildings, using or referencing non-Euclidean geometry in architecture, and creating a building or place that sends messages about dislocation at the same time that it acts as a clear locus of action or experience. Early interest in this type of architecture was first expressed in Europe. For example, the 1982 Parc de la Villette competition included several examples of projects that might be understood as deconstructivist. These included a collaborative submission from Derrida and the American architect Peter Eisenman, as well as the winning entry by Bernard Tschumi, an architect with offices in both France and New York who was interested in academic and theoretical approaches to architecture. By 1988, deconstructivist architecture was influential enough to merit the attention of the Museum of Modern Art in New York. Organized by the architect Philip Johnson and his associate Mark Wigley, the museum’s Deconstructivist Architecture exhibition featured the work of the seven most influential deconstructive architects of the time: Frank Gehry, David Liebskind, Rem Koolhaas, Zaha Hadid, Coop Himmelblau, Eisenman, and Tschumi. What these architects had in common was their sense that the traditional view of architecture as an art focused on order, stability, and history was no longer a valid way to view building design. Instead, in the 1980’s, architecture began to explore pure abstraction and the power of critical theories developed in other disciplines. As a result, the structure of a building came to be seen as a potential tool for questioning and reforming social relationships, on both a community and an individual level. While many of the architects whose works were highlighted by the Museum of Modern Art’s exhibit
subsequently distanced themselves from the term, “deconstructivism” became a common description of the particular look and approach adopted by many of them. Gehry, for example, is among the architects who disassociated himself from the deconstructivist movement, but his home in Santa Monica (1978) is, for many architectural historians, the prototypical deconstructivist house. Beginning with an ordinary three-bedroom cottage in an ordinary neighborhood, Gehry changed its masses, spatial envelopes, and facade, subverting the normal expectations of domestic design. Typical of deconstructivist buildings, Gehry’s house emphasized irregular and quirky shapes and volumes and used unexpected materials, like metal, tile, and stucco, in unusual ways and jarring combinations. In contrast, Eisenman, who embraced the label of deconstructivist, produced house
Deconstructivist architecture
■
277
designs in the 1980’s that focused on an effect of dislocation, achieved through formal purity and lack of historical or vernacular reference. Gehry projects like Loyola Law School in Los Angeles (1981-1984), Edgemar Center in Santa Monica (1988), and Disney Concert Hall in downtown Los Angeles (begun in 1989) reveal deconstructivism’s lack of interest in the unity, orthodoxy, and simple functionality that shaped most modernist architectural designs. Buildings like Eisenman’s Wexner Center for the Arts in Columbus (1989) illustrate the complexity and fragmentation common to deconstructivist architecture. A three-dimensional grid runs through the castle-like structure. Some of the grid’s columns fail to reach the ground and loom over the stairways, creating a feeling of dread and concern about the structural integrity of the columns.
The Walt Disney Concert Hall in downtown Los Angeles, California, is an icon of deconstructivist architecture. (Jon Sullivan/ pdphoto.org)
New Tools
Deconstructivist architects made unparalleled use of computer-aided design (CAD), which by the 1980’s was a common tool in all architectural firms. For deconstructivist architects, computers were an important design aid, permitting three-dimensional modeling and animation that supported their desire to create very complicated spaces. Similarly, the ability to link computer models to production activities allowed the manufacturing of mass-produced elements at reasonable cost. While the computer made the designing of complex shapes easier, though, not everything that looks odd is “deconstructivist.” It was the use of new technologies in ways that were connected to postmodern and post-structural theoretical frameworks that enabled deconstructivist architects to revolutionize people’s experiences of their built environments in the 1980’s.
Impact
As an expression of postmodern attitudes, deconstructivist architecture was as much about theory and cultural change as about building design. Although in some ways deconstructivism as a movement was at the fringes of 1980's architecture, it significantly changed the shape of the American built environment.

Further Reading
Frampton, Kenneth. Modern Architecture: A Critical History. 3d ed. London: Thames & Hudson, 1992. Basic, classic reference, updated to include postmodern developments in architecture.
Jencks, Charles. The New Moderns: From Late to Neo-Modernism. New York: Rizzoli International, 1990. Collects a wide-ranging set of essays and interviews whose breadth makes this a first-rate resource.
Johnson, Philip, and Mark Wigley. Deconstructivist Architecture. New York: Little, Brown, 1988. Exhibition catalog that includes a helpful introductory essay by Wigley.
Macrae-Gibson, Gavin. The Secret Life of Buildings: An American Mythology for Modern Architecture. Cambridge, Mass.: MIT Press, 1985. Theory-based close look at seven important buildings completed in the 1980's.
Wigley, Mark. The Architecture of Deconstruction: Derrida's Haunt. Cambridge, Mass.: MIT Press, 1995. Examination of the theoretical roots of deconstructivist architecture.
J. R. Donath
See also
Architecture; Art movements; CAD/CAM technology; Cyberpunk literature; Gehry, Frank; Neoexpressionism in painting; Virtual reality.
■ De Lorean, John
Identification: American automobile executive and entrepreneur
Born: January 6, 1925; Detroit, Michigan
Died: March 19, 2005; Summit, New Jersey

An American automobile executive turned flamboyant business entrepreneur, John De Lorean started the De Lorean Motor Company, which produced the DMC-12, a futuristic sports car. A controversial, larger-than-life personality, De Lorean was the central figure in a highly publicized drug-trafficking trial at which he was found not guilty due to entrapment.

John De Lorean, a onetime General Motors automobile executive, founded the De Lorean Motor Company (DMC) in 1975. He established his factory near Belfast, Northern Ireland, and received partial financial support from the British government, which invested more than $150 million in an attempt to reduce the unemployment rate in Northern Ireland, then more than 20 percent. By 1981, the De Lorean manufacturing plant employed nearly twenty-five hundred people.

Beginning in 1981, DMC manufactured the DMC-12, later called simply the De Lorean, an avant-garde, stainless-steel sports car with doors that, when opened, looked like "seagull wings." Almost from the start, the company experienced budgetary and engineering problems. By October, 1981, the company had manufactured only about two thousand cars, and it had trouble selling even that many because the car was expensive. By early 1982, poor sales had created serious financial problems for the company.

De Lorean desperately sought an infusion of cash to keep his company afloat. In mid-June, 1982, after several meetings with men he believed to be members of an organized crime ring, De Lorean agreed to participate in a deal involving illegal narcotics. The men turned out to be undercover agents working for the Federal Bureau of Investigation (FBI). On October 19, 1982, De Lorean was arrested in a Los Angeles, California, hotel room
and charged with eight counts, ranging from conspiracy to smuggle fifty-five pounds of cocaine (valued at $24 million) to the illegal use of a communications facility, in a scheme to help finance his failing company. He spent ten days in jail. In late 1982, the De Lorean Motor Company went bankrupt after producing approximately ten thousand cars; the factory was closed on orders from the British government. Thereafter, De Lorean spent most of his time preparing for his highly publicized trial. In mid-August, 1984, the jury in Federal District Judge Robert M. Takasugi's courtroom found De Lorean not guilty on all eight counts, accepting his defense that he had been entrapped by the FBI. In other words, the jurors determined that De Lorean had committed only crimes that agents of the government had persuaded him to commit, crimes that he would not otherwise have committed.

Despite his acquittal, De Lorean was unable to salvage his tarnished image and regain his prominence in business. However, his car, the DMC-12, remained an iconic object of the 1980's, especially after it was featured as a time machine in the highly popular movie Back to the Future (1985). As for De Lorean, with his legal troubles behind him, he disappeared from public life during the rest of the 1980's.

John De Lorean speaks at a February 19, 1982, press conference under a picture of his company's automobile, the DMC-12. (AP/Wide World Photos)

Impact
De Lorean was one of very few individuals in modern times who have demonstrated the entrepreneurial spirit and possessed the wherewithal to start their own automobile company. An automobile-industry trailblazer, De Lorean risked his reputation and personal life to save his dream, but in the end he was unable to save either.

Further Reading
De Lorean, John Z., with Ted Schwarz. DeLorean. Grand Rapids, Mich.: Zondervan, 1985.
Demott, John S. "Finished: De Lorean Incorporated." Time, November 1, 1982.
Joseph C. Santora

See also
Back to the Future; Business and the economy in the United States; Crime; Fads.

■ Demographics of Canada
Definition: The size, composition, and distribution of the population of Canada
During the 1980's, Canada experienced an influx of immigrants who were neither British nor French. Because the nation had traditionally been split between citizens of British and French descent, the entry of significant populations from Asia, the Middle East, and the Mediterranean greatly increased national diversity.

In the 1980's, immigration was the central force in Canada, driving the growth of both the overall population and the labor force. Formal immigration policy through most of Canada's history had favored migrants from Europe and North America, who it was believed assimilated easily and served the country's labor needs most appropriately. Displaced persons and refugees from war-torn Europe were invited to enter the country in limited numbers after World War II; later, during the Cold War, refugees immigrated from Hungary (1956), Czechoslovakia (1968), and other areas of Soviet aggression.

Immigration
The Canadian Immigration Act of 1952 limited admission to Canada using discriminatory criteria, including nationality, lifestyle, and suitability to the workforce. Many people circumvented the
Immigrant Population by Birth for Canada

Canada's immigrant population underwent a major change during the decade, beginning in 1981. For the first time in the nation's history, immigrants arriving from Asia, particularly those from Hong Kong, India, the People's Republic of China, Vietnam, and the Philippines, outnumbered immigrants from the United Kingdom and other European nations. The columns below give the immigrant population by period of arrival, with each figure's share of the period total.

Place of Birth | 1971-1980 | % of Total | 1981-1990 | % of Total
Total | 996,160 | 100.0 | 1,092,400 | 100.0
United States | 74,015 | 7.4 | 46,405 | 4.2
Central and South America | 67,470 | 6.8 | 106,230 | 9.7
Caribbean and Bermuda | 96,025 | 9.6 | 72,405 | 6.6
Europe | 356,700 | 35.8 | 280,695 | 25.7
  United Kingdom | 131,620 | 13.2 | 57,785 | 5.3
  Other Northern and Western Europe | 59,850 | 6.0 | 48,095 | 4.4
  Eastern Europe | 32,280 | 3.2 | 111,370 | 10.2
  Southern Europe | 132,950 | 13.3 | 63,445 | 5.8
Africa | 58,150 | 5.8 | 64,265 | 5.9
Asia | 328,375 | 33.0 | 512,160 | 46.9
  West Central Asia and the Middle East | 30,980 | 3.1 | 77,685 | 7.1
  Eastern Asia | 104,940 | 10.5 | 172,715 | 15.8
  Southeast Asia | 111,700 | 11.2 | 162,490 | 14.9
  Southern Asia | 80,755 | 8.1 | 99,270 | 9.1
Oceania and other | 15,420 | 1.5 | 10,240 | 0.9

Source: Statistics Canada.
1952 law, however, by coming under the sponsored labor program, whereby an immigrant agreed by contract to work in mining, farming, railway transportation, or domestic service for a minimum of two years. The "absorptive capacity" model, admitting only those applicants who could fill labor niches, dominated Canadian immigration policy until the late 1980's. In reaction to World Refugee Year (1959-1960), international pressures, and labor needs, Canada liberalized its immigration policy through the 1960's, bringing down ethnic barriers by expanding admissible classes of immigrants to include Asians, Africans, and other non-European groups. Economic woes in the 1970's, including inflation and unemployment, slowed the flow of immigrants
into the country; by 1984, immigration had reached its lowest point since the early 1960's: Only eighty-three thousand people immigrated to Canada in that year.

The liberalization of Canada's immigration policy in the 1960's profoundly changed the country's demographic profile. Immigrants from non-European nations poured into Canadian cities, especially toward the end of the 1980's. By the time of the 1991 census, more than 30 percent of Canada's population reported an origin other than British or French. In the 1980's, approximately 40 percent of the foreign-born Canadian population came from Asia and the Middle East, 30 percent came from Europe, and about 20 percent came from the Caribbean and Central and South America.
The foreign-born population was concentrated most heavily in Ontario and western Canada; about 90 percent settled in Canada's fifteen largest cities, a far higher percentage than that of Canadians in general. By 1991, Toronto's population was 38 percent foreign-born. Vancouver's population was 30 percent foreign-born. Some 24 percent of Hamilton's population was foreign-born, as was 21 percent of Windsor's population and 20 percent of the populations of both Calgary and Victoria. The four Atlantic Provinces and the northern territories were nearly untouched by this immigration, and the Prairie Provinces remained more than 90 percent Canadian-born. Canada's population increased from about 24 million in 1980 to 26.8 million in 1991, and more than 1.2 million of that increase was attributed to immigration. In keeping with the country's founding "mosaic" philosophy, diversity remained a defining characteristic of Canadian society.

The changing composition of Canadian society through the 1980's affected Canada in myriad ways. Non-European immigration added to diversity and caused British and French dominance to decline proportionately; by 1981, only 25 percent of Canadians named French as their mother tongue. Immigration fueled the growth of Toronto to such a degree that it surpassed Montreal in population for the first time in the census of 1981. By 1986, the population of Toronto was 3.4 million, while Montreal's population was 2.9 million. The strong emphasis on French language and culture in Quebec discouraged many immigrants from settling in Montreal, Quebec City, and other Québécois cities and towns. New immigrants applied pressure to liberalize immigration laws and codify multiculturalism. Mainstream Canadians learned about new lifeways and traditions from Muslims, Hindus, and other ethnoreligious minorities as those groups grew in numbers and visibility. New trade ties developed as immigrant businesses maintained connections to their owners' home countries; for example, a burgeoning Chinese market developed as the number of Chinese Canadians increased. Pressures mounted, too, for the government to take sides on contentious political issues such as the Arab-Israeli question and human rights abuses in immigrants' homelands.

Population Growth
The infusion of immigrants, most of whom arrived during their young, productive years, meant growth in the workforce, vitality in
the economy, and a younger population than existed in most developed countries. In 1987, Canada's birth rate was 15 births per 1,000 persons, and the death rate was 7 deaths per 1,000 persons, resulting in an annual natural increase of 0.8 percent. The nation's fertility rate was 1.7 children born per woman. The population's doubling time was about ninety years, the infant mortality rate was 8 deaths in 1,000 live births, and the average Canadian life expectancy was seventy-six years. About 22 percent of the nation's population was under the age of fifteen. These numbers were very similar to those for the same year in the United States.

Ethnic minorities, the majority of them foreign-born, registered somewhat higher fertility rates than did native-born Canadians. The country's age structure was pyramidal in form, with only slight differences in the male-to-female ratio, except among the elderly, where women outnumbered men as a result of their longer life expectancies. Ethnic minorities in general exhibited younger population age structures.

Canada's infant mortality rate was held low by national and provincial health care plans, healthy lifestyles, and a high ratio of medical personnel to the general population. Mortality rates for indigenous Inuit and Indian infants and adults exceeded those for other Canadians, although these rates declined steadily throughout the decade as a result of improvements in health care facilities. Both Inuit and Indian populations tended to be young in age structure as a result of high fertility and falling infant mortality. Specific death rates for certain diseases registered higher for indigenous peoples than for the general population.

Canada's birth rate remained high throughout the decade as a result of both immigration and the "echo" children of the baby-boom generation. The "echo" children, born between 1980 and 1995, grew up primarily in Canada's urban areas, especially Ontario and parts of western Canada. Thus, fertility was higher in metropolitan areas, particularly in the suburbs, where baby boomers were disproportionately represented. The "echo" was much smaller in French Quebec, the east, and the settlements of the north. With the "echo" generation came an increased need for child-care facilities and for elementary and secondary teachers and schools, especially in metropolitan areas. Population growth caused cities to expand, thus increasing government outlays for infrastructure.
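The doubling-time figure follows directly from the vital rates just cited; the short sketch below is an illustrative calculation using the standard constant-growth formula, not material from the original article.

```python
import math

# Canada's reported 1987 vital rates (from the article).
births_per_1000 = 15
deaths_per_1000 = 7

# Natural increase: births minus deaths, as a fraction of the population.
natural_increase = (births_per_1000 - deaths_per_1000) / 1000  # 0.008, i.e., 0.8 percent

# Doubling time at a constant annual growth rate r is ln(2) / ln(1 + r).
doubling_time_years = math.log(2) / math.log(1 + natural_increase)
print(round(doubling_time_years))  # 87 -- about ninety years, as the article states
```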
Population Distribution and Density
Canada's population generally lies in a string along the U.S.-Canadian border, with more than half of it concentrated in the Great Lakes-St. Lawrence region, the country's core population center. In 1980, that region housed eleven of the country's twenty-five largest metropolitan areas; Canada's highest urban and rural densities overall were recorded in that core, particularly at the western end of Lake Ontario. The concentration of new immigrants in the country's cities and economic growth areas helped reinforce the pattern of heavy density in the core; however, it also helped fuel the westward drift of the population, especially into centers like Vancouver, Calgary, and Edmonton. Ethnic enclaves in Toronto and Vancouver became cities within cities, with significant Chinese, Greek, and Korean populations, among others.

This dynamic social geography was accompanied by changes in land-use patterns, especially areal expansion of the suburbs. While immigrants added to high population densities in urban cores like Toronto, suburban growth led to the evolution of polynucleated urban centers, in which metropolitan areas developed outlying secondary urban centers. National cities, such as Toronto and Vancouver, grew mostly as a result of immigration to Canada from abroad. Their growth was also significantly fueled, however, by internal migration of Canadians into the cities from outlying areas, also known as urbanization. Regional cities grew less dramatically.

The Canadian north remained sparsely populated in comparison to the country's core: In 1981, the Yukon had only twenty-three thousand residents, and the Northwest Territories had only forty-six thousand residents. The country's Inuits lived mostly north of the northern treeline, concentrated in the Northwest Territories and along the coasts of Arctic Quebec, while the Indian, or First Nations, peoples resided south of the treeline, many on reserves. Small increases in population across Canada's north in the 1980's were associated mostly with resource development and patterns of high indigenous fertility. The country's overall population density was only seven persons per square mile in 1991.
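The density figure is consistent with the population total given earlier; the quick check below is illustrative only, and the land-area value it uses (roughly 3.85 million square miles for Canada) is an assumption, not a figure from the article.

```python
# Rough consistency check of the reported 1991 population density.
population_1991 = 26_800_000       # from the article
area_square_miles = 3_850_000      # approximate area of Canada (assumed value)
print(round(population_1991 / area_square_miles))  # 7 persons per square mile
```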
Impact
Canada became a more diverse nation in the 1980's, at a time when it was already negotiating issues of cultural diversity. Early in the decade, as the Canadian constitution was being modified and patriated, the government sought to adopt a multicultural policy toward British, French, and First Nations Canadians. Asian and other non-European immigration rendered this multicultural policy all the more necessary, and all the more difficult, as the decade progressed.

Further Reading
Day, Richard J. F. Multiculturalism and the History of Canadian Diversity. Toronto: University of Toronto Press, 2000. Overview of multiculturalism as a philosophy and as a national policy; discusses the practical implications of multiculturalism for national cohesion.
Hawkins, Freda. Canada and Immigration. 2d ed. Montreal: McGill-Queen's University Press, 1988. History of immigration to Canada, with a summary of the surge of immigrants in the 1980's.
Kalbach, W. E., and W. McVey. The Demographic Basis of Canadian Society. 2d ed. Toronto: McGraw-Hill Ryerson, 1979. Summary of the major components of population growth and change, including births, deaths, and migration.
Li, Peter S. The Making of Post-War Canada. Toronto: Oxford University Press, 1996. Summarizes the philosophies and policies that shaped the demographics of Canada, including immigration and multicultural policies.
Ann M. Legreid

See also
Canada Act of 1982; Demographics of the United States; Immigration Reform and Control Act of 1986; Immigration to Canada; Minorities in Canada.
■ Demographics of the United States
Definition: The size, composition, and distribution of the population of the United States
During the 1980's, the population of the United States increased in cultural diversity. Marriage rates declined, meaning that a lower proportion of adults were members of nuclear families than in the past. Progress was made in the battle against several diseases, while others proved less tractable. Meanwhile, the decade's growth in prosperity accrued mainly to those fortunate enough to join the ranks of the upper class or to gain their income through investment, and middle-class wage earners failed to benefit.
In 1980, the U.S. Bureau of the Census reported that the total population of the United States was 226.5 million people. Ten years later, it reported a total of 248.7 million people. This growth represented a 9.8 percent rate of increase for the decade, or an average annual increase of 0.94 percent. The 9.8 percent rate of increase during the 1980's was the second-lowest rate for a decade in U.S. history, the lowest being 7.3 percent during the depression decade of the 1930's. In contrast, the population had increased by 11.4 percent during the 1970's, by 13.4 percent in the 1960's, and by 18.5 percent during the baby boom of the 1950's. During the three previous centuries of American history, the population had almost always grown by more than 30 percent
per decade, with several decades witnessing a growth rate of more than 40 percent.

The crude birth rate of the 1980's was approximately 16, meaning that for every one thousand people living in the United States, there were sixteen births per year. This birth rate was low in comparison with that of the baby-boom period (around 1946-1965). The rate, nevertheless, was slightly higher than that of the 1970's, during which the birth rate was about 15. Like earlier periods, the birth rate of the 1980's varied considerably among ethnic groups. The white birth rate was about 15, compared with an African American birth rate of about 20. The death rate was significantly lower than the birth rate. In 1989, there were 8.6 deaths per 1,000 people, a rate that changed little between 1960 and 2000.
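The decade figure and the annual figure cited above are related by simple compound growth; the short sketch below is an illustrative calculation, not material from the original article, verifying both from the census totals.

```python
# U.S. Census Bureau totals cited in the article.
pop_1980 = 226_542_199
pop_1990 = 248_709_873

decade_increase = pop_1990 / pop_1980 - 1            # the 9.8 percent decade figure
annual_rate = (pop_1990 / pop_1980) ** (1 / 10) - 1  # compounds to the decade figure
print(f"{decade_increase:.1%}")  # 9.8%
print(f"{annual_rate:.2%}")      # 0.94%
```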
United States Population by Census Division, 1980-1990

In the decade from 1980 to 1990, the United States' population increased by 9.8 percent, climbing from 226,542,199 to 248,709,873, according to the U.S. Census Bureau. The greatest gains were in the South Atlantic, Mountain, and Pacific Divisions, particularly in the states of Florida, Nevada, Arizona, Alaska, and California.

Census Division | Population 1980 | Population 1990 | % Increase
Total United States | 226,542,199 | 248,709,873 | 9.8
New England: Maine, New Hampshire, Vermont, Massachusetts, Rhode Island, Connecticut | 12,348,920 | 13,206,943 | 6.9
Middle Atlantic: New York, New Jersey, Pennsylvania | 36,787,896 | 37,602,286 | 2.2
East North Central: Ohio, Indiana, Illinois, Michigan, Wisconsin | 41,682,908 | 42,008,942 | 0.8
West North Central: Minnesota, Iowa, Missouri, North Dakota, South Dakota, Nebraska, Kansas | 17,184,090 | 17,659,690 | 2.8
South Atlantic: Delaware, Maryland, District of Columbia, Virginia, West Virginia, North Carolina, South Carolina, Georgia, Florida | 36,957,453 | 43,566,853 | 17.9
East South Central: Kentucky, Tennessee, Alabama, Mississippi | 14,666,142 | 15,176,284 | 3.5
West South Central: Arkansas, Louisiana, Oklahoma, Texas | 23,743,473 | 26,702,793 | 12.5
Mountain: Montana, Idaho, Wyoming, Colorado, New Mexico, Arizona, Utah, Nevada | 11,371,502 | 13,658,776 | 20.1
Pacific: Washington, Oregon, California, Alaska, Hawaii | 31,799,815 | 39,127,306 | 23.0
The estimated life expectancy in 1989 was seventy-nine years for women and seventy-two for men. In 1980, the comparable figures had been seventy-seven for women and seventy for men.

Immigration
During the 1980's, immigration was responsible for more than one-third of U.S. population growth, and it changed the ethnic makeup of the country. From the end of World War II until 1965, the number of immigrants had averaged fewer than three million per decade, with about 70 percent coming from Europe, Canada, and Oceania. The Immigration Act of 1965, however, ended the quota system that had given preferences to those ethnic groups that had entered early in the nation's history. The new system was based primarily on three considerations: family reunification, labor skills, and providing asylum for political refugees. By the 1970's, an increase in nontraditional immigrants was becoming quite noticeable. Additional legislation in 1980 and 1986 attempted to clarify the new rules, as well as to reduce a growing stream of illegal immigrants entering the country.

From 1980 to 1989, 7.3 million legal immigrants entered the United States, the highest number since the first decade of the twentieth century. In comparison with the size of the country's total population, however, the volume of immigrants in the 1980's was smaller than in earlier periods. In the 1980's, the average annual rate of immigration was 3.2 immigrants per 1,000 people living in the country, less than one-third the average annual rate during the first decade of the 1900's. During the earlier period, however, there had been relatively few illegal immigrants, whereas an estimated two million illegal immigrants entered the country during the 1980's, with more than one-third of them crossing the Rio Grande from Mexico.

The change in the origins of the immigrants was even more consequential than the increase in the total numbers. In contrast to the pre-1960 situation, only about 13 percent of U.S. immigrants came from Europe, Canada, or Oceania. More than half came from Latin America, with half of this number coming from Mexico. More than one-third came from Asia, with the largest groups immigrating from the Philippines, Vietnam, Korea, China, India, and Laos. For many Americans, the increasingly multicultural nature of their society first became noticeable in the 1980's.
Marriage and Family
At the end of the 1980's, the U.S. Bureau of the Census reported that 57.9 percent of persons fifteen years of age and older were married. That number represented the continuation of a long-term decline in the percentage of married persons in the United States. In 1980, the rate had stood at 60.8 percent, and it had been 69.6 percent in 1960. In all of these years, there was considerable variation between populations of different races and ethnicities. In 1990, the percentage of whites who were married stood at 60.4 percent, compared with 47.9 percent of Hispanics and 38.8 percent of African Americans.

The Bureau of the Census also reported a trend toward people getting married later in life. In 1960, the median age of a person marrying for the first time was 22.8 years for men and 20.3 for women. By 1980, the median age had increased to 24.7 for men and 22.0 for women. By 1990, the median age for men had increased to 26.1; for women, it was 24.4. There was also an increase in the number of single adults in the United States. In 1960, only 8.2 percent of white women and 12.7 percent of African American women did not marry before the age of twenty-nine. In 1980, the proportion of white women of this age who had never married had increased to 13.6 percent, whereas the proportion for African American women had grown to 29 percent. By 1990, the comparable percentages were 19.2 percent for white women and 52.0 percent for African American women.

Despite a significant overall trend toward increasing divorce rates during the second half of the twentieth century, there was a modest decrease in the divorce rate during the 1980's. From 1960 until 1980, the divorce rate increased by a factor of nearly three. In 1980, it stood at 22.6 per 1,000 married women, whereas it was down to 20.9 per 1,000 married women in 1990. There was substantial regional variation in divorce rates. In both 1980 and 1990, the highest divorce rate was in the Rocky Mountain states. In 1980, the lowest rate was in the Middle Atlantic states; in 1990, this region was tied with New England for the lowest rate. Among individual states, the lowest rate in 1990 was in Massachusetts, while Alaska had the highest rate.

Closely associated with the divorce rate was the number and percentage of children directly affected by divorce. While the number and percentage of affected children rose dramatically from 1960 until
1980, both declined modestly during the 1980's. In 1989, the number of affected children was over one million (a rate of 16.8 per 1,000 minors). A decade earlier, in 1980, there had been 1.17 million affected children (18.1 per 1,000 minors). In 1960, in contrast, there had been only about 0.46 million affected children (7.5 per 1,000). By 1989, numerous social scientists were writing about the social pathologies associated with large numbers of single-parent families, of which disproportionate numbers lived in poverty.

Health and Disease
During the 1980's, there was a significant decrease in the number of deaths due to several specific diseases, continuing a trend that had begun in 1950. The National Center for Health Statistics reported that 152 persons per 100,000 died of heart disease in 1990, compared to 202 per 100,000 in 1980 and 307.2 per 100,000 in 1950. The death rate from malignant cancers, however, increased slightly during the decade (to 135.0 per 100,000, compared to 132.8). One of the important health indicators was the infant mortality rate, which was 9.8 infant deaths per 1,000 live births in 1989. The rate was down from 12.6 in 1980 and 29.2 in 1950. The infant mortality rate for whites was 8.1 in 1989, whereas it was 18.0 for African Americans.

The population's increase in longevity during the decade was due in part to a decline in the consumption of cigarettes. For 1990, the Centers for Disease Control and Prevention (CDC) reported that 25.7 percent of Americans were smokers, compared to 32.1 percent in 1983 and 42.4 percent in 1965. Among nonsmokers in 1990, 51.6 percent were former smokers. The CDC attributed about 20 percent of 1990 deaths to smoking, meaning that smoking-related diseases were a significant contributor to potential years of life lost. In that year alone, according to the CDC model, the U.S. population lost more than five million years of life because of smoking, including 1.152 million years for those under the age of sixty-five.

Although human immunodeficiency virus (HIV) infection was not officially classified as a cause of death before 1987, it grew rapidly in the United States throughout the decade. The death rate from the resulting acquired immunodeficiency syndrome (AIDS) grew from 5.5 per 100,000 people in 1987 to 13.8 per 100,000 people just six years later. By the
end of the decade, more than 30 per 100,000 people were infected by HIV, which had become the second-leading cause of death among Americans aged twenty-five to forty-four. About 80 percent of those diagnosed with the virus were males older than thirteen. The infection rate for African Americans was about four times higher than that for non-Hispanic whites. No effective treatment for the disease had yet been developed, so it almost always resulted in death within a few years.

Employment and Income
Unemployment was a major economic problem throughout the 1980's, and the unemployment rate was particularly high during the recession of the decade's early years. Having remained at about 7 percent throughout the 1970's, the rate shot up to more than 9.5 percent in both 1982 and 1983, reaching a peak of 10.7 percent (11.9 million workers) in November of 1982. With the end of the recession, the proportion of people actively seeking work decreased to 6.2 percent of the potential workforce in 1987 and further decreased to 5.3 percent in 1989. Although this rate was low when compared with that for the twentieth century as a whole, the country did not achieve full employment (estimated at 4.5 percent unemployment) until the late 1990's.

Although there was considerable fluctuation in the unemployment rate during the decade, the ratio of white to African American unemployment changed very little. The African American unemployment rate remained somewhat more than twice as high as the white unemployment rate. For African Americans, the unemployment rate in 1980 was 12.3 percent, compared with 11.4 percent in 1989. For white Americans, the comparable rates were 5.1 percent in 1980 and 4.5 percent in 1989.

Between 1980 and 1989, the median income of all U.S. families increased from $35,839 to $37,950, which represented no appreciable gain in purchasing power. White families in 1989 had median incomes of $49,108, compared with $22,866 for black families and $26,528 for Hispanic families. That year, moreover, 10.9 percent of white families had incomes below the poverty line, whereas the poverty rate for African American families was 31.9 percent, and it was 28.1 percent for Hispanics. Neither African Americans nor Hispanics had made any progress toward converging with white median family incomes in two decades. The racial disparity was due
in part to the disproportionate number of single-parent families among these two minorities.

Impact
Demographic trends that had been evolving since the late 1960's came to fruition in the 1980's. The growth rate of the population declined moderately, while the two-year increase in life expectancy produced much concern about the long-term health costs of an aging population. At the same time, increasing immigration was changing the ethnic makeup of the United States, rendering it more diverse than at any time in the nation's history. With the increase in births out of wedlock and the growing divorce rate, many social scientists warned about the problems associated with single-parent families. Investors in the stock market did much better than did middle-class workers, and African American families made almost no progress toward reducing the gap between their average incomes and those of white families. With few exceptions, these trends of the decade would continue into the early years of the twenty-first century.

Further Reading
Anderton, Douglas, Richard Barrett, and Donald Bogue. The Population of the United States. 3d ed. New York: Free Press, 1997. Large book containing a wealth of historical statistics with clear tables and cogent discussions about the reasons for changing statistics.
Bouvier, Leon, and Lindsey Grant. How Many Americans? Population, Immigration, and the Environment. San Francisco: Sierra Club Books, 1994. Makes informed predictions about the future based on historical precedents.
Ehrlich, Paul, and Anne Ehrlich. The Population Explosion. New York: Simon & Schuster, 1990. Influential work arguing that population must be reduced in order to prevent a future catastrophe.
Hinde, Andrew. Demographic Methods. New York: Oxford University Press, 1998. Textbook that explains the concepts, methods, and goals of the field.
Klein, Herbert. A Population History of the United States. New York: Cambridge University Press, 2004. Scholarly survey of demographic history, from the first Native Americans through the changing patterns of the post-1980 era. Highly recommended.
Longman, Phillip. The Empty Cradle: How Falling Birthrates Threaten World Prosperity and What to Do
About It. New York: Basic Books, 2004. Rejects conventional wisdom to emphasize the negative aspects of a declining population.
Wright, Russell. Twentieth Century History of the United States Population. New York: Rowman & Littlefield, 1996. Summary of census data from 1900 until 1990, providing insight into the social, economic, and political factors that have shaped the nation.
Thomas Tandy Lewis

See also
Abortion; AIDS epidemic; Health care in the United States; Immigration to the United States; Income and wages in the United States; Social Security reform; Unemployment in the United States; Women in the workforce.
■ Designing Women
Identification: Television comedy series
Date: Aired from September 29, 1986, to May 24, 1993

Set in Atlanta during the late 1980's and early 1990's and revolving around four female characters, Designing Women presented a new image of Southern women that appealed to a diverse audience.

Celebrating rather than mocking the South and Southerners, Designing Women featured strong characters and snappy dialogue delivered with authentic accents. It elevated Southern humor to an entirely new level. The show introduced audiences to the Sugarbaker design firm of Atlanta, which was owned and operated by the characters. The women of the show (the spirited, feisty owner of the firm, Julia Sugarbaker, played by Dixie Carter; Julia's younger sister, Suzanne Sugarbaker, played by Delta Burke, a former beauty queen; their initially shy but later aggressive associate, Mary Jo Shively, played by Annie Potts; and the sweet, naïve office manager, Charlene Frazier, played by Jean Smart) were beautiful, smart, and funny. When the series began, all four women were single, either by circumstance or by choice, and in each episode, instead of actually designing anything, the characters would deliver their highly opinionated commentary on everything from fast food to the First Amendment. Action was limited, but scripts were packed with clever jokes, memorable one-liners, witty verbal repartee, and sentimental
anecdotes of eccentric ancestors. The material highlighted the show's Southern charm and showcased the talents of the cast, particularly their impeccable comic timing. The women's banter would also often be punctuated by the voice of Anthony Bouvier (played by Meshach Taylor), the Sugarbakers' handyman and later partner. He would contribute, often very unwillingly, the male perspective on subjects decidedly feminine.

While consistently funny, Designing Women also ventured to explore some of the 1980's most controversial issues. With episodes devoted to denouncing sexism, racism, the exploitation of the poor, and domestic violence, the show was edgier and more sophisticated than many other sitcoms of the decade.

Impact
Even though its ratings were respectable, after its first season Designing Women was put on hiatus. Fans of the show, though, mounted a successful letter-writing campaign to persuade network executives to give the show another chance. After returning to the air, the show ran for six more years. A large part of the show's success can be attributed to the impassioned speeches delivered dramatically and convincingly in nearly every episode by Julia Sugarbaker, who was occasionally referred to by the other characters as "the Terminator." Thanks to the show's long second run in syndication, primarily on the Lifetime cable network, fans of the show learned to recite many of these speeches word for word.

Further Reading
Burke, Delta. Delta Style: Eve Wasn't a Size 6 and Neither Am I. New York: St. Martin's Press, 1998.
Carter, Dixie. Trying to Get to Heaven: Opinions of a Tennessee Talker. New York: Simon & Schuster, 1996.
McPherson, Tara. Reconstructing Dixie: Race, Gender, and Nostalgia in the Imagined South. Durham, N.C.: Duke University Press, 2003.
Owen, A. Susan, Sarah R. Stein, and Leah R. Vande Berg. Bad Girls: Cultural Politics and Media Representations of Transgressive Women. New York: Peter Lang, 2007.
Traci S. Thompson

See also
Feminism; Sitcoms; Television; Women in the workforce; Women's rights.

■ Devo
Definition: New Wave band

After playing clubs and remaining in obscurity in the 1970's, Devo briefly entered the musical mainstream in the 1980's.

Devo, around 1980. (Hulton Archive/Getty Images)

Devo first performed at Kent State University in the 1970's and became a part of the New Wave music explosion in Akron, Ohio, in the middle of that decade. Recognition from David Bowie and Iggy Pop earned the group a recording contract and television appearances. Their stage performances were fast-paced in the punk tradition, but they relied on a sophisticated manipulation of pop culture themes that was beyond the conceptual grasp of most punk bands. Devo wore identical costumes on stage that projected a futuristic image very different from the stylistic chaos of punk bands. Punk relied on shock and rage as its underlying message; Devo relied on heavy doses of irony. Mark Mothersbaugh, the group's lead singer, had received extensive visual-arts training and employed those skills to great advantage with the band.

Devo also benefited greatly from the exposure it received on twenty-four-hour music television. Bands in the 1970's had relied on weekly music shows or late-night cable programs for exposure. For Devo, these outlets limited their impact, since their visual references had more substance than did their minimalist
musical skills. Outlets like MTV, however, repeatedly exposed viewers to the rich visual references the band employed. The group's growing popularity earned its videos coveted spots in the cable network's roster of frequently repeated songs.

The band also experimented with several performance gimmicks. It created a character named Booji Boy, a childlike clown persona who sometimes appeared on stage to sing at the end of concerts. Later, Devo introduced "Dove, the band of love," a concept that had Devo open its concerts in the guise of a Christian rock band.

The popularity peak for Devo came with the video that accompanied its catchy 1980 tune "Whip It." The video was riddled with sexual innuendo and a surreal Old West setting that summed up the ability of the band to manipulate and recycle pop culture icons. After "Whip It," the popularity of Devo waned. Devo was consistent, almost relentless, in exploiting a narrow musical style, and pop music fans proved fickle in their search for novelty. What had been shocking in 1976 was mild by 1986, and New Wave music on MTV gave way to heavy metal and megastars like Prince, Madonna, and Michael Jackson. Devo went on hiatus in 1990, with Mothersbaugh working on children's television projects with Pee-wee Herman and the Rugrats show on the Nickelodeon network.

Impact
Devo flourished briefly in a pop culture marketplace where its intelligent wit ultimately proved to be a liability. Unlike durable pop stars who periodically reinvent themselves, the group chose to pursue a unique creative vision and eventually was left behind as a footnote in the musical history of the 1980's. At a time when punk and New Wave were small niche markets, Devo employed simple, catchy tunes and sophisticated marketing to reach a wide audience. The band's success paved the way for other edgy bands to claim a share of recognition in the volatile pop music scene of the 1980's.

Further Reading
Masar, Brendan. The History of Punk Rock. New York: Lucent Books, 2006.
Reynolds, Simon. Rip It Up and Start Again. New York: Penguin, 2005.
Michael Polley

See also
Music; New Wave music; Pop music; Talking Heads.
■ Diets
Definition: Regimens of sparing or targeted food consumption designed to reduce weight or increase health
Interest in diets and slimmer bodies grew during the 1980's, as did belief in the importance of healthy eating habits, which were not always compatible with popular diets. Despite this interest, the number of overweight Americans also increased.

The 1980's saw the intensification of an interest in thinness that had been growing since the 1960's. Americans spent an estimated $10 billion a year on diet drugs, diet books, special meals, weight-loss classes, and exercise videotapes during the 1980's. Runway models, actors, Playboy centerfolds, and beauty queens shrank, as the American ideal of beauty altered to favor increasingly smaller, thinner figures.

Weight-Loss Regimens of the 1980's
Americans, both thin and not, turned to diets to shed pounds. Many diets of the 1980's, even those promoted by physicians, were nutritionally unsound and too low in calories to be sustainable. Weight-loss diets routinely made best seller lists throughout the decade. Regimens that had been popular in earlier decades continued to find devotees: the low-carbohydrate, high-protein Atkins diet; the heart-healthy Pritikin program; the Scarsdale diet, which required the dieter to adhere to a complex two-week plan based on particular combinations of food groups; and the Stillman diet, which emphasized unlimited amounts of low-calorie, lean proteins, all remained popular during the 1980's. New diets of the decade included Martin Katahn's rotation diet, which alternated three plans of varying calories in order to keep the metabolism moving, and the Beverly Hills diet, which allowed dieters to eat only fruit for the first ten days.

Storefront weight-loss centers also began to spring up to compete with the long-established Weight Watchers. Both Jenny Craig and Nutri/System set up shop in the early 1980's. In addition to diets, calorie-conscious consumers who did not want to change their eating habits resorted to new, reformulated low-fat or low-calorie products that promised taste without guilt. A close reading of product labels revealed, however, that manufacturers frequently compensated for
decreased flavor with additional salt or, in the case of low-fat foods, additional sugar. In the 1980's, nearly 20 percent of the average American's food dollar was spent on diet foods. One particularly striking example of the interest in low-calorie foods was Stouffer's Lean Cuisine frozen diet meals. Within a year of their 1981 release, they resurrected the declining frozen-food business, turning it into an $800 million market. Their success was due in large part to another 1980's innovation: the microwave oven.

Health Concerns
During the 1980's, Americans became more aware that healthy eating habits played a significant role in disease prevention, thanks to the release of a number of studies on lifestyle and health. Serum cholesterol was singled out for its part in heart disease, and the consumption of cholesterol in foodstuffs was at the time thought to be directly related to the level of cholesterol in the bloodstream. As a result, many Americans decreased their consumption of whole milk and eggs as they became familiar with saturated and unsaturated fats. The consumption of foods thought to lower cholesterol levels, such as oats, dramatically increased, and many existing products, particularly breakfast cereals, were reformulated to include them. In 1988, sales of oatmeal increased by 20 percent, and sales of oat bran quintupled.

Even as diet and exercise tomes topped the best seller lists, obesity rates in the United States continued to rise. By 1985, about 34 million American adults, or roughly one out of every five, were obese. In 1988, the Surgeon General's Report on Nutrition and Health declared that poor diet played a role in two-thirds of the 2.1 million deaths of the previous year.

Eating Disorders
The trend toward thinner bodies contributed to a dramatic increase in eating disorders, primarily among women. Previously rare, anorexia nervosa (a psychological disorder in which sufferers increasingly limit their intake to the point of starvation and death) and bulimia (a binging and purging disorder) became much more prevalent during the 1980's. Singer Karen Carpenter's death in 1983 from complications of anorexia brought national attention to eating disorders. Critiques of the cultural imperative to be thin began to appear from feminists and larger Americans, who claimed that eating disorders, as well as discrimination, were the result of too much attention to one's weight.
Impact
Men as well as women were under intense social pressure to conform to a physical ideal that was increasingly slender during the 1980's. As a result, the decade saw a dramatic surge in the number of weight-loss products available; retooled foods, weight-loss plans, and diet counseling were popular ways to shed pounds. The financial success of these products failed to alleviate the nation's growing weight problem, however, as Americans tried to repair their diets with quick fixes rather than long-term solutions.

Further Reading
Belasco, Warren. Appetite for Change: How the Counterculture Took on the Food Industry. Ithaca, N.Y.: Cornell University Press, 1989. Includes information on light foods and frozen diet meals introduced during the 1980's.
Brumberg, Joan Jacobs. Fasting Girls: The History of Anorexia Nervosa. Cambridge, Mass.: Harvard University Press, 1988. Cultural history of anorexia that seeks to explain why the disease became prevalent in the 1980's.
Chernin, Kim. The Obsession: Reflections on the Tyranny of Slenderness. New York: Harper & Row, 1981. One of the first feminist critiques of the obsession with thinness.
Fraser, Laura. Losing It: America's Obsession with Weight and the Industry That Feeds on It. New York: Dutton, 1997. A magazine writer's take on the weight-loss industry of the 1980's and 1990's. Includes profiles of several diets popular in the 1980's.
Levenstein, Harvey. Paradox of Plenty: A Social History of Eating in Modern America. New York: Oxford University Press, 1993. Contains an overview of trends in 1980's food styles, including light foods and dieting.
Schwartz, Hillel. Never Satisfied: A Cultural History of Diets, Fantasies, and Fat. New York: Anchor Books, 1986. Historical overview of dieting, including diets of the 1980's.
Seid, Roberta Pollack. "Why Thin Is Never Thin Enough." In Never Too Thin: Why Women Are at War with Their Bodies. New York: Prentice Hall, 1989. Critical examination of the thinness trend in the 1980's.
Shelly McKenzie

See also
Aerobics; Aspartame; Food trends; Simmons, Richard.
■ Disability rights movement Definition
Social movement to win and protect rights of equal access and equal treatment for people with physical and mental disabilities
The disability rights movement emerged in the United States during the 1970’s, and it gained momentum in the 1980’s, despite federal governmental challenges and setbacks in federal courts. Disability rights activists had reason to be both optimistic and concerned in the early 1980’s. United Nations resolutions made 1981 the International Year of Disabled Persons and 1982-1993 the Decade of Disabled Persons. The Independent Living Movement took hold globally, and governments in developed and developing countries made progress in their disability policies. Prospects in the United States were less encouraging. President Ronald Reagan’s administration sought to reduce the federal government’s size and spending, which endangered legislative gains disability rights activists had made in the 1970’s. At risk were the 1973 Rehabilitation Act, which banned disability-based discrimination in federally funded institutions and programs, and the 1975 Education for All Handicapped Children Act (later known as Individuals with Disabilities Education Act or IDEA), which required public education to take place in the least restrictive feasible environment. Responses to Federal Resistance
Organizations such as the Disability Rights Education and Defense Fund (DREDF) began working against President Reagan’s policies during his first year in office (1981). Citizens who wrote letters to elected officials to oppose weakening disability rights laws were crucial to the lobbying campaign. Such efforts earned disability rights advocates an advantage by the decade’s midpoint: Vice President George H. W. Bush unexpectedly began to support some of the movement’s demands, and the Reagan administration reduced somewhat its resistance to regulation in the context of disability rights. The Rehabilitation Act and IDEA survived, and several new measures became law, among them the Employment Opportunities for Disabled Americans Act (1986), the Fair Housing Act Amendments (1988), and the Civil Rights Restoration Act (1988). The judicial system was another key front in the 1980’s disability rights struggle, with federal courts
sometimes limiting the scope and application of laws. Activists were disappointed by U.S. Supreme Court decisions regarding IDEA (Board of Education of the Hendrick Hudson Central School District v. Rowley, 1982) and the Rehabilitation Act (Bowen v. American Hospital Association, also known as the "Baby Jane Doe" case, 1986). Results were more favorable in federal appeals and circuit courts, as in ADAPT v. Skinner (1989), which improved public transportation accessibility, and Daniel R. R. v. State Board of Education (1989), which strengthened IDEA.

Prominent Organizations and Leaders
A proliferation of new organizations reflected the movement's energy and diversity. National Black Deaf Advocates (founded 1980) and the Association of Late Deafened Adults (founded 1987) fortified the deaf and hard-of-hearing communities. The alliance of feminism and disability rights grew stronger with the creation of the Networking Project on Disabled Women and Girls and the Womyn's Braille Press (both 1980). Concrete Change (founded 1986) worked for accessibility in public housing. American Disabled for Accessible Public Transit (ADAPT, now known as American Disabled for Attendant Programs Today) took radical action throughout the decade, expanding its agenda from public transportation to support for the Americans with Disabilities Act (1990, also known as the ADA).

The movement had multiple leaders rather than a single unifying figure. In 1983, Californians Edward Roberts and Judith Heumann built on their work in the 1970's by establishing the World Institute on Disability. Washington, D.C., was a major site for activism, with Patrisha Wright of DREDF and Evan Kemp of the Disability Rights Center fighting against the Reagan administration and for the ADA. Another agent of change in Washington was Justin Dart, whose personal experience with polio, financial wealth, and government connections were indispensable to the movement. At Washington's Gallaudet University, where the curriculum is designed for deaf and hard-of-hearing persons, student leaders organized demonstrations that gave the institution its first deaf president in 1988.
Disability Culture and Disability Studies
Disability rights spokespersons asserted themselves in literature, journalism, performing and visual arts, and academia during the 1980’s. Movement periodicals included The Disability Rag (founded 1980, now titled
The Ragged Edge), Deaf Life (founded 1988), and Mouth: The Voice of Disability Rights (founded 1989). The San Francisco Bay Area, a center for progressive causes, was home to the Wry Crips theater group (founded 1985) and the AXIS Dance Troupe (founded 1987). Influenced by ethnic and women's studies programs, scholars with disabilities brought their experiences and perspectives into the humanities. Sociologist Irving Zola, a wheelchair user with polio, helped found the Society for Disability Studies in 1982. Contributions from feminist and gay and lesbian scholars were especially helpful in making disability studies a force for intellectual inquiry and social change.

Impact
The 1980's was a critical decade for the disability rights movement. Working against formidable odds, individually and in groups, activists sustained the progress of the 1970's and broke ground for greater achievements in the 1990's. Although alliances with nondisabled citizens from all levels of U.S. society were invaluable, people with disabilities were most interested in setting their own agendas. By learning from and forming coalitions with similar movements for inclusion, participation, justice, and equal opportunity, the disability rights movement brought positive change to all areas of public life in the United States during the 1980's.

Subsequent Events
Disability rights activism in the 1980's led to the ADA's passage in 1990. The ADA had bipartisan support in Congress, and President George H. W. Bush signed the measure into law enthusiastically. Still, many conservative politicians and business interests resented the ADA, and it did not always fare well with the U.S. Supreme Court. The 1990's saw fewer public demonstrations for disability rights, but legislative and judicial advocacy remained strong, as did disability culture and studies.

Further Reading
Barnartt, Sharon N., and Richard K. Scotch. Disability Protests: Contentious Politics, 1970-1999. Washington, D.C.: Gallaudet University Press, 2001. Explores activism from political science and sociological perspectives.
Fleischer, Doris Zames, and Frieda Zames. The Disability Rights Movement: From Charity to Confrontation. Philadelphia: Temple University Press, 2001. Scholarly but accessible history of activism, legislation, and culture.
Mairs, Nancy. Plaintext: Essays. Tucson: University of Arizona Press, 1986. The author writes candidly and passionately about her experiences with multiple sclerosis.
Shapiro, Joseph P. No Pity: People with Disabilities Forging a New Civil Rights Movement. New York: Times Books, 1993. A journalist's sympathetic and wide-ranging overview of the movement.
Shaw, Barrett, ed. The Ragged Edge: The Disability Experience from the Pages of the First Fifteen Years of "The Disability Rag." Louisville, Ky.: Advocado Press, 1993. Anthology of journalism and creative writing from a leading disability rights publication.
Zola, Irving K. Missing Pieces: A Chronicle of Living with a Disability. Philadelphia: Temple University Press, 1982. A disability studies pioneer describes living in a Netherlands community designed for people with disabilities.
Ray Pence

See also
Bush, George H. W.; Congress, U.S.; Feminism; Gallaudet University protests; Homosexuality and gay rights; Reagan, Ronald; Supreme Court decisions; United Nations; Women’s rights.
■ Disposable cameras Definition
Single-use box cameras with preloaded film and focus-free lenses
Disposable cameras, usually loaded with twenty-fourexposure rolls of color print film, became instant hits with consumers. The simple cameras appealed to those who preferred easy-to-use technology, who needed a camera at a moment’s notice, who preferred a less expensive or lighterweight camera for use during outdoor activities, or who wanted a simple starter camera for a child. In 1986, one century after the Eastman Company’s “You press the button, we do the rest” slogan opened the door to amateur photography, Fujifilm introduced the first disposable, or single-use, camera, the QuickSnap. One year later, Kodak introduced its own single-use camera, the Fling. The two easy-touse and inexpensive plastic cameras soon became popular with the consumers of the 1980’s, who by this time were demanding products that were affordable and readily available. The single-use camera, which could be purchased at virtually any retail store, became a popular gadget during a time when
the marketplace was buzzing with high-tech products inspired by an expanding high-tech society. The cameras were especially popular with youth and young adults, inspiring a new craze for taking snapshots. Kodak had introduced its first camera for nonprofessionals in 1888, and in 1900 it had developed its even more popular Brownie camera, which sold for only one dollar. Shooting snapshots became a part of everyday life in the twentieth century, leading to further developments in camera and film technology, including the QuickSnap and the Fling in the 1980's. Like the early Kodak cameras, the 1986 single-use cameras also involved little more than "pressing the button" and letting the manufacturer "do the rest." However, the single-use cameras of the 1980's were not returned to the consumer; only the photographs were returned, while the cameras were often partially recycled. Some were even remanufactured and then resold with a new lens and, for models with a flash, new batteries. This practice led Kodak, Fujifilm, and others to warn of potential problems with "used" single-use cameras. The Fujifilm camera used Super HR 400 35mm color print film. It came with a thirty-five-millimeter f/11 lens and a single shutter speed of 1/100 second. The Kodak camera was equipped with Kodacolor VR-G 200 print film in 110 format. It had a twenty-five-millimeter f/8 lens and a shutter speed of 1/120 second. Within a few years, various models of single-use cameras were made for underwater use or for use in rainy or damp conditions. Built-in flashes allowed for indoor shots. Some cameras could take panoramic shots or pictures in 3-D. Soon, they could take black-and-white photos and even instant Polaroid "Popshots." In the early twenty-first century, the cameras were offered in digital format, and single-use cameras also were sold through vending machines placed in what Kodak called "point of picture" locations.
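Because the aperture and shutter speed described above were fixed, each camera offered exactly one exposure setting. As a rough, textbook-style check (the exposure-value formula is standard photographic arithmetic, not something given in this entry), the published specifications work out as follows:

```latex
% Exposure value for aperture N (f-number) and shutter time t in seconds:
\[
  \mathrm{EV} = \log_{2}\frac{N^{2}}{t}
\]
% QuickSnap (f/11 at 1/100 second) versus Fling (f/8 at 1/120 second):
\[
  \log_{2}\frac{11^{2}}{1/100} \approx 13.6
  \qquad
  \log_{2}\frac{8^{2}}{1/120} \approx 12.9
\]
```

The two designs thus sat within about half a stop of each other, and both leaned on the wide exposure latitude of color print film, rather than any adjustment by the user, to cope with scenes brighter or darker than that single setting.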
Impact The single-use camera introduced to 1980's consumers provided ease of use, affordability, and convenience. The cameras were simple and straightforward, requiring nothing more from the consumer than pointing and shooting; they were inexpensive (costing around ten dollars); and they were convenient, available everywhere. The QuickSnap and the Fling helped reinvigorate amateur photography, which, even into the 1980's, had a reputation as a hobby for those with money and technical savvy. To be able to purchase a camera for little more than the cost of two movie tickets and immediately begin taking pictures was a milestone in the history of photography, and it marked 1986 and 1987 as years to remember.
Further Reading
Ford, Colin, and Karl Steinorth, eds. You Press the Button, We Do the Rest: The Birth of Snapshot Photography. London: D. Nishen, 1988. Medintz, Scott. “Point, Shoot, Toss.” Money, July, 1999, 143. West, Nancy Martha. Kodak and the Lens of Nostalgia. Charlottesville: University Press of Virginia, 2000. Desiree Dreeuws See also Camcorders; Fads; Photography.
■ DNA fingerprinting Definition
Technique using human genetic material for forensic identification
DNA fingerprinting revolutionized human identification and the field of forensic science. It enabled law enforcement agents to identify many criminals they otherwise could not have caught, as well as exonerating innocent people who had been wrongly convicted of crimes before the technique was developed.
DNA fingerprinting, also referred to as DNA typing or DNA profiling, is a technique developed by Alec Jeffreys, an English geneticist at the University of Leicester. When Jeffreys digested genomic deoxyribonucleic acid (DNA) into small fragments using enzymes called restriction endonucleases, he discovered that the genome contained short pieces of DNA that were repeated many times and dispersed throughout the entire genome. Jeffreys realized that since the number of repeats differed from one individual to the next, people could be identified on the basis of the resulting patterns found in their DNA. Since restriction endonucleases were used to produce the DNA fragments and since the number of repeated units varied from one person to another, Jeffreys's technique was often referred to as restriction fragment length polymorphism (RFLP) and as variable number of tandem repeats (VNTR).
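The matching logic that makes such identification possible can be sketched in a few lines of code. The sketch below is purely illustrative: the locus names and repeat counts are invented, and real forensic comparison also involves measurement tolerances and population-frequency statistics that are omitted here.

```python
# Illustrative sketch of VNTR profile comparison (invented data).
# A profile maps each examined locus to the number of tandem repeats
# observed there; two samples are consistent only if every locus agrees.

def profiles_match(profile_a: dict, profile_b: dict) -> bool:
    """Return True only if both profiles cover the same loci
    and report identical repeat counts at each one."""
    if profile_a.keys() != profile_b.keys():
        raise ValueError("profiles must be typed at the same loci")
    return all(profile_a[locus] == profile_b[locus] for locus in profile_a)

# Hypothetical example: evidence sample versus two reference samples.
evidence  = {"locus_A": 14, "locus_B": 9,  "locus_C": 31}
suspect_1 = {"locus_A": 14, "locus_B": 9,  "locus_C": 31}
suspect_2 = {"locus_A": 14, "locus_B": 12, "locus_C": 31}

print(profiles_match(evidence, suspect_1))  # True  -> consistent
print(profiles_match(evidence, suspect_2))  # False -> excluded
```

Even a single mismatched locus excludes a suspect, which is why the technique proved as powerful for exoneration, as in the Dotson case below, as for conviction.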
Alec Jeffreys is credited with developing the technique of DNA fingerprinting. (David Parker/Science Photo Library)
DNA profiling is extensively used by forensic scientists for human identification. The first practical application of DNA profiling occurred in the United Kingdom in an immigration case, while the first forensic application of DNA profiling in the United States was in 1987 in Florida. In that year, Tommie Lee Andrews was tried for committing a burglary and rape that had concluded a crime spree of twenty-three rapes stretching back to May, 1986. After a lengthy hearing, DNA evidence was admitted at the trial, which ended in a hung jury. Andrews was retried and convicted on November 6, 1987. On August 14, 1989, DNA profiling was first used to overturn a conviction. In 1977, sixteen-year-old Cathleen Crowell claimed that she had been raped and from police photographs identified Gary Dotson as her attacker. In May, 1979, Dotson was convicted. In 1985, Crowell, then known by her married name of Webb, recanted her story, revealing that she had fabricated the rape because she had had sex with her boyfriend and was afraid of becoming pregnant. Prosecutors did not believe Webb's new story, but Dotson was finally exonerated when it was demonstrated that DNA left in the young woman's underpants had not been his.
The RFLP/VNTR method of DNA profiling is time-consuming, and the exact size of the DNA bands is difficult to determine. More rapid and more exact methods later replaced the original RFLP/VNTR method, making DNA fingerprinting more useful and more reliable as a forensic tool.
Impact DNA profiling makes the prosecution of suspects (especially those such as rapists who leave DNA samples at the scene) and the identification of unknown persons easier and virtually irrefutable. It gained rapid acceptance in the United States and elsewhere by the late 1980's. All fifty U.S. states now have laws requiring the collection of DNA samples from convicted sex offenders. In 1990, the Federal Bureau of Investigation (FBI) established a database of DNA fingerprints called the Combined DNA Index System (CODIS).
Further Reading
Butler, John M. Forensic DNA Typing. Boston: Elsevier, 2005. Lee, Henry, and Frank Tirnady. Blood Evidence: How DNA Is Revolutionizing the Way We Solve Crimes. New York: Basic Books, 2003. Lubjuhn, Thomas, and Jörg T. Epplen. DNA Profiling and DNA Fingerprinting. Basel, Switzerland: Birkhäuser, 1999. Spencer, Charlotte. Genetic Testimony: A Guide to Forensic DNA Profiling. Upper Saddle River, N.J.: Prentice-Hall, 2003. Charles L. Vigue See also America's Most Wanted; Crime; Genetics research; Rape.
■ Do the Right Thing Identification Influential African American film Director Spike Lee (1957) Date Released June 30, 1989
Written, directed, and produced by Spike Lee, Do the Right Thing brought into focus racial tensions throughout the United States in the 1980's through its raw dialogue and expressionistic style.
Do the Right Thing was the third feature-length film written, directed, and produced by Spike Lee. Born
in Atlanta and nicknamed Spike by his mother, Shelton Jackson Lee was raised in the Bedford-Stuyvesant (Bed-Stuy) neighborhood of Brooklyn, New York, where the action of Do the Right Thing takes place. Lee received a B.A. in mass communications from Morehouse College, a private, all-male, historically African American, liberal arts college in Atlanta before earning an M.F.A. in film from New York University in 1982.
Plot Summary and Major Themes Do the Right Thing opens in the early morning of what is predicted to be the hottest day of the summer. Radio deejay Mister Señor Love Daddy (Samuel L. Jackson), who serves as narrator, sets the scene. One by one, audience members are introduced to the neighborhood's residents, including Sal (Danny Aiello), an Italian American who owns a local pizza parlor. He operates the restaurant with his two sons, the domineering Pino (John Turturro) and Vito (Richard Edson), the frequent target of Pino's abuse. Mookie (Spike Lee), a likable slacker, delivers pizzas for Sal and scrounges to earn enough money to take care of his Puerto Rican girlfriend Tina (Rosie Perez) and their infant son Hector (Travell Lee Toulson). Despite changes in the racial and ethnic makeup of the block, Sal's Pizzeria has remained a fixture in the neighborhood. Its proprietor takes great pride in the fact that his food has nourished the residents of the block over the years. Pino, however, harbors resentment toward many members of the community and would prefer to close up shop. Tensions rise when a young African American, Buggin' Out (Giancarlo Esposito), threatens to boycott the restaurant unless Sal puts some photographs of African Americans on the restaurant's walls, which currently display famous Italian Americans. Buggin' Out's confrontational black nationalism is balanced by the peacekeeping tendencies of the patriarch of the street, Da Mayor (Ossie Davis), a benign old drunk. In many ways, the characters of Buggin' Out and Da Mayor embody the divergent philosophies of two prominent civil rights activists from the 1960's: Malcolm X, who advocated the use of violence for self-protection, and Martin Luther King, Jr., who supported strategies of nonviolence. Throughout the film, a mentally retarded African American man named Smiley (Roger Guenveur Smith) tries to sell photographs of Malcolm X and Martin Luther King, Jr., to various people on the block. Most view Smiley
as a pest and try to ignore him, perhaps suggesting that the messages of the two slain civil rights heroes have also been largely ignored. Tempers reach the boiling point when an African American man, Radio Raheem, nicknamed for the huge portable stereo he supports on his shoulders, refuses to turn down his music as he orders a slice of pizza at Sal’s. The song that blares from his box is “Fight the Power” by the politically aware 1980’s rap group Public Enemy. This song, which calls for an active resistance by African Americans to white cultural hegemony, serves as an aural motif throughout the film. Its aggression is tempered, however, by the mellifluous jazz score composed by Bill Lee, Spike’s father, that rounds out the film’s softer moments. Radio Raheem, who wears gold jewelry that spells out LOVE across the knuckles of one hand and HATE across the other, eventually has his radio completely demolished by Sal’s baseball bat. The scene quickly turns chaotic. Caucasian police arrive on the scene and, in their attempt to restrain Radio Raheem, choke him to death with a nightstick. Upon witnessing this, Mookie, who has been advised by Da Mayor to “always do the right thing,” picks up a trash can and throws it through the front window of Sal’s Pizzeria, inciting a riot that leads to the total destruction of the restaurant. As the sun rises on the next day, Sal and Mookie make a tenuous reconciliation. The film closes by zooming in on a photograph of Martin Luther King, Jr., and Malcolm X on the charred wall of Sal’s restaurant. Finally, quotations encapsulating each leader’s philosophy are slowly scrolled down the screen. The film does not suggest whether or not Mookie did the right thing by smashing Sal’s window, nor does it suggest which civil rights leader’s message bears the greater truth. Stylistic Innovations and Awards
While the film provides an unflinching, realist portrayal of racial tensions, its overall style is better described as expressionistic. Shooting on location, Lee transformed several city blocks by painting Brooklyn brownstones with bright, hot colors. In terms of cinematography, the film makes use of striking canted angles that contribute to the mood of chaos and uncertainty. Do the Right Thing also contains an innovative montage in which characters of various races and ethnicities spew out a litany of racial epithets as if in direct response to a viewer’s provocation.
While Do the Right Thing won several important critics’ awards, including those of the New York, Chicago, and Los Angeles film critics’ associations, it received only two Academy Award nominations: Danny Aiello was nominated for Best Supporting Actor and Spike Lee for Best Original Screenplay. Neither Aiello nor Lee took home the Oscar. Impact
Do the Right Thing helped pave the way for socially conscious urban dramas such as Boyz 'N the Hood (1991) and Menace II Society (1993). These films were written and directed by African American filmmakers John Singleton and Albert and Allen Hughes, respectively. In the ensuing decade, Lee continued to make edgy films about provocative subjects, such as his essay on interracial relationships, Jungle Fever (1991), and the biopic Malcolm X (1992).
Further Reading
Fuchs, Cynthia, ed. Spike Lee Interviews. Oxford: University Press of Mississippi, 2002. Collects twentytwo interviews on topics including race, politics, and the media. Guerrero, Ed. Do the Right Thing. London: British Film Institute, 2002. Focuses on Spike Lee’s representation of race in Do the Right Thing and the rise of multicultural voices in filmmaking. Reid, Mark A., ed. Spike Lee’s “Do the Right Thing.” Cambridge, England: Cambridge University Press, 1997. This volume from the Cambridge Film Handbooks series contains essays that analyze Do the Right Thing from a variety of perspectives, as well as reviews by influential critics. Corinne Andersen See also Academy Awards; African Americans; Bonfire of the Vanities, The; Boom boxes; Film in the United States; Multiculturalism in education; Public Enemy; Racial discrimination.
■ Domestic violence Definition
Abuse, especially habitual abuse, of one family or household member by another
During the 1980’s, the inappropriate and damaging nature of family violence received increasingly powerful public recognition, although intervention by the police, courts, and other social services operated with little uniformity across the United States.
Evidence of domestic violence has been identified in Native American archaeological remains, and it was observed in American Indian cultures at the time of European contact, as well as in colonial America. The U.S. Constitution provided legal rights to citizens of the new republic, but those rights were curtailed by an English common law tradition that in many respects treated women as chattel of their fathers, brothers, or husbands. While the legal rights of all American women were vastly improved by the 1980's, American culture still retained traditional associations of women with the private, domestic sphere. On the average, men enjoyed greater political and economic power and privilege than did women. As a result, for many years domestic violence committed by men against women often received little public recognition as a social issue worthy of concern. It was frequently assumed, moreover, to affect only a minority of American women, who were even sometimes blamed for their abuse. The problem was seen as neither serious in individual cases nor endemic to American society.
Abuse in the 1980's The assumption that women were responsible for their own abuse—still prevalent in the 1980's—both caused and was used to excuse the frequently poor quality of police response, legal advocacy, and social services for victims of abuse. Some civil servants, however, were well aware of the plight of American women, and they invested great effort in promoting change. Partly as a result of these efforts, courts in most U.S. states by the mid-1980's had acquired the statutory authority to issue immediate restraining orders to protect abused women from their abusers. The first structured services offered to battered women in the United States were provided in the mid-1970's by community-based organizations founded by activists, usually informed by feminist theory, who acted as advocates on behalf of women. Such feminist advocates often believed that violence was a result of gender inequalities perpetuated by American patriarchal culture. Thus, they saw reform of the public, political sphere as necessary to counter violence occurring in the private, domestic sphere. The experiences of shelter staff exposed them to horrific accounts of male violence toward family members and also demonstrated the challenges of encouraging male-dominated governmental institutions, such as the police, the legal system,
and social assistance agencies, to help female victims of abusive men. These early shelters did, however, provide essential services to women and their children, who would otherwise have had no place of sanctuary to escape their violent home situations. During the 1980’s, U.S. organizations devoted to sheltering and representing the interests of battered women increased in number. Most were still operating according to feminist principles, but this gradually changed as the decade progressed. Shelter staffs also became increasingly professionalized, with formal educational qualifications. Higher levels of staff training were necessary both for shelter staff and for others who worked with abused women. In many states, police and court workers were provided with specialized instruction to help them better understand domestic violence and to increase the effectiveness of their work with abused women and children. Community awareness of the impact of domestic violence also increased during the 1980’s, with media attention focused on the behavior of public figures, such as Mike Tyson, and with the publication of The Color Purple (1982), by Alice Walker. Increased public consideration of the issue meant that, in some states and court jurisdictions, dramatic modifications were made to legal codes and to police practice manuals. These changes afforded abused women increased levels of legal protection. They did little, however, to ensure consistent treatment of women across jurisdictions, and individual state legislatures, police departments, and courts continued to differ dramatically in their responses to domestic violence. Impact
Domestic violence, most commonly inflicted by men on women and children, has been a tremendous challenge both for American families and for the police, courts, and social services. While family abuse received greater recognition during the 1980’s and government agencies intervened to a greater degree than had been the case during the previous decade, there were still many aspects of abuse that remained unrecognized. These included same-sex domestic violence, violence directed against immigrant women, and the ethnic dimensions of the problem.
Subsequent Events In 1994, Congress passed the Violence Against Women Act, ensuring major changes in legal responses to family violence and funding a variety of interventions.
Further Reading
Dobash, R. Emerson, and Russell P. Dobash. Violence Against Wives. New York: Free Press, 1979. Makes the powerful argument, which received increasing acceptance during the 1980’s, that family violence results from the unequal, patriarchal structure of society. Raymond, Chris. “Campaign Alerts Physicians to Identify, Assist Victims of Domestic Violence.” JAMA: Journal of the American Medical Association 261, no. 7 (February 17, 1989): 963-964. Expresses a recognition by the medical establishment that hurt women should be questioned about the cause of their injuries and provided with support and options for escaping violence. Schneider, Elizabeth M. Battered Women and Feminist Lawmaking. New Haven, Conn.: Yale University Press, 2000. Extensive coverage of domestic violence and U.S. law, from the point of view of feminist theory. Sokoloff, Natalie J., with Christina Pratt, eds. Domestic Violence at the Margins: Readings on Race, Class, Gender, and Culture. New Brunswick, N.J.: Rutgers University Press, 2005. Comprehensive collection of essays covering the effects of domestic violence on minority populations in the United States. Sullivan, Cris M., and Tameka Gillum. “Shelters and Other Community-Based Services for Battered Women and Their Children.” In Sourcebook on Violence Against Women, edited by Clare M. Renzetti, Jeffrey L. Edleson, and Raquel Kennedy Bergen. Thousand Oaks, Calif.: Sage, 2001. History of the American shelter movement and other organizations providing services to battered women. Summers, Randal W., and Allan M. Hoffman. “United States.” In Domestic Violence: A Global View, edited by Randal W. Summers and Allan M. Hoffman. Westport, Conn.: Greenwood Press, 2002. Catalogs the U.S. government’s response to domestic violence, and also includes legal definitions, cultural trends, victim and perpetrator characteristics, and the effects of abuse on children. Compares these statistics to those of other nations. Susan J. Wurtzburg See also Color Purple, The; Crime; Feminism; Marriage and divorce; Rape; Tyson, Mike; Women’s rights.
■ Doppler radar Definition
A system that utilizes the Doppler effect to measure the velocity of moving objects
The development of Doppler radar systems during the 1980's led to myriad applications, ranging from improved weather prediction to improved air defense systems.
The primary use of Doppler radar is to distinguish between stationary and moving objects by using the Doppler effect, which occurs whenever there is relative motion between a radar wave's source and an observer. During the late 1970's and into the 1980's, computer systems and processing techniques improved, gaining the ability to analyze the frequency content of radar signals using mathematical operations known as fast Fourier transforms. As a result, coherent pulsed Doppler radar systems were developed that could determine the velocity of moving targets. Pulsed radar systems use one antenna that alternately transmits pulses and receives their reflections. These reflected waves are shifted from their initial frequency by the Doppler effect. If the object from which they are reflected is moving away from the transmitter, the waves' frequency is lowered. If the object is moving toward the transmitter, the frequency is increased. Thus, by comparing the initial and reflected frequencies of radar waves, computers can determine the velocity and direction of movement of target objects.
In the 1980's, improved Doppler radar systems led to improvements in weather forecasting, air traffic control, and air defense. During the 1980's and 1990's, the National Weather Service installed Doppler radar systems throughout the United States. Radar waves transmitted from these systems were scattered and reflected by objects in the air, including raindrops, snow crystals, hailstones, and dust. Improved computer systems could use Doppler frequency-shift information to determine the speed and direction of winds blowing around these airborne objects. During the mid- to late 1980's, the Next Generation Weather Radar program (NEXRAD), or Weather Surveillance Radar-1988 Doppler (WSR-88D), advanced Doppler radar to the forefront of efforts to detect severe weather events that could threaten life and property. The presence, speed, and direction of severe weather elements, such as turbulence, violent thunderstorms, tornadoes, hurricanes, and lightning, were determined from Doppler radar measurements.
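The frequency comparison described above follows the classical two-way Doppler relation; the worked figures below are a textbook-style illustration with assumed numbers, not measurements quoted in this entry. For a target moving toward or away from the radar at radial speed v_r, the returned frequency differs from the transmitted frequency f_0 by approximately:

```latex
% Two-way Doppler shift for a radar target (v_r << c, where c is the speed of light):
\[
  f_{d} \approx \frac{2\, v_{r}\, f_{0}}{c}
\]
% Assumed illustration: an S-band weather radar (f_0 = 3 GHz)
% observing rain moving radially at 30 m/s:
\[
  f_{d} \approx \frac{2 \times (30\ \mathrm{m/s}) \times (3\times 10^{9}\ \mathrm{Hz})}{3\times 10^{8}\ \mathrm{m/s}} = 600\ \mathrm{Hz}
\]
```

Detecting a shift of a few hundred hertz against a multigigahertz carrier is precisely the measurement problem that the coherent processing and fast Fourier transform techniques mentioned above were developed to solve.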
Impact The advancements made in Doppler radar systems during the 1980's provided meteorologists with the ability to ascertain flight conditions and make weather forecasts taking into account atmospheric flow patterns and wind motions in storms. They also increased their ability to determine the location and intensity of precipitation. Improved Doppler radar technology could detect low-level wind shear and microburst hazards in the vicinity of airports, as well as detecting and monitoring the movement and development of severe storms. The new technology improved air traffic control systems and brought them to higher levels of automation. During the 1980's, meteorologists extended their ability to predict weather to about a week in advance. Continued improvements in the precision of Doppler radar systems would increase the forecast interval to fourteen days and longer. In addition, Doppler radar laser guns, used to help enforce roadway speed limits, were added to the law-enforcement arsenal in the late 1980's.
Further Reading
Doviak, Richard J., and Dusan S. Zrnic. Doppler Radar and Weather Observations. 2d ed. Mineola, N.Y.: Dover, 2006. Schetzen, Martin. Airborne Doppler Radar. Reston, Va.: American Institute for Aeronautics and Astronautics, 2006. Winslow, Jennifer L. Comparisons of Observations by WSR-88D and High Resolution Mobile Doppler Radar in Tornadoes and a Hurricane. Washington, D.C.: Storming Media, 1998. Alvin K. Benson See also
Astronomy; Computers; Science and technology.
■ Douglas, Michael Identification American actor and producer Born September 25, 1944; New Brunswick, New Jersey
Douglas in the 1980's proved that he could come back from years of career inactivity and that he had acting talent as well as film-production skills. He starred in several films that came to define both the cinema and the American culture of the 1980's.
Michael Douglas celebrates winning the Academy Award for Best Actor for his performance in Wall Street at the April, 1988, ceremony in Los Angeles, California. (AP/Wide World Photos)
Michael Douglas was considered a likable, if lightweight, leading man at the beginning of the 1980's. He had been famous for a celebrity father (Kirk Douglas), a hit television show (ABC's The Streets of San Francisco, in which he co-starred with Karl Malden from 1972 to 1976), and an award-winning motion picture (1975's One Flew over the Cuckoo's Nest, which he produced). However, the cultural capital amassed through those distinctions seemed largely to have dissipated, and few people in the early 1980's took Douglas seriously as an actor. By the end of the decade, all that had changed: He won Hollywood's top acting award for a role that many said epitomized the 1980's. Between 1980 and 1983, Douglas was mostly absent from the public scene, as he recovered from a serious skiing accident. In 1984, however, he returned to the limelight, co-producing and starring in an old-fashioned but tongue-in-cheek action-adventure, Romancing the Stone. A turning point for his career, the film was critically praised and a box-office hit. It revived his career both as an actor and as a producer. He was an executive producer of Starman the same year. In 1987, Douglas broadened his acting range from journeyman actor to respected star with two hit motion pictures. Co-starring with Glenn Close, Douglas portrayed weak adulterer Dan Gallagher in Fatal Attraction, which became a worldwide success. He followed that performance by starring in Oliver Stone's Wall Street, as devious stock manipulator Gordon Gekko. For the latter role, in which he personified some of the worst of the decade's economic disparities, Douglas won an Academy Award for Best Actor. His other cinematic efforts of the decade included It's My Turn (1980), The Star Chamber (1983), A Chorus Line (1985), Black Rain (1989), and The War of the Roses (1989). "Movies are magic," Douglas told Barbara Paskin of Ladies' Home Journal. "They have brought me much more than real-life emotions have. We all know how life is going to end. Not movies."
Impact Although Douglas grew both personally and professionally during the 1980's, he probably had his biggest impact on the decade with his portrayal of stock trader Gordon Gekko in 1987's Wall Street, with the memorable line that seemed to sum up society's fascination with the stock market: "Greed is good."
Further Reading
Dougan, Andy. Michael Douglas: Out of the Shadows. London: Robson Books, 2003. Lawson, Alan. Michael Douglas. London: Robert Hale, 1993. McGuigan, Cathleen, and Michael Reese. “A Bull Market in Sin.” Newsweek 110, no. 24 (December 14, 1987): 78-79. Parker, John. Michael Douglas. London: Headline, 1994. Bill Knight See also
Academy Awards; Action films; Close, Glenn; Fatal Attraction; Film in the United States; Television; Turner, Kathleen; Wall Street.
■ Drug Abuse Resistance Education (D.A.R.E.) Definition
A drug use prevention program that educates children about drugs and ways to resist peer pressure Date Began in Los Angeles in 1983
D.A.R.E. became a national program that aimed to prevent children from abusing drugs in the first place. It therefore targeted fifth and sixth grade students.
Drug Abuse Resistance Education (D.A.R.E.) was founded in Los Angeles in 1983, the result of a joint effort by the Los Angeles Police Department and the Los Angeles Unified School District. In 1987, D.A.R.E. America, a nonprofit corporation, sought to nationalize the D.A.R.E. program, create a national training program for officers, and solicit federal funding. In the late 1980's, the Bureau of Justice Assistance funded five regional D.A.R.E. training centers. By the early twenty-first century, nearly three-quarters of all U.S. school districts had a D.A.R.E. program in place. Although the D.A.R.E. programs varied widely, they all shared some specific components, including teaching elementary school students about the dangers of illegal drug use and how to resist peer pressure to use drugs. The program focused primarily on marijuana, tobacco, and alcohol. Drug Abuse Resistance Education programs typically consisted of seventeen forty-five-minute lessons, taught once a week by a police officer who received—at the least—an intensive two-week, eighty-hour training program on how to teach the D.A.R.E. curriculum. The D.A.R.E. curriculum included a focus on improving children's self-esteem, coping skills, assertiveness, communication skills, and decision-making skills. Students also learned to identify positive alternatives to drug use. They received information and built skills through a variety of activities, including question-and-answer sessions, role playing, workbook exercises, and group discussions. Upon completion of the program, a graduation ceremony was conducted.
Impact The D.A.R.E. program that began in the early 1980's expanded to most school districts across the country. However, while D.A.R.E. was a better crime-prevention program than the Just Say No campaign mounted by the Ronald Reagan administration and First Lady Nancy Reagan, the majority of
studies evaluating D.A.R.E. programs revealed that they did not reduce drug use. Initial evaluations of the program did show positive results, but later studies indicated that people who went through D.A.R.E. programs were actually more likely to use drugs later in life than were people who did not. The program remained widely praised across the country, however, and despite the lack of evidence of its effectiveness, hundreds of millions of dollars were invested in it. Further Reading
Bukoski, William. Meta-Analysis of Drug Abuse Prevention Programs. Rockville, Md.: NIDA Research, 1997. Caulkins, Jonathan, James Chiesa, and Shawn Bushway. An Ounce of Prevention, a Pound of Uncertainty: The Cost-Effectiveness of School-Based Drug Prevention Programs. Santa Monica, Calif.: RAND, 1999. Goldberg, Raymond. Taking Sides: Clashing Views in Drugs and Society. Guilford, Conn.: McGraw-Hill/ Dushkin, 2005. Sheryl L. Van Horne See also
Crime; Just Say No campaign.
■ Dukakis, Michael Identification
Governor of Massachusetts from 1975 to 1979 and 1983 to 1991 and Democratic presidential candidate in 1988 Born November 3, 1933; Brookline, Massachusetts
Dukakis was the first Greek American to win a presidential nomination. His campaign helped define the U.S. public's perceptions of liberal politicians in the late 1980's and the 1990's.
Michael Dukakis grew up in a professional middle-class family, the son of Greek immigrants to the United States. His father was a doctor and his mother a schoolteacher. Dukakis attended Swarthmore College in Pennsylvania, earning a bachelor of arts degree in 1955. He then enlisted in the United States Army, serving as an intelligence analyst. After two years in the Army, he returned to Massachusetts to complete his education. In 1960, Dukakis graduated from Harvard Law School, and he began practicing law in Boston. Three years later, he married Katherine "Kitty" Dickson.
Dukakis began his political career in local government. He went on to serve in the state legislature and to mount a failed campaign for lieutenant governor, before being elected governor of Massachusetts in 1974. His state's economic woes caused him to lose a bid for reelection four years later. After a few years in academia, Dukakis returned to the governor's mansion in 1983 and helped bring the state out of economic turmoil. He won two more terms and in 1986 was named the most effective governor in the nation by the National Governors' Association. In 1988, Dukakis coauthored a book detailing his state's economic recovery—the so-called Massachusetts Miracle—and touting his abilities as a presidential contender.
The 1988 Presidential Campaign In 1988, Dukakis entered the race for the Democratic presidential nomination. The crowded field included U.S. senator Al Gore of Tennessee, former senator Gary Hart of Colorado, and the outspoken preacher and civil rights activist Jesse Jackson. Dukakis secured the nomination and named Senator Lloyd Bentsen of Texas as his running mate. Throughout the campaign, Dukakis described himself as the quintessential realization of the American Dream—a child of immigrants who could become president of the United States. He used the Neil Diamond song "Coming to America" as his campaign theme. The 1988 presidential race was a contest between the little-known Dukakis and Ronald Reagan's heir apparent, Vice President George H. W. Bush. Dukakis attempted to run a positive campaign, but he was brutally assailed by the opposition. The Republicans labeled him a typical northeastern, far-left Democrat. A member of the American Civil Liberties Union (ACLU) and self-proclaimed "proud liberal," Dukakis proved unable to respond effectively when the Bush campaign began using the word "liberal" as an accusation. The governor was accused of being soft on crime, and during a presidential debate, he seemed cold and callous when addressing a loaded question about capital punishment. When the moderator asked Dukakis if he would still oppose the death penalty if a criminal raped and murdered his wife, the governor defended his position without emotion. The Republicans also attacked Dukakis's policies as governor of Massachusetts. Convicted murderer William Horton had escaped from the Massachusetts justice system when he was released on a weekend furlough and never came back. He later raped a woman in Maryland. Horton's story was used repeatedly in anti-Dukakis television commercials.
Massachusetts governor Michael Dukakis attends a presidential campaign rally at the University of California, Los Angeles, on the eve of the 1988 general election. (Hal O'Brien/CC BY-SA 3.0)
Throughout the campaign, Dukakis struggled with his image. With his short stature and Greek features, the governor did not impress television viewers, nor was he a passionate speaker. He often seemed cold and aloof to voters. Furthermore, Dukakis lacked real military experience. His Army service paled in comparison to Bush's exploits as the youngest naval aviator in World War II. In an effort to show the governor as a worthy commander in chief, Dukakis was photographed wearing a helmet and driving a tank. Rather than looking presidential,
however, Dukakis looked comical and out of place. The photograph was used by the Republicans to further ridicule Dukakis. Dukakis also lacked foreign policy experience. His only political experience was in local and state government. Meanwhile, his opponent reminded voters of his eight years in the Reagan administration and former position as U.S. director of central intelligence. In the end, Dukakis could not overcome his lackluster image. When Governor Dukakis campaigned with his running mate, the tall, distinguished Bentsen often seemed the better candidate for the presidency. Dukakis also failed to counter the Reagan mystique. Despite reminding Americans of the recent Iran-Contra scandal, Dukakis could not tarnish the Reagan-Bush image. Voters viewed Bush as the better choice for the presidency. Following the 1988 campaign, Dukakis returned to Massachusetts and finished his term as governor. He considered losing the presidential election one of his life’s biggest disappointments. After leaving public service, Dukakis took positions as a lecturer at the University of California, Los Angeles, and at Northeastern University in Massachusetts, teaching political science and public policy. Impact Governor Dukakis’s presidential campaign had lasting effects for his country and his party. Because he was unable convincingly to negate the pejorative sense of the word “liberal” as used by his opponent, that connotation tended to remain operative in U.S. national politics. Future contenders, rather than defend liberalism, began to distance themselves from the label altogether. Dukakis’s legacy in Massachusetts, however, remained associated with the Massachusetts Miracle and the successful recovery of the state’s economy. Further Reading
Dukakis, Michael, and Rosabeth Moss Kanter. Creating the Future: The Massachusetts Comeback and Its Promise for America. New York: Summit Books, 1988. Dukakis’s own account of his success in rescuing his state’s economy, written to prepare for his presidential run and argue for his qualifications to guide the national economy. Gaines, Richard, and Michael Segal. Dukakis: The Man Who Would Be President. New York: Avon, 1988. Another book published during the campaign, this biography sought to introduce Dukakis to voters.
Goldman, Peter, and Tom Mathews. The Quest for the Presidency, 1988. New York: Simon & Schuster, 1989. Postmortem of the 1988 presidential campaign. Polsby, Nelson W., and Aaron Wildavsky. Presidential Elections: Contemporary Strategies of American Electoral Politics. New York: Free Press, 1991. General work detailing presidential campaign successes and failures of the 1980’s. Rhonda L. Smith See also Bentsen, Lloyd; Bush, George H. W.; Elections in the United States, 1988; Horton, William; Reagan, Ronald; Reagan Democrats.
■ Dupont Plaza Hotel fire The Event
A massive fire at a hotel-casino along Puerto Rico's upscale north shore killed ninety-seven people Date December 31, 1986 Place El Condado Beach, San Juan, Puerto Rico
Deliberately set as part of an escalating contract dispute between hotel management and labor, the catastrophic New Year's Eve fire not only exposed the hotel's inadequate emergency preparedness but also underscored wider economic problems confronting the Caribbean tourist industry.
During December, 1986, the management of the Dupont Plaza Hotel, a luxury resort along Puerto Rico's Gold Coast, had tried unsuccessfully to renegotiate a contract with hotel workers, who were threatening to strike during the lucrative holiday season. The hotel had received menacing letters and even bomb threats, and disgruntled workers had set three small fires trying to unsettle hotel operations and encourage a favorable settlement. However, concerned guests had been reassured by management that the hotel had not been specifically threatened. When management would not concede to union demands for higher wages, the union voted at an emergency mid-afternoon meeting on New Year's Eve to strike at midnight. Less than ten minutes after the meeting, around 3:30 p.m., three workers—Héctor Escudero, Armando Jimenez, and José Francisco Rivera Lopez—angered over the impending strike and intending to cause property damage to the hotel, used cooking oil from the hotel's kitchens to start a fire in a second-floor storage room filled
with unused furniture. The furniture quickly caught fire, and its protective plastic wrapping gave off thick, toxic fumes. Within minutes, a massive fireball spread first to the adjacent ballroom and then to the lobby. Holiday tourists panicked as the superheated air and smoke from the fire threatened the secondfloor casino. Hotel security, long concerned with monitoring activity within the casino, had routinely chain-locked all doors other than the casino’s main entrance, which became engulfed in flames. Desperate to exit, guests hurled themselves out of windows or swarmed into elevators and stairwells, only to find the ground floor in flames. Within fifteen minutes,
ninety-seven people died, most from asphyxiation. More than 140 were injured. The three hotel workers were eventually found guilty of murder, arson, and conspiracy. In one of the largest civil lawsuits ever filed, the hotel was sued by more than two thousand plaintiffs seeking damages of almost $2 billion. The plaintiffs charged hotel management with failing to provide adequate warning and lacking sufficient emergency procedures (the hotel did not have fire sprinklers). The Dupont Plaza never reopened under that name. Instead, it was sold to the Marriott chain, completely refurbished, and reopened as the San Juan Marriott and Stellaris Casino.
A U.S. Coast Guard helicopter rescues trapped tourists from the roof of the Dupont Plaza Hotel during the December 31, 1986, fire that cost ninety-seven people their lives. (AP/Wide World Photos)
Impact In addition to triggering an industry-wide reform of fire-prevention measures and evacuation procedures, the terrorist attack on the Dupont Plaza underscored the disparity in the Caribbean between the wealthy tourists upon whom the region's tourism industry depends and the indigenous population that maintains that industry, often at wages just above the poverty line.
Further Reading
Dietz, James L. An Economic History of Puerto Rico. Princeton, N.J.: Princeton University Press, 1987. Monge, Jose Trias. Puerto Rico: The Trials of the Oldest Colony in the World. New Haven, Conn.: Yale University Press, 1999. Noon, Randall K. Engineering Analysis of Fires and Explosions. Boca Raton, Fla.: CRC Press, 1995. Joseph Dewey See also MGM Grand Hotel Fire; Terrorism; Unions.
■ Duran Duran Identification British New Wave band Date Formed in 1978
Duran Duran was instrumental in revolutionizing and popularizing synthesizer music, contributing to the unique sound of the 1980's. The group also produced exotic music videos and participated in the increasing trend of mass marketing popular bands.
Duran Duran was a British “new romantic” band that was a leading player in the 1980’s U.S. New Wave music scene. Guitar player John Taylor and keyboard player Nick Rhodes formed the band in 1978 in Birmingham, England. Guitar player Andy Taylor,
Duran Duran in January, 1980. From left: Andy Taylor (guitar), John Taylor (bass), Simon Le Bon (vocals), Nick Rhodes (keyboards), and Roger Taylor (drums). (Hulton Archive/Getty Images)
drummer Roger Taylor, and lead vocalist Simon Le Bon rounded out the main lineup through most of the decade. They were known for their synthesizer-driven music and elaborate music videos. They were also known for their good looks and deliberate focus on style as well as music. Their popularity increased through coverage in teen magazines such as Smash Hits and Tiger Beat, and some drew comparisons between Duran Duran's American fans and those of the Beatles in the 1960's. The British press dubbed them the Fab Five, a deliberate echo of the Beatles' Fab Four moniker. The band reached the height of its fame in the mid-1980's and sold more than 70 million albums worldwide during its career. By 1984, it was featured on the cover of Rolling Stone magazine. The group's 1980's albums included Duran Duran (1981, rereleased in the United States in 1983), Rio (1982), Seven and the Ragged Tiger (1983), Arena (1984), Notorious (1986), Big Thing (1988), and Decade: Greatest Hits (1989). Its 1980's U.S. hit singles included "Rio," "Hungry Like the Wolf," "Is There Something I Should Know?," "Union of the Snake," "New Moon
on Monday," "The Reflex," "The Wild Boys," "A View to a Kill," "Notorious," "I Don't Want Your Love," "Do You Believe in Shame?," and "All She Wants Is." "The Reflex" was the band's first U.S. number one single, and "A View to a Kill" was the first theme song from a James Bond movie to reach number one on the U.S. charts. In 1985, Duran Duran's exhausted members took a hiatus from the full band and worked in smaller groups on several outside projects. Nick Rhodes, Simon Le Bon, and Roger Taylor formed Arcadia, while John and Andy Taylor joined Robert Palmer and Tony Thompson to form Power Station. The full band reunited briefly to record "A View to a Kill" and to perform at the July 13, 1985, Live Aid concert at JFK Stadium in Philadelphia, Pennsylvania. Roger and Andy Taylor left the band the following year. The band then went through several changes in the ensuing years, with the original lineup briefly reuniting in the early twenty-first century.
Impact Duran Duran's members were video pioneers who rose to prominence in the United States
at the same time as MTV, each aiding the other. Videos such as “Hungry Like the Wolf” and “Rio,” shot in exotic locations on 35mm film rather than videotape, were a radical departure from other early videos that featured bands simply performing on a stage. Duran Duran was also one of the first bands to use video technology during live performances. The group’s style and technique had a lasting influence on the medium, and its members received a lifetime achievement award at the 2003 MTV Video Music Awards. Their synthesizer-driven music, catchy pop tunes, and focus on fashion and marketing also influenced later generations of artists and managers. Further Reading
Kallen, Stuart A., and Bob Italia. Rock in Retrospect: The 1980’s. Bloomington, Minn.: Abdo & Daughters, 1989. Mallins, Steve. Duran Duran: Notorious. London: Andre Deutsch, 2006. Martin, Susan. Duran Duran. New York: Wanderer Books, 1984. Marcella Bush Trevino See also MTV; Music; Music videos; New Wave music; Pop music; Synthesizers.
■ Dworkin, Andrea Identification
American feminist Born September 26, 1946; Camden, New Jersey Died April 9, 2005; Washington, D.C.
The persistent and aggressive actions of Andrea Dworkin to link pornography to violence against women helped strengthen the legal rights of women victimized by sex crimes.
Andrea Dworkin was a fervent and dedicated foe of male sexual dominance, particularly as expressed in pornography. Throughout the 1980's, Dworkin wrote prolifically and spoke passionately as an advocate for oppressed women. A harsh critic of the male-dominated worlds of publishing, politics, government, and big business, she became a target for retaliatory hate campaigns. She was also controversial within the feminist movement, and a few influential feminists criticized and publicly ridiculed Dworkin. Because of her lack of political tact, abrasive manner, and obesity, as well as claims by some that she was a lesbian, Dworkin presented an inviting target for some critics, who aimed where they believed it would do the most damage: They accused Dworkin of being a radical feminist solely because she was angry at herself for not being attractive to men. Critics were unaware or unconcerned that Dworkin's romantic partner and eventual spouse was a man. Dworkin, though wounded professionally and personally by these attacks, continued in her antipornography campaigns. In 1983, while teaching for a semester at the University of Minnesota, Dworkin and attorney Catharine MacKinnon wrote an ordinance that made producing or selling pornography a violation of women's civil rights. It was passed twice by the Minneapolis city council but vetoed by the mayor. The same ordinance was adopted by the city of Indianapolis, Indiana, in 1984 but was later ruled unconstitutional and overturned by an appeals court. In 1986, Dworkin's efforts against pornography found an ally in the U.S. Attorney General's office.
Andrea Dworkin addresses a federal commission on pornography in January, 1986. (AP/Wide World Photos)
The Attorney General’s Commission on Pornography, also known as the Meese Commission, was formed to study the effects of pornography and possible responses to those effects. Dworkin’s research and opinions were accepted by the committee. As a result, at least for a time, store owners were forced to remove pornographic magazines from high-visibility shelves and to relegate them instead to more obscure and protected store locations. Public awareness of the proliferation and possible consequences of overt pornography was heightened as a direct result of Dworkin’s effort. While Dworkin firmly believed that pornography resulted in harm and sometimes even death for women, she was not in favor of obscenity laws, considering them ineffectual. Instead, she advocated passage of federal civil rights laws for women who were sexually victimized, as well as a host of punitive laws against pornographers. Dworkin published a treatise on sexual intercourse (aptly titled Intercourse) in 1987. In it, she asserted that even classic literature often portrayed male-dominant sexual positions as a method of subjugating women. She was consequently incorrectly quoted as saying that all intercourse is rape, reinforcing the widely held perception that she hated men categorically and irrationally. Impact
Andrea Dworkin raised American awareness of hate crimes and sexual crimes against women. She encouraged government to extend civil rights protection for victims of unwanted and violent sex acts, laying the groundwork for hallmark legislation in the future.
Further Reading
Dworkin, Andrea. Heartbreak: The Political Memoir of a Feminist Militant. New York: Basic Books, 2002. Steger, Manfred, and Nancy Lind. Violence and Its Alternatives: An Interdisciplinary Reader. New York: St. Martin's Press, 1999. Twyla R. Wells See also Feminism; Handmaid's Tale, The; Meese, Edwin, III; Pornography; Rape; Sexual harassment.
■ Dynasty Identification Prime-time television soap opera Producers Aaron Spelling (1923-2006), Esther
Shapiro (1934), and Richard Shapiro (1934) Date Aired from January 12, 1981, to May 10, 1989
Conceived as ABC's answer to CBS's successful prime-time soap opera Dallas, Dynasty became a weekly ritual for more than 100 million viewers in more than seventy countries. Viewers rearranged their schedules to watch the program, and Dynasty parties were not uncommon, as fans got together to cheer for their favorite characters and gasp at the twists in the plot.
Set for the most part in Denver, Colorado, Dynasty revolved around Blake Carrington, a self-made millionaire played by John Forsythe. The show portrayed both Blake's business dealings and his personal relationships with family members. Of particular note were his continuing conflicts with his ex-wife, Alexis (Joan Collins), and his love for his second wife, Krystle (Linda Evans). Both women were extraordinarily beautiful and wore designer clothes, but neither was young, making them unusual protagonists for a soap opera. They were outstanding in their strong, goal-oriented personas. Alexis, introduced in the second season to win more viewers, was Dynasty's equivalent of Dallas's J. R. Ewing, a ruthless character whom fans loved to hate. She was a businesswoman at a time when women were still dreaming about cracking the glass ceiling. Krystle, a more traditional character, acted as the moral center of the drama. A physical fight between the two women in a lily pond was a highlight of the 1982-1983 season. The show's principal viewers were women, attracted by the glamorous lifestyles portrayed, and gay men, lured by both the campy style of the program and the story line concerning Carrington's gay son Steven. Dynasty was the first prime-time network drama to feature an openly gay major character. Dynasty premiered as a three-hour movie in January, 1981; as a prime-time series, it quickly climbed in the ratings once Collins joined the cast. For the 1984-1985 season, it was the top-rated show in the United States. The cliffhanger that season, a deadly wedding in Moldavia, ended with nearly every character on the show being caught in a hail of automatic
gunfire. Viewers were in such suspense over the summer that the first episode of the next season made national news broadcasts, which showed Dynasty fans gathering to watch and discover which characters had survived.
Dynasty creator Aaron Spelling is flanked by two of the show's stars, Linda Evans, left, and Joan Collins, at a party in 1984. (AP/Wide World Photos)
Impact Centered on the lives of the extremely wealthy, Dynasty was a paean to consumerism. Not only did it display opulence—from the eighty-four-room Carrington mansion to the Nolan Miller-designed gowns of the main female characters—but it also created a market for luxuries reminiscent of those enjoyed on the show, and for the lifestyle for which those luxuries stood. Fans' fascination with the goods displayed on the show led to the creation of the Dynasty collection of products, ranging from clothes to linens to fragrances. The show has been seen as emblematic of the Ronald Reagan era, which came to be known for its extravagance. The duration of the program, moreover, corresponded quite closely with that of the Reagan administration. Dynasty began broadcasting during the first Reagan inaugural. It became a major hit during his presidency and was canceled just months after he left the White House.
Subsequent Events After the show was abruptly canceled in 1989, several story lines were left unfinished. In 1991, ABC aired Dynasty: The Reunion, a two-part miniseries that wrapped up the loose ends. A television movie about the original program, Dynasty: The Making of a Guilty Pleasure, was aired in 2005.
Gripsrud, Jostein. The “Dynasty” Years. New York: Routledge, 1995. Shapiro, Esther. “Introduction.” Dynasty: The Authorized Biography of the Carringtons. Garden City, N.Y.: Doubleday, 1984. Marcia B. Dinneen See also
Dallas; Soap operas; Television.
E
■ École Polytechnique massacre The Event
The murders of fourteen women by a misogynistic twenty-five-year-old man Date December 6, 1989 Place École Polytechnique, University of Montreal, Montreal, Quebec
Marc Lépine engaged in a premeditated massacre of women college students because he believed them to be feminists. This gender-related attack profoundly influenced university students, especially women, as well as feminists of all ages across Canada. They demanded that measures be taken to improve the safety of Canadian women, including strengthening gun-control laws; some also spoke out against what they saw as the misogynistic culture that had produced Lépine.
The shooting spree at École Polytechnique, also called the Montreal Massacre, occurred at the end of the autumn semester, 1989, when Marc Lépine entered the school and began shooting female students and staff. Late in the afternoon of December 6, Lépine moved rapidly through the engineering building, finding young women and shooting them with a legally purchased semiautomatic rifle, or stabbing them with a hunting knife. In a grim parallel to gender-related killings in other parts of the world, Lépine targeted a fourth-year engineering class, forcing the men to leave the room, lining the women up against a wall, and executing them. As he killed these six female students, he ranted against "feminists," demonstrating his extreme hatred and resentment of women. Lépine eventually killed fourteen women: Geneviève Bergeron (aged twenty-one), Hélène Colgan (aged twenty-three), Nathalie Croteau (aged twenty-three), Barbara Daigneault (aged twenty-two), Anne-Marie Edward (aged twenty-one), Maud Haviernick (aged twenty-nine), Maryse Laganière (aged twenty-five), Maryse Leclair (aged twenty-three), Anne-Marie Lemay (aged twenty-seven), Sonia Pelletier (aged twenty-eight), Michèle Richard (aged twenty-one),
Annie St-Arneault (aged twenty-three), Annie Turcotte (aged twenty-one), and Barbara Klucznik Widajewicz (aged thirty-one). With the exception of Laganière, who was a member of the university’s staff, all these young women were students, most of them in the engineering department. In addition to murdering these students, Lépine injured approximately a dozen other individuals, including a few men, before finally turning his rifle on himself and committing suicide. The police arrived on the scene after Lépine was dead, prompting a reevaluation of police response protocols. Post-Massacre Events The high death toll of the massacre and the youth of Lépine’s victims shocked the Montreal community and all Canadians. It was the worst single-day massacre in Canadian history, a statistic that was soon noted by the national media. An additional issues of concern was the fact that the gunman had used a high-powered semiautomatic weapon that he had obtained legally after paying a paltry licensing fee. Women across the nation drew attention to Lépine’s hatred for women, especially feminists, as well as the ease with which he had carried out his attack given the delayed police response. All of these issues had significant implications for women’s safety in public settings. Media accounts of the massacre interpreted the event through two divergent lenses. Some reporters discussed it as a symptom of a larger social problem: They asserted that the cause of the massacre could be directly attributed to the fact that violence against women was still relatively socially acceptable in Canada, and they saw the crime as arising from systematic sexual inequities in Canadian society. Other writers for the Canadian mainstream media eschewed this approach: They focused instead on the mental health and psychology of the killer, suggesting that one pathological individual was solely responsible for the massacre. In this vein, many reporters wrote about Lépine’s unhappy childhood and the physical abuse he suffered during his first seven
308
■
The Eighties in America
École Polytechnique massacre
lence and of the ready availability of powerful weapons. It generated a national discussion about misogyny, safety, and gun control whose effects are still ongoing. Further Reading
A wounded shooting victim is wheeled out of the École Polytechnique in the wake of Marc Lépine’s rampage. (AP/Wide World Photos)
years of life at the hands of his Muslim Algerian father. Few journalists ventured beyond these two broad explanations or considered the connections between society and the individual. Despite these attempts at explanation, many members of the Canadian media and the general public had great difficulty understanding such a brutal crime. They could not satisfactorily explain Lépine’s motivation, nor could they comprehend how he had been able to spend twenty minutes hunting down students without any challenge from the police. Some media accounts blamed the school’s male students and suggested that they should have done something to protect their female classmates. These reports were extremely detrimental to the survivors, and after the massacre, several students committed suicide. Many Canadian students and others strove to create something positive after the event. A number of memorials were established, and groups were founded to work for increased gun safety in Canada, resulting in social changes in the 1990’s and later. Impact The Montreal Massacre was a bloody reminder to Canadians of women’s vulnerability to vio-
Adamson, Nancy, Linda Briskin, and Margaret McPhail. Feminist Organizing for Change: The Contemporary Women’s Movement in Canada. Toronto: Oxford University Press, 1988. A history of Canadian feminism that does an excellent job of summarizing the state of the movement at the time of the murders.
Eglin, Peter, and Stephen Hester. The Montreal Massacre: A Story of Membership Categorization Analysis. Waterloo, Ont.: Wilfrid Laurier University Press, 2003. An ethnomethodological analysis of media accounts of the massacre, focusing on the categories employed in those accounts, such as “feminism” and “women.”
Malette, Louise, and Marie Chalouh, eds. The Montreal Massacre. Translated by Marlene Wildeman. Charlottetown, P.E.I.: Gynergy Books, 1991. Documentation of French Canadian reactions to the murders, including newspaper articles and letters to the editor; includes brief biographical information on the fourteen murdered women.
Nelson, Adie, and Barrie W. Robinson, eds. Gender in Canada. 2d ed. Toronto: Prentice Hall, 2002. Presentation of a gendered understanding of Canada.
O’Donovan, Theresa. Rage and Resistance: A Theological Reflection on the Montreal Massacre. Waterloo, Ont.: Wilfrid Laurier University Press, 2007. An account, based in feminist theology, of one woman’s attempt to understand the murders of the Montreal women.
Rathjen, Heidi, and Charles Montpetit. December 6: From the Montreal Massacre to Gun Control, the Inside Story. Toronto: McClelland & Stewart, 1999. An account of the massacre and its aftermath and consequences from the perspective of surviving students at the University of Montreal.
Rosenberg, Sharon, and Roger I. Simon. “Beyond the Logic of Emblemization: Remembering and Learning from the Montreal Massacre.” Educational Theory 50, no. 2 (Spring, 2000): 133-156. Study of individual recollections and understanding of the massacre several years after the event.
Wilson, I. P. (Trish). “Reading the ‘Montreal Massacre’: Idiosyncratic Insanity or the Misreading of Cultural Cues?” In Ethnographic Feminism: Essays in Anthropology, edited by Sally Cole and Lynne Phillips. Ottawa, Ont.: Carleton University Press, 1996. Feminist analysis of the Canadian media’s representation of the event and its aftereffects.
Susan J. Wurtzburg
See also Crime; Feminism; Post office shootings; Reagan assassination attempt; San Ysidro McDonald’s massacre; Sexual harassment; Stockton massacre; Women’s rights.
■ Economic Recovery Tax Act of 1981
Identification U.S. federal legislation
Date Signed into law on August 13, 1981
The Economic Recovery Tax Act of 1981 was an era-defining law promoted by the newly elected President Ronald Reagan that significantly reduced federal tax levels.
Ronald Reagan was elected president in November, 1980, promising significant tax cuts, and his campaign pledge was fulfilled less than eight months after he assumed office with the passage of the sweeping Economic Recovery Tax Act (ERTA, also known as the Kemp-Roth tax cut). Among the significant features of ERTA, personal income taxes were cut by around 23 percent over three years, they were indexed for inflation, and the top marginal tax bracket was reduced from 70 percent to 50 percent. In addition, a generous set of depreciation schedules (called the Accelerated Cost Recovery System) was initiated for businesses. The tax cuts enacted by ERTA were the largest in history at the time and were guided by an unusually coherent political and economic ideology. The primary theoretical support for significant tax cuts was provided by the theory of supply-side economics, which was supported by President Reagan. Reagan argued that cutting personal income taxes would stimulate the economy, resulting in new jobs and other opportunities for Americans to create wealth. As a consequence, he believed, net governmental revenues would increase, because there would be more income to tax, despite the decrease in the rate of taxation. Economists overwhelmingly rejected this argument, and they were concerned that ERTA would significantly increase the national debt. Supply-side supporters, however, argued that budget deficits would not be a serious problem. Every House Republican stood behind ERTA, and a sizable group of conservative Democrats—dubbed Boll Weevils—voted with the Republicans in support of the tax cuts.
Impact Through the successful passage of ERTA, Reagan’s philosophy regarding the size and scope of government set the tone for U.S. budget politics into the twenty-first century. Supporters credited the tax cuts with spurring economic expansion throughout the 1980’s and with reducing the scope of the federal government. Critics, however, argued that the tax cuts came at the expense of large budget deficits and a constantly increasing national debt. It has been estimated that ERTA cost the federal government more than $2 trillion in lost revenue over the period 1982-1991, which—because spending did not decrease alongside revenue—led to the largest deficits in the country’s history. Between 1982 and 1989, the national debt almost tripled, going from $1.1 trillion to $2.9 trillion. Budget politics in the United States after 1981 focused largely on the annual deficits contributing to this debt. Those politics also became more contentious, with budgetary votes becoming increasingly partisan and divisive after 1981.
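The arithmetic behind the roughly 23 percent figure can be sketched as follows. Assuming the commonly cited phase-in schedule of a 5 percent rate reduction in the first year followed by 10 percent reductions in each of the next two years (a schedule this entry does not itself spell out), the compounded cut works out to

\[
1 - (1 - 0.05)(1 - 0.10)(1 - 0.10) = 1 - 0.7695 \approx 0.23,
\]

about 23 percent, rather than the 25 percent obtained by simply adding the three annual reductions.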
President Ronald Reagan signs the Economic Recovery Tax Act into law while vacationing in California on August 13, 1981. (AP/Wide World Photos)
ERTA also changed the perception of the budget priorities of the Republican Party. Prior to Reagan’s presidency, Republicans had historically pushed for tax cuts, but not at the expense of large budget deficits. As desirable as tax cuts were, balanced budgets were usually deemed more important. Beginning with the Reagan administration, however, the emphasis for many Republicans moved toward cutting taxes rather than reducing the deficit. This trend was greatly strengthened after George H. W. Bush’s presidency. Bush, concerned over the state of the debt, broke a campaign promise and raised taxes. He failed to win a second term in 1992.
Further Reading
Steinmo, Sven. Taxation and Democracy. New Haven, Conn.: Yale University Press, 1993.
Stockman, David. The Triumph of Politics. New York: Harper & Row, 1986.
White, Joseph, and Aaron Wildavsky. The Deficit and the Public Interest: The Search for Responsible Budgeting in the 1980’s. Berkeley: University of California Press, 1989.
Patrick Fisher
See also Business and the economy in the United States; Congress, U.S.; Conservatism in U.S. politics; Reagan, Ronald; Reagan Democrats; Reagan Revolution; Reaganomics.
■ Education in Canada
Definition Policies, practices, and cultural trends affecting academic instruction in Canada, from preschool through graduate and professional schools
During the 1980’s, the policy of multiculturalism was implemented in the areas of curriculum and teacher training. Criteria for funding private schools were either maintained or amended as a result of increasing demands from the public, and community colleges were also challenged. Meanwhile, baccalaureate programs were instituted for the first time in British Columbia.
As the only industrialized democracy without a national office of education, Canada had ten provincial and three territorial systems of education. Although Canada was the first country in the world to adopt an official policy embracing multiculturalism in 1971, ten years later the Canadian Ethnic Studies Association suggested that the policy had not been translated into school curricula. As a result, in 1981 the association organized a national conference with the theme “De-mythifying the Minority Groups: Some Issues in School Curricula.” In the following year, Canada’s new constitution made it a multicultural state by virtue of Section 27 of the Canadian Charter of Rights and Freedoms.
Multiculturalism in Canadian Schools In response to the policy of multiculturalism, several provinces focused on eliminating prejudice and stereotyping from school curricula and textbooks. Regular review procedures were set up to screen the books to be included on authorized reading lists. Nova Scotia issued guidelines for curricula to reflect the experiences of its multiethnic, multicultural population. Newfoundland reorganized courses on literary heritage and required students to take a Newfoundland culture course. Positive developments also occurred in the areas of teacher preparation and training. In Newfoundland, multiculturalism was an integral part of both academic and methodological courses in social studies. Saskatchewan’s Teacher Bursary Program encouraged teachers to enroll in second-language instruction courses and to pursue multicultural studies. Alberta’s Intercultural Educational Program was designed to prepare middle-class teachers to deal with students of other classes. A program in British Columbia aimed at developing teachers’ cross-cultural communication skills; another, in Ontario, focused on awareness of cultural pluralism. In addition, local and provincial associations organized workshops, seminars, courses, and colloquia on multiculturalism. In the late 1980’s, two documents provided an official discourse on diversity. The first, Multiculturalism: Building the Canadian Mosaic (1987), was a product of the new policy of multiculturalism in a bilingual framework. This document defined the goals and approaches of the official multiculturalism policy and provided a basis for the second, the Canadian Multiculturalism Act of 1988.
Funding Private Schools Private schooling increased significantly in Canada beginning in the 1970’s. In the 1980’s, this trend continued, as parents who wanted to send their children to private school demanded more public money to subsidize that choice. Supporters of private schools argued that the right of parents to choose the kind of education their children would receive was a democratic right derived from the freedoms of religion and conscience, both of which were guaranteed by the Charter of Rights and Freedoms. Five provinces supported private schools: Quebec, Alberta, Saskatchewan, British Columbia, and Manitoba. Quebec was the province with the highest percentage of students in private schools and also with the most generous funding for private schools. Such Québécois private schools received funding based on their curricula: Schools whose curricula were similar to those of provincial public schools received a grant for each enrolled student equal to 80 percent of the average cost of educating a student. All the province’s other private schools received grants totaling 60 percent of the average cost per student. In Alberta, the Association of Independent Schools and Colleges of Alberta and the Edmonton Society for Christian Education were the most active organizations lobbying for private education. As a result of their efforts, grants to private schools increased from 33 percent to 75 percent of the average cost per student. In 1981, legislation passed making Albertan schools eligible for these grants after only a single year of operation. The previous requirement had been three years. In Saskatchewan, grants of 55 percent of the average cost per student were made to those private schools that followed provincial curriculum standards. Moreover, schools that had been in operation for at least five years could be reimbursed for up to 10 percent of their approved capital costs. Meanwhile, not satisfied with 30 percent grants and partial funding for operating costs and teachers’ salaries, the British Columbia Federation of Independent Schools Association pressed government and ministry officials for more funding.
The Community Colleges Established between the 1960’s and the 1970’s as an adaptation of the American college system, Canada’s community colleges had successfully provided many postsecondary students with a second chance at education, one of their original goals. The complex of social, economic, cultural, and political objectives that they had been designed to accomplish rendered Canadian community colleges unique. In the 1980’s, the community colleges faced a number of problems that needed to be addressed. First, they lacked their own measures of success, relying instead on more general and less tailored measurements that robbed them of a distinctive sense of identity and purpose. Second, many of the colleges’ policies and practices (regarding, for example, employment contracts and capital facilities) had become obstacles to change. Finally, there was a lack of a unifying goal among the community colleges across the nation. In the late 1980’s, a community college baccalaureate degree movement began in British Columbia that sought to increase access to conventional university programs. A government-appointed access committee recommended that British Columbia institute baccalaureate degree programs in densely populated regions outside Vancouver and Victoria. Accordingly, the province selected some of its two-year community colleges and converted them into four-year institutions. Accredited universities in turn partnered with the four-year community colleges to award baccalaureate degrees to their students.
Impact Canadian second-language and multicultural educational programs benefited during the 1980’s from a significant growth in the amount of both teacher training and curricular development designed to support these programs. By the end of the decade, it was clear that the educational establishment, if not the majority of Canadians, had embraced policies of multiculturalism. Public schools were expected to reexamine their cultural roles and their programs and to foster this new understanding of Canadian identity. Private schools across Canada kept lobbying for more public aid in those provinces that had already granted some aid. The combined trends of multiculturalism and public grants for private education led some to predict that families whose linguistic, ethnic, cultural, or educational interests were not yet served by Canadian schools would emerge to demand support. Meanwhile, the problems faced by community colleges provided a challenging test in educational leadership. British Columbia provided a model for other jurisdictions, should they wish to consider introducing community college baccalaureate programs into their own postsecondary educational systems.
Further Reading
Axelrod, Paul. The Promise of Schooling: Education in Canada, 1800-1914. Toronto: University of Toronto Press, 1997. Explores the social context in which educational policy was formed and implemented; traces the Canadian school experience through common patterns of development.
_______. Values in Conflict: The University, the Marketplace, and the Trials of Liberal Education. Montreal: McGill-Queen’s University Press, 2002. Argues that in the race for riches, schools and universities are forced by government policy to narrow their educational vistas.
Manzer, Ronald. Public Schools and Political Ideas: Canadian Public Educational Policy in Historical Perspective. Toronto: University of Toronto Press, 1994. Argues that educational politics and policies are constituted by consensus and difference in an ongoing dialogue about political principles.
Anh Tran
See also Canadian Charter of Rights and Freedoms; Minorities in Canada; Multiculturalism in education.
■ Education in the United States
Definition Policies, practices, and cultural trends affecting academic instruction in the United States, from preschool through graduate and professional schools
Educational attainment continued to rise in the United States during the 1980’s, although falling birthrates in earlier years meant that school enrollments dropped. Despite this rising attainment, it was a decade of concern over school achievement. That concern was one of the reasons that, although the decade began amid political beliefs that federal involvement in education should be lessened, the 1980’s saw major federal educational initiatives.
Over the course of the twentieth century, U.S. education levels increased significantly. During the 1980’s, this trend continued, as Americans were educated in higher numbers and at greater levels than in previous decades. The percentages of Americans with high school and college diplomas had increased, particularly after 1960, until, in 1980, 69 percent of adult Americans were high school graduates. A decade later, that figure had risen to 78 percent. Percentages of adult Americans who were college graduates also rose during the 1980’s, from 17 percent to more than 21 percent. While the average level of U.S. education continued to increase over the 1980’s, the number of enrolled students changed little. The unusually large baby-boom generation, born roughly between 1946 and 1964, had largely reached adulthood by 1980 and had begun to move out of most levels of the school system. The new generation was smaller, so there were fewer minors in need of education during the 1980’s. Indeed, the number of students enrolled in elementary and secondary schools actually decreased between 1980 and 1985, from 46,208,000 to 44,979,000. By 1990, enrollments had returned to just above the 1980 level, at 46,451,000. Most of the growth during the decade occurred among institutions of higher education, since the number of students enrolled beyond the high school level grew by more than 1.7 million, from 12,097,000 in 1980 to 13,819,000 in 1990.
Increasing Diversity Racial and ethnic minority students became a larger proportion of the U.S. student population during the 1980’s. According to U.S. census data, 73 percent of public school students were white non-Hispanics in 1980. Some 16 percent were African American, 9 percent were Hispanic, and 2 percent were classified as “other.” A decade later, the percentage of white non-Hispanic public school students had fallen to just over 67 percent, while 17 percent of students were African American, 12 percent were Hispanic, and 4 percent were “other.” The notable increase in Hispanic students and students of other races (primarily Asians) was largely due to an increase in the enrollment of students who were immigrants or children of immigrants. Efforts to desegregate schools continued in the 1980’s, but there were fewer new incidents of court-ordered busing to achieve racial desegregation than there had been in the 1970’s. In addition, more school districts that had been under desegregation orders were declared unitary, meaning that they had eliminated the vestiges of segregation and were no longer under court control. In a case that was historically notable, the parents of seventeen children in Topeka, Kansas, in 1979 petitioned the courts to reopen the 1954 Brown v. Board of Education case, on the grounds that the Topeka school board had not yet eliminated the heritage of discrimination that had been that case’s impetus. In April, 1987, U.S. District Judge Richard Rogers closed this chapter of American educational history when he ruled that Topeka had ended school segregation.
The Department of Education For much of American history, elementary and secondary education has been seen as mainly the responsibility of local communities, with some support from the states. In the wake of World War II and the G.I. Bill, the federal government began to become more active in educational policy and funding, and in 1953, the Department of Health, Education, and Welfare became a cabinet-level department of the government’s executive branch. By the end of the 1970’s, federal educational policy had become such a priority that Congress decided to create a department devoted solely to education. Signed into law by President Jimmy Carter in October, 1979, Congress’s act split the Department of Health, Education, and Welfare into two new departments: the Department of Health and Human Services and the Department of Education. The Education Department began operating in May, 1980, under the leadership of Secretary Shirley Hufstedler, appointed by President Carter. President Ronald Reagan inherited the Department of Education when he took office in 1981. The Republican president had opposed what he and many others in his party saw as excessive federal intervention in local affairs, and President Reagan suggested that he would abolish the new department. When Reagan appointed Terrell Bell as the new secretary of education in 1981, it was widely expected that Bell would simply oversee the department’s destruction. Democrats, however, still controlled the House of Representatives throughout the 1980’s, and to abolish a cabinet-level department requires an act of Congress. Partly as a result, the Department of Education survived the decade. The federal government did spend less money on education during the two Republican administrations of the 1980’s. The real value of federal spending on education declined slightly between 1980, when the government spent $78.4 billion in 1999 dollars, and 1989, when it spent $77.5 billion in 1999 dollars. Budgeted spending on elementary and secondary education decreased during that period from $31.9 billion to $25.8 billion in 1999 dollars. Budgeted spending on postsecondary education decreased by an even greater percentage, from $22.2 billion to $17.3 billion, again in 1999 dollars. In 1985, President Reagan appointed a successor to Bell, William Bennett, who served as secretary of education until 1988. A social conservative, Secretary Bennett opposed many of the forms of multicultural education that had spread through U.S. schools, and he argued in favor of curricula rooted in the classics of Western civilization. Bennett also encouraged the teaching of moral principles, and he tried to convince American colleges to control drug use on campuses. Lauro F. Cavazos followed Bennett as secretary of education. Appointed by President Reagan in 1988, he continued to serve into the administration of President George H. W. Bush. Secretary Cavazos, the first Hispanic appointed to a cabinet post, was forced to resign in 1990 as a result of an investigation into his alleged improper use of frequent flier miles.
A Nation at Risk Although Terrell Bell had been expected to direct the dismantling of the Department of Education, his term of office saw a major new initiative in federal educational policy. In 1981, Bell convinced President Reagan to appoint a commission to study the quality and shortcomings of the American educational system. The National Commission on Excellence in Education published its findings as A Nation at Risk in 1983. This influential and widely cited volume maintained that the educational foundations of the nation were being eroded by mediocrity. A Nation at Risk stressed that if the American economy was to function in the information age, the workforce needed a basic education. The report further emphasized that a high, shared level of education was necessary for realizing American social and political ideals, and it expressed a commitment to enabling all Americans from all backgrounds fully to develop their abilities through schooling. It argued that all students should concentrate on the academic “basics” and that schools should ensure demonstrable mastery of these basics. Among other recommendations, the report suggested that all high school students receive four years of instruction in English, three years in mathematics, three years in science, three years in social science, and one-half year in computer science. During the rest of the decade, many states began to implement some of the recommendations put forward in A Nation at Risk, including more rigorous student assessment programs and graduation exit examinations. In response to the report’s recommendations for higher education, many universities also began raising entrance requirements and demanding that college applicants follow the strict state high school curriculum recommended by the report.
Educational Achievement Data on educational achievement during the 1980’s provide some support for the view that American schools were producing mediocre students, but they indicate relatively little improvement in this situation following A Nation at Risk. In 1980, when the National Assessment of Educational Progress (NAEP) measured student reading comprehension, it found that 68 percent of nine-year-old students were able to understand uncomplicated reading material, combine ideas from the material, and make inferences based on the material. By 1988, this percentage had actually gone down to 63 percent, and by 1990 it had reached 59 percent. Among thirteen-year-olds taking the 1980 NAEP, 61 percent were able to satisfy the next tier of reading comprehension skills: to search for specific information, interrelate ideas, and make generalizations about literature, science, and social studies materials. This percentage also decreased, to 59 percent in 1988 and 1990. Mathematics performance was somewhat more mixed. The percentage of nine-year-olds able to perform numerical operations and beginning problem solving increased from 19 percent in 1982 to 28 percent in 1990. However, the percentage of thirteen-year-olds functioning at the next level—that is, capable of moderately complex mathematical procedures and reasoning—remained constant at 17 percent during those years. Scholastic Aptitude Test (SAT) scores, often taken as a primary indicator of readiness for college, changed little throughout the 1980’s. The average score on the verbal part of the test was 424, out of a possible range from 200 to 800, in the 1980-1981 school year. Ten years later, the average score was the same. The average mathematics score in 1980-1981 was 467. This average increased by only 9 points, to 476, in 1990-1991. Racial and ethnic gaps in achievement tests continued to exist throughout the decade, perpetuating the debate over affirmative action in college admissions. In 1980-1981, the average score of white students on the verbal portion of the SAT test was 442, compared to 332 for African Americans, 373 for Mexican Americans, and 397 for Asian Americans. After a decade, the average verbal score remained the same for white students, while students of color realized minor gains, reaching averages of 352 for African Americans, 380 for Mexican Americans, and 410 for Asian Americans. On the mathematics part of the test, Asian Americans consistently outperformed other groups, while African Americans lagged behind. Average mathematics scores in 1980-1981 were 513 for Asian Americans, 483 for whites, 362 for African Americans, and 415 for Mexican Americans. By 1990-1991, Asian American scores had gone up to 525, white scores to 489, Mexican American scores to 429, and African American scores to 385. Concerns over both overall levels of achievement and continuing inequalities in achievement led to new national educational efforts at the end of the 1980’s.
Goals 2000 In 1989, the nation’s governors and President George H. W. Bush met at the National Education Summit. The president and the governors set forth six national goals for American schools, to be achieved by the year 2000. By that year, all children in America were supposed to start school ready to learn. The high school graduation rate was to increase to at least 90 percent. American students would leave grades four, eight, and twelve having demonstrated age-appropriate competency in challenging subject matter, including English, mathematics, science, history, and geography. U.S. students were to be first in the world in science and mathematics achievement. Every adult American would be literate and would possess the knowledge and skills to compete in the global economy and to exercise responsible citizenship. Every American school would be free of drugs and violence, and every school would offer a disciplined environment conducive to learning. The plan was not implemented, however, during President Bush’s term of office.
Impact The United States emerged from the 1980’s with a widespread belief in the need to improve and equalize the American educational system, combined with more conservative political perspectives than in the 1970’s. There was nearly universal agreement that American students were falling behind and that if this pattern continued it could harm nearly every aspect of American life. However, the proper way to address this looming crisis—and, in particular, the question of whether multiculturalism was part of the solution or part of the problem—remained a subject of sometimes rancorous debate.
Subsequent Events The Goals 2000: Educate America Act was finally signed into law in 1994 by President Bill Clinton, establishing a variety of agencies and mechanisms to realize its goals. The law was not successful enough to solve the educational problems of the nation, problems that continued into the twenty-first century.
Further Reading
Caldas, Stephen J., and Carl L. Bankston III. Forced to Fail: The Paradox of School Desegregation. New York: Praeger, 2005. Examines the consequences of the efforts to desegregate schools during the late twentieth and early twenty-first centuries. Contains case studies of school districts that include developments in school desegregation around the nation during the 1980’s and other decades.
Ravitch, Diane. Left Back: A Century of Failed School Reforms. New York: Simon & Schuster, 2000. Critical examination of attempts to reform American education that contrasts progressivist social approaches to education with more traditional academic approaches. The author was involved in the back-to-basics efforts of the 1980’s.
Reese, William J. America’s Public Schools: From the Common School to “No Child Left Behind.” Baltimore: Johns Hopkins University Press, 2005. An excellent general history of American public education.
Carl L. Bankston III
See also Bennett, William; Bush, George H. W.; Demographics of the United States; Drug Abuse Resistance Education (D.A.R.E.); Education in Canada; Magnet schools; Multiculturalism in education; Nation at Risk, A; National Education Summit of 1989; Reagan, Ronald; School vouchers debate.
■ El Niño
The Event Occasional disruption of the normal pattern of Pacific Ocean currents that alters weather patterns, especially in the Western Hemisphere
Date September, 1982-March, 1983
Place California, the Southwest, Hawaii, and the Mississippi Valley in the United States; Peru; Australia; and South Africa
The strong El Niño event of 1982-1983 caused severe storms and flooding in the United States and other nations with Pacific coasts. Americans alone suffered more than two billion dollars in storm damage. For the first time, scientists linked global climatic disasters to recurring temperature oscillations in the Pacific Ocean.
El Niño, a nickname for periodic aberrant weather on America’s Pacific coast, first came into prominence during the winter of 1982-1983, when a series of storms devastated the coast of California, Hawaii and U.S. Pacific territories were hit by cyclones, and southwestern states and states along the Mississippi River experienced massive flooding. The nickname, which means “the boy” in Spanish, was an allusion to the Christ child and was coined to refer to a warm oceanic current that appeared around Christmas off the coast of Peru at approximately four-year intervals.
El Niño as a Climatological Phenomenon Beginning in 1957, coordinated international collection of oceanographic data showed that El Niño was linked to the Southern Oscillation, a western Pacific phenomenon. In most years, an area of low pressure over Indonesia and North Australia brings high rainfall to that region. Moreover, under normal conditions, high pressure over eastern Polynesia blesses those islands with sunny, storm-free weather, and both the winds and the surface oceanic currents of the tropical Pacific move from east to west. During the Southern Oscillation, pressure over Indonesia rises, transpacific winds and currents weaken, and currents off the coast of South America reverse direction. In strong El Niño-Southern Oscillation (ENSO) events, the pressure differential between Indonesia and eastern Polynesia disappears. Winds and oceanic currents reverse direction, and the resulting climatic effects are propagated outside the tropics. Historically, ENSO events have occurred approximately once a decade. Strong El Niños in 1957-1958 and 1972-1973 generated international concern because of their devastating effects on Peruvian fisheries.
Effects of the 1982-1983 El Niño The 1982-1983 El Niño was the strongest on record. It began early in Indonesia, in the spring of 1982. Some climatologists believe that the eruption of the El Chichón volcano in Mexico, which spewed massive amounts of dust into the atmosphere, intensified the event. Australia, Indonesia, the Philippines, South India, and South Africa all experienced severe droughts. Cyclones ripped through Hawaii and French Polynesia, causing $280 million in damage. Warm ocean temperatures damaged coral reefs. Ocean levels rose by almost twenty inches in Peru and by eight inches in California. In Ecuador and Peru, floods and landslides left six hundred people dead; total damage, including fisheries, approached one billion dollars. North of the equator, Central America and Mexico experienced drought. Along the length of California, a series of violent Pacific storms smashed into the coast, flattening waterfront developments and causing massive erosion. Entire hillsides of luxury homes gave way and slid into the ocean. A tornado struck the Watts section of Los Angeles. Although only a few lives were lost, the damages totaled $1.1 billion. Unusually warm ocean water displaced fish northward, depressing fisheries in Oregon and Washington. In the southwestern United States, high rainfall and unusually warm temperatures caused a number of costly floods. Warm air over the Pacific deflected the jet stream northward, which in turn allowed warm, wet air to penetrate into much of the Mississippi drainage region. The combination of high winter rainfall and melting snow in the Rockies created massive flooding that left sixty-five people dead and caused $1.2 billion in damage. At crest, the Mississippi River nearly overwhelmed diversion works, sending overflow water into the Atchafalaya River, which would have given the Mississippi a new course and left New Orleans without a river.
During an El Niño storm on January 27, 1983, the Crystal Pier in San Diego, California, collapses under the onslaught of surging waves. (AP/Wide World Photos)
Impact The 1982-1983 El Niño is estimated to have caused a total of $8.7 billion worth of damage, including nearly $3 billion worth in the United States and its Pacific territories. The weather caused more than one thousand deaths, not including those deaths due to malnutrition and disease in drought-stricken and flooded areas. The weather disruption and the destruction it caused acted as an impetus to scientific research in oceanography, climatology, and the study of other low-frequency, extreme natural phenomena. Scientists believed that if they could understand weather patterns better, they could help prepare for future extreme events. Examining tree rings in the American Southwest and Peru, scientists were able to trace the four-year El Niño cycle of drought and abundant moisture and pinpoint several peak events, indicating that the cycle had been operating for at least a thousand years and that storms and floods like those of 1982-1983 can be expected to recur irregularly in the future. Cyclical natural disasters leave characteristic signatures, which trained surveyors can read if they know what to look for. The information thus gathered on El Niño was incorporated into building codes, enabling both private and public planners to avoid the highest-risk designs and locations for their structures.
Further Reading
Canby, Thomas Y. “El Niño’s Ill Wind.” National Geographic 165, no. 2 (February, 1984): 144-183. Spectacular photographs of a global disaster; clear explanation of the associated overall climate pattern.
D’Aleo, Joseph S., and Pamela G. Grube. Oryx Resource Guide to El Niño and La Niña. Westport, Conn.: Oryx Press, 2002. Comparisons of the 1982 event with a subsequent event in 1997; provides worldwide coverage of the effects of each.
Glynn, P. W., ed. Global Ecological Consequences of the 1982-83 El Niño-Southern Oscillation. New York: Elsevier, 1990. Collection of scholarly papers that emphasizes destruction of coral reefs and impact on fisheries.
Philander, S. George. Our Affair with El Niño: How We Transformed an Enchanting Peruvian Current into a Global Climate Hazard. Princeton, N.J.: Princeton University Press, 2004. Traces the historical development of understanding of El Niño; sociological in approach.
Ramage, Colin. “El Niño.” Scientific American 254, no. 6 (June, 1986): 76-84. Emphasizes the meteorological and oceanographic aspects of El Niño.
Martha A. Sherwood
See also Natural disasters; Science and technology.
■ Elections in Canada
The Event Canadian politicians run for office
Date February 18, 1980; September 4, 1984; and November 21, 1988
Three Canadian federal elections in the 1980’s distinctively and dramatically affected Canada, particularly in terms of its relationship with the United States. The Canadian election of 1980 brought Pierre Trudeau and his Liberal Party back to power after several months in opposition. The 1984 election would see the Liberals, then under the leadership of John Turner, decisively defeated by the Progressive Conservative Party and its leader, Brian Mulroney. Mulroney and the Conservatives would win a second victory in 1988 in an election fought over the issue of free trade with the United States.
The Election of 1980 In December, 1979, the government of Prime Minister Joe Clark lost a vote of no confidence in Parliament and was forced to call an election for the following February. The election quickly became a referendum on the competency of the Clark government, which had experienced a series of gaffes during its short tenure in power. Some 44 percent of voters opted for the Liberals, including 68 percent of voters in the province of Quebec, where the Liberals won seventy-four of seventy-five seats, enabling them to return Trudeau to the office of prime minister with a majority government.
The Election of 1984 The Liberals’ victory was soon overshadowed by high unemployment and inflation, spiraling government debt, and federal government policies—principally the National Energy Program (NEP)—that increasingly alienated western Canada. In February, 1984, Trudeau announced his retirement. His replacement was John Turner, who had been out of political office for a number of years while he awaited the end of Trudeau’s career. Turner’s political rustiness soon showed in a series of mistakes. More crucially, before leaving office Trudeau made a series of patronage appointments to reward Liberals. Turner found himself having to defend these appointments, along with the Liberals’ economic record, as he called an election for September 4, 1984. Electoral momentum soon swung to his chief opponents, the Progressive Conservatives under the leadership of a businessman from Quebec, Brian Mulroney, who campaigned on a platform of economic reform and better relations with the United States.
Results of Canadian Elections, 1980-1988

February 18, 1980
Incumbent prime minister and political party: Joe Clark, Progressive Conservative
New prime minister and political party: Pierre Trudeau, Liberal
New official opposition leader and party: Joe Clark, Progressive Conservative

Party                      Seats Won    % of Vote
Progressive Conservative   103          32.5
Liberal                    147          44.3
New Democratic Party       32           19.8
Social Credit              0            1.7
Other                      0            1.7
Totals                     282 seats    10,947,914 votes

September 4, 1984
Incumbent prime minister and political party: John Turner, Liberal
New prime minister and political party: Brian Mulroney, Progressive Conservative
New official opposition leader and party: John Turner, Liberal

Party                      Seats Won    % of Vote
Progressive Conservative   211          50.0
Liberal                    40           28.0
New Democratic Party       30           18.8
Social Credit              0            0.1
Other                      1            3.0
Totals                     282 seats    12,548,721 votes

November 21, 1988
Incumbent prime minister and political party: Brian Mulroney, Progressive Conservative
New prime minister and political party: Brian Mulroney, Progressive Conservative
New official opposition leader and party: John Turner, Liberal

Party                      Seats Won    % of Vote
Progressive Conservative   169          42.9
Liberal                    83           32.0
New Democratic Party       43           20.4
Social Credit              0            0.0
Other                      0            4.7
Totals                     295 seats    13,168,343 votes

Source: Political Database of the Americas.
The key turning point in the campaign was a leadership debate during which Mulroney decisively bested Turner over the patronage issue. The end result was a landslide victory for Mulroney: His party won 211 seats in the Canadian House of Commons, the most ever by a political party, and 50 percent of the popular vote. The Liberals were reduced to 40 seats, only 10 more than the third-place New Democratic Party.
The Election of 1988 Once in office, the Mulroney government made the critical decision to pursue a free trade agreement with U.S. president Ronald Reagan, a course of action that Mulroney had previously decried. A deal was finally reached in 1987, and, while it generated little reaction in Washington, in Canada a widespread outcry ensued over the agreement’s implications for Canadian sovereignty. Unable to get the legislation through the Canadian Senate, Mulroney opted to call a federal election for November 21, 1988. For the second time in Canadian history, a federal election was fought over the issue of free trade with the United States. The Liberals under John Turner quickly positioned themselves as the key opponents of the free trade agreement with the United States. A series of Liberal television commercials about the dangers of free trade struck a chord with Canadians, and the Liberals were soon ahead in public opinion polls. During the leadership debate, Mulroney and Turner clashed over the free trade issue, with each articulating his future vision for Canada while simultaneously invoking his patriotism. In the end, Liberal support proved short-lived. Enough voters swung back to the Conservatives to give them 43 percent of the votes and a majority government on election day, although the Liberals doubled their representation in Parliament. Free trade legislation was soon passed.
Impact All three elections of the 1980’s had a major impact on Canada. The 1980 election returned Trudeau and the Liberals to power, and Trudeau quickly defeated efforts at Québécois independence. However, the failure of his government’s economic policies set the stage for the massive election victory of Mulroney and the Progressive Conservatives in 1984. In turn, the arrival of Mulroney led to an effort at improving relations with the United States, principally through the pursuit of a free trade agreement with the Reagan administration. Once this agreement was reached, a further election was needed to return Mulroney to power and to allow his government to enact the requisite trade legislation. This legislation represented a historic shift for Canada, as it decisively moved toward greater economic integration with the United States.
From right: Pierre Trudeau, John Turner, and Turner’s wife, Geills, at the Liberal Leadership Convention in 1984. (Library and Archives Canada)
Further Reading
Bliss, Michael. Right Honourable Men: The Descent of Canadian Politics from Macdonald to Chrétien. Toronto: HarperCollins Canada, 2004. Collection of short biographies of Canadian prime ministers, including Clark, Trudeau, Turner, and Mulroney, all of whom were involved in elections in the 1980’s.
Clarkson, Stephen, and Christina McCall. Trudeau and Our Times. Vol. 2. Toronto: McClelland & Stewart, 1997. Detailed study of the political career of Pierre Trudeau, including his involvement in the 1980 election.
Granatstein, J. L. Yankee Go Home? Canadians and Anti-Americanism. Toronto: HarperCollins Canada, 1997. A history of Canadian anti-Americanism, with a detailed examination of the 1988 free trade election.
Lee, Robert Mason. One Hundred Monkeys: The Triumph of Popular Wisdom in Canadian Politics. Toronto: Macfarlane Walter & Ross, 1990. Journalistic account of the 1988 free trade election.
Sawatsky, John. Mulroney: The Politics of Ambition. Toronto: McClelland & Stewart, 1992. Biography of Prime Minister Brian Mulroney, including his 1984 and 1988 election victories.
Simpson, Jeffrey. Anxious Years: Politics in the Age of Mulroney and Chrétien. Toronto: Key Porter Books, 2002. A look by a Canadian journalist at Canadian politics in the 1980’s and 1990’s.
_______. Discipline of Power. Toronto: University of Toronto Press, 1980. Award-winning study of the short-lived government of Joe Clark and of Pierre Trudeau’s return to power in the 1980 election.
Thompson, John Herd, and Stephen J. Randall. Canada and the United States: Ambivalent Allies. Montreal: McGill-Queen’s University Press, 2002. A history of U.S.-Canadian relations, including the 1988 free trade election.
Steve Hewitt
See also Business and the economy in Canada; Canada and the British Commonwealth; Canada and the United States; Canada-United States Free Trade Agreement; Foreign policy of Canada; Inflation in Canada; Mulroney, Brian; National Energy Program (NEP); Quebec referendum of 1980; Reagan, Ronald; Trudeau, Pierre; Turner, John.
■ Elections in the United States, midterm
The Event Congressional elections
Date 1982 and 1986
The 1982 and 1986 midterm elections showed that the Democratic Party could maintain its hold on Congress, and they slowed the Republican surge in the South.
In 1980, Ronald Reagan had defeated incumbent president Jimmy Carter to become president of the United States. His landslide victory had in part been a result of so-called Reagan Democrats, traditionally left-leaning moderate voters who abandoned their normal party of choice to elect a conservative Republican. The election therefore signaled a significant shift in the balance of political power in the country. Nevertheless, the Democrats were able narrowly to maintain control of the House of Representatives, where the party had been in the majority since 1954. The 1980 election did end Democratic control of the Senate, and control of Congress therefore became split between the two parties.
1982 Midterms By 1982, as interest rates and unemployment rose and President Reagan’s popularity fell, Republican hopes of retaking the House of Representatives dimmed. The Democrats, led by House Speaker Tip O’Neill, ran a campaign that poked fun at the House Republicans’ strong support of an unpopular president, calling them “Reagan’s robots.” The name stuck, particularly in areas of high unemployment, as Reagan’s economic policies sought to wring inflation from the economy at the cost of millions of manufacturing jobs. Particularly vulnerable were the House’s freshman Republicans elected in 1980. Many had been elected as a reaction to the failure of Democrat Jimmy Carter’s administration, but as the economy worsened, Republicans were blamed for the poor economy as well. As a result, the Republicans lost twenty-six House seats, thirteen of which had belonged to first-term Republicans. The party’s distress was felt in eastern and midwestern industrial states, with heavy Republican losses in Pennsylvania and more Republican defeats in Illinois, Michigan, and Ohio, all of which suffered from high unemployment and declines in the auto and steel industries. Michigan’s unemployment rates were the highest in the country, and the Republican governor of Michigan was defeated in the election, as were the Republican governors of Ohio, Wisconsin, and Minnesota. Illinois saw two incumbents barely hold onto their seats: Two-term Republican governor James Thompson survived by a few thousand votes, while House Minority Leader Robert Michel was reelected by an even closer margin. Michel’s district included the Caterpillar Corporation, which teetered on bankruptcy, and he saw his usually Republican constituents flock to an unknown challenger, cutting his reelection margin to a few thousand votes. Democratic winners included Governor Michael Dukakis in Massachusetts and Governor Bill Clinton in Arkansas. In California, Republicans were able to maintain a Republican seat in the U.S. Senate with the election of Pete Wilson, and they picked up the governorship as well, as George Deukmejian was elected.
1986 Midterms The 1986 midterm elections saw the Democrats at a disadvantage, as President Reagan’s popularity hovered near 60 percent. However, the Republican Party faced several difficulties in capitalizing on its advantage. The party had picked up twelve Senate seats in 1980, giving it a majority, but those seats now needed to be defended from Democratic challengers. Because the Republicans needed to defend so many seats, they could devote fewer resources to each one, and many of those seats belonged to freshman senators, who were particularly vulnerable. Complicating matters further, many of the seats that the Republicans needed to defend were also in the traditionally Democratic South. If southern voters returned to their Democratic roots, Republicans would lose control of the Senate. The Republicans therefore sought to nationalize the midterms, depending on the status of the president and the party generally to carry contests in particular states. Incumbent Republicans focused on Reagan’s popularity and his support for the Strategic Defense Initiative (SDI). A few weeks before the election, Reagan met with Soviet leader Mikhail Gorbachev and refused to surrender the SDI program in exchange for large cuts in the number of nuclear weapons deployed by each country. His decision to stand up to the Soviets became a rallying cry for his candidates. Reagan’s diplomacy and personal popularity did not transfer to his party’s Senate candidates, however. Instead, Democrats picked off eight first-term senators from the South and Midwest. Defeated Republican incumbents included Mark Andrews of North Dakota and James Abdnor of South Dakota, both states suffering from a poor farm economy. In the South, Reagan Democrats returned to their habitual party affiliation, while African Americans voted overwhelmingly for the Democrats. In North Carolina, incumbent Republican senator John East committed suicide after being diagnosed with cancer. His replacement, James Broyhill, was appointed to his position in July, 1986, putting him at a fundraising and name-recognition disadvantage. Georgia’s first Republican senator since Reconstruction, Mack Mattingly, was defeated when Democrats returned to the fold and elected Representative Wyche Fowler to the upper house. Three other senators, Slade Gorton of Washington, Paula Hawkins of Florida, and Jeremiah Denton of Alabama, were unable to gain a second term. Hawkins was defeated by popular governor Bob Graham, while Denton ran mainly on his eight years as a prisoner of war in Vietnam rather than on his legislative record. Republicans did pick up one seat, winning in Missouri, where Senator Thomas Eagleton, a 1972 vice presidential candidate, had retired. In Arizona, Senator Barry Goldwater’s retirement left an empty seat, which was filled by John McCain. The Democrats thus realized a net gain of eight Senate seats, giving them fifty-five seats and control of the chamber. In the House, the Democrats picked up five seats, with a few notable names winning and losing. In Georgia, the actor Ben Jones, who played Cooter on The Dukes of Hazzard, was unable to unseat a Republican incumbent. In Iowa, however, The Love Boat’s Fred Grandy (who played the purser, Gopher) won a seat.
Impact The 1982 midterm results slowed the Reagan Revolution, limiting the domestic programs that Reagan could cut, while placing additional pressure on his administration to improve the economy. The 1986 election ended Republican control of the Senate. It placed the Democrats in control of domestic politics until 1988, and it also led to the defeat of conservative jurist Robert Bork’s nomination to the Supreme Court.
Further Reading
Farrell, John. Tip O'Neill and the Democratic Century. New York: Random House, 2002. Detailed examination of House Speaker Tip O'Neill's battles during his thirty-year career, including his political fights with President Ronald Reagan.
Gould, Lewis. Grand Old Party: A History of the Republicans. New York: Random House, 2003. Wide-ranging look at the creation of the party, its dominance in the latter half of the nineteenth century, its fall, and its revival under Reagan.
Reeves, Richard. President Reagan. New York: Simon & Schuster, 2005. Critical biography of the president, covering his administration, achievements, and failures.
Witcover, Jules. Party of the People: A History of the Democrats. New York: Random House, 2003. Starting with Thomas Jefferson and ending with Bill Clinton, this book looks at the ups and downs of the oldest American political party.
Douglas Clouatre
See also
Business and the economy in the United States; Congress, U.S.; Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988; O'Neill, Tip; Reagan Democrats; Reagan Revolution; Strategic Defense Initiative (SDI); Unemployment in the United States.
■ Elections in the United States, 1980
The Event  American politicians run for office
Date  November 4, 1980
The 1980 elections brought about what came to be called the "Reagan Revolution." Republican Ronald Reagan defeated incumbent Jimmy Carter to become president of the United States, and his party gained control of the Senate for the first time since 1954. The Democrats retained control of the House of Representatives, but the size of their majority was decreased.

Early in 1980, the Democratic presidential administration of Jimmy Carter appeared to be beleaguered on all fronts: domestic, economic, and foreign. Carter had been elected in 1976 on a pledge to control inflation and revive the U.S. economy. In the eyes of a large portion of the American public, he not only had failed to deliver on that promise but also had allowed the situation to worsen over time. In 1979, Americans were taken hostage at their embassy in Iran, and the Soviet Union invaded Afghanistan. Carter's ineffectual reactions to both events weakened his administration's credibility, and as the hostage crisis and the Soviet-Afghan conflict both stretched into 1980, Americans began to feel that he was impotent in the face of world crises and did not properly understand the gravity of the global threats facing the United States. Carter's problems involved not just a resurgent Republican Party, which seemed ready to leave behind the Watergate scandal and embark on a new direction, but also a determined challenge for his own party's nomination from Massachusetts senator Ted Kennedy. Kennedy had launched an aggressive media onslaught, branding the incumbent president as unelectable. He represented himself, by contrast, as able to recapture the lost legacy of his fallen brothers and to bring another era of Kennedy Camelot to the nation.

The Reagan Surge
Former California governor and film star Ronald Reagan had captured national attention as an attractive, articulate representative of the Republican Party's conservative wing. He had stood for his party's nomination on two prior occasions, in 1968 and 1976, only to fall short each time. In 1980, however, he seemed poised as the favorite to capture the prize that had eluded him. Only his age appeared as a potential obstacle: Reagan was sixty-nine years old, which meant that, if elected, he would be the oldest president to assume the office. To get that far, he first had to defeat the other candidates competing for the Republican nomination, including former director of central intelligence George H. W. Bush; two Illinois representatives, Philip Crane and John B. Anderson; former treasury secretary John B. Connally, Jr.; Senator Robert Dole of Kansas; and Tennessee senator Howard Baker.

John Sears, Reagan's first campaign manager, did not see the need to exert great effort against what seemed to be a weak field. He generally limited his candidate's appearances in Iowa, whose caucuses were the nation's first event of the primary election season and the first test of strength for anyone seeking a party's nomination. Thus, in Iowa, Bush, who had never won an election above the level of the U.S. House, stunned the nation by scoring an upset victory over Reagan. Abruptly abandoning Sears's strategy of aloofness, which did not suit his gregarious temperament in any case, Reagan embarked on an extensive series of stumping tours throughout New Hampshire, the next battleground in the primary process. Reagan won the New Hampshire primary on February 26, 1980. Shortly thereafter, Sears was replaced by the more aggressive William J. Casey. From that point on, Bush rapidly lost steam, especially after Reagan bested him in a televised debate and forged well ahead of the pack. Only Bush and Anderson could pose even a mild threat to Reagan's chances for the nomination. At the Republican National Convention in Detroit (July 15-18), Reagan
was nominated on the first ballot with 1,939 delegate votes. Anderson received 37 votes, and Bush got 13.

Reagan then had to choose a vice presidential running mate, a choice he delayed until the last minute. Former president Gerald Ford of Michigan was initially placed at the top of the list. It was reasoned that Ford could provide the unifying link between the moderate and conservative wings of the party and help Reagan secure victories in certain significant midwestern states. Reagan attempted to convince Ford to run with him, but Ford asked for too much in return: He wanted to choose who would fill several significant cabinet posts. Bush, Reagan's second choice, was offered the chance to run for vice president and accepted. The results of the convention and the party's acceptance of Reagan's conservative agenda did not satisfy John Anderson, who, as one of the last stalwarts of the Republicans' liberal wing, bolted the party in disgust and continued to run for president as an independent, ultimately garnering what was, for an independent candidate, a significant share of the vote.

Kennedy Versus the President
Most polls indicated that Ted Kennedy held a substantial lead over President Carter when the senator officially entered the race in November of 1979. In the long run, though, Kennedy's past, particularly the controversial Chappaquiddick incident of 1969, was dredged back up, damaging his credibility in the eyes of many Democratic voters. Carter thus regrouped to win the Iowa caucuses and most of the early primaries. Continued domestic unease and foreign policy disasters eroded Carter's support, however, and Kennedy was able to
win the New York primary. Thus, when the Democratic National Convention was held in New York City from August 10 to 14, the party was bitterly divided between the incumbent president and the senator from Massachusetts. With Kennedy behind in the delegate count, his supporters proposed an "open convention" rule, whereby delegates would be released from their obligation to cast their first-ballot vote for the candidate who had won their state's primary election. Such a rule would greatly have enhanced Kennedy's chances of winning the party's nomination. The motion was defeated, however, by a vote of 1,936 to 1,390. Carter was subsequently nominated with 2,129.02 votes to Kennedy's 1,150.48, but he had won at the cost of party unity. Walter Mondale was once again nominated as the vice presidential candidate.

"The Great Communicator's" Victory
Almost immediately, the upbeat tone of the Reagan campaign struck a chord with rank-and-file voters. Reagan advocated implementing a radical conservative economic program (referred to by Bush during the primary contest as "voodoo economics" and later dubbed "Reaganomics"), significantly limiting the scope of federal power, and taking a tough foreign policy stance against communism. He moved rapidly ahead in the polls. Vulnerable on both the foreign and the domestic fronts, Carter could only mount a negative campaign, denigrating conservative Republican ideas as out of touch with the American mainstream and as potentially dangerous insofar as they could provoke nuclear conflict. This approach did not resonate with many voters, who
1980 U.S. Presidential Election Results

Presidential       Vice Presidential    Political      Popular       % of Popular   Electoral   % of Electoral
Candidate          Candidate            Party          Vote          Vote           Vote        Vote
Ronald Reagan      George H. W. Bush    Republican     43,903,230    50.75          489         90.89
Jimmy Carter       Walter Mondale       Democratic     35,480,115    41.01           49          9.11
John Anderson      Patrick Lucey        Independent     5,719,850     6.61            0          0
Edward Clark       David Koch           Libertarian       921,128     1.06            0          0
Barry Commoner     LaDonna Harris       Citizens          233,052     0.27            0          0
Other                                                     252,303     0.29            0          0

Source: Dave Leip's Atlas of U.S. Presidential Elections.
Presidential candidate Ronald Reagan, center right, campaigns with his wife, Nancy, in Columbia, South Carolina, on October 10, 1980. (Courtesy, Ronald Reagan Library/NARA)
were left with the impression that the president had been reduced to desperately flailing away at his opponent. Carter's initial reluctance to debate also cost him: In effect, he boycotted the September 21, 1980, debate in Baltimore, Maryland, by refusing to participate if independent candidate Anderson was allowed to appear. As a result, the first debate pitted Anderson against Reagan. Reagan coolly dominated the debate, decimating any hope that Anderson might have had for victory, and surged further ahead in the polls. Reagan agreed to debate Carter one-on-one in Cleveland, Ohio, on October 28. Again, Reagan proved to be the better speaker, earning his nickname, the Great Communicator. He managed from the start both to aggrandize his own positions and to belittle Carter, who was unable to retake the initiative. Reagan projected an image of unflappable, calm reassurance. Carter, on the other hand, seemed angry and erratic.

It soon became clear that Carter had only one hope of victory, which was nicknamed by Reagan's camp the "October surprise." If the Iranian hostages were to be released in the final days before the election, it was generally believed that the resulting swell of good feeling for the Carter administration could easily have produced enough of a momentum shift to deliver Carter a second term. In the event, however, the hostages remained in captivity through election day, and Reagan won a landslide victory over the incumbent, taking the electoral vote 489 to 49. The popular vote tallied 43,903,230 for Reagan and 35,480,115 for Carter. John Anderson received 5,719,850 popular votes and no electoral votes. In the end, though Anderson was considered to have performed well for an independent candidate, he did not do well enough to affect the final outcome. The Libertarian Party candidate, Ed Clark, received 921,128 votes, and Barry Commoner of the Citizens Party received 233,052 votes.

Congressional Elections
Republican advances in the congressional elections were no less spectacular: The party gained twelve seats in the Senate, for a total of fifty-three seats to the Democrats' forty-six. Republicans held the Senate majority for the first time in twenty-six years. Incumbent Democrats fell like cut wheat, including some who had held significant leadership positions. The new Republican senators included Jeremiah Denton of Alabama, Frank H. Murkowski of Alaska, Paula Hawkins of Florida, Mack Mattingly of Georgia (who had narrowly defeated the long-serving Herman Talmadge), Steven D. Symms of Idaho, Dan Quayle of Indiana, Charles E. Grassley of Iowa, Warren B. Rudman of New Hampshire, John P. East of North Carolina, James Abdnor of South Dakota, Slade Gorton of Washington, and Bob Kasten of Wisconsin. Abdnor defeated George McGovern, the 1972 Democratic presidential candidate, by nearly 19 percentage points; Quayle carried nearly 54 percent of the vote to oust veteran Birch Bayh; and Symms prevailed in a very close race over Frank Church. In the House of Representatives, the Republicans gained 35 seats. The Democrats retained their majority in the lower house, but their advantage was cut to 242 versus 192 Republicans and 1 independent representative.
Impact
The term "Reagan Revolution" was used to describe the emphatic turnaround that began with the 1980 elections. Dismissed by the majority of the American electorate as a fringe element in 1964, conservative Republicans attained the glow of respectability in 1980. Ronald Reagan and his brand of conservatism were to dominate the U.S. political scene throughout the 1980's and into the early 1990's. For the first time, ethnic and suburban blue-collar and professional workers, who traditionally voted heavily Democratic, migrated in significant numbers to the Republican fold. These new so-called Reagan Democrats would help elect Republican presidents and congressional candidates in 1984 and 1988 and would remain a potent political force through the rest of the twentieth century.

Further Reading
Carter, Jimmy. Keeping Faith: Memoirs of a President. Toronto: Bantam Books, 1982. Provides Carter's own account of the 1980 election and of his actions in office during the late 1970's.
Drew, Elizabeth. Portrait of an Election: The 1980 Presidential Campaign. New York: Simon & Schuster, 1981. Reasonably impartial rendering of the election, although the author expresses considerable disgust at the degree of negative campaigning and at the bitterness engendered by the Carter-Kennedy face-off.
Jordan, Hamilton. Crisis: The Last Year of the Carter Presidency. New York: G. P. Putnam's Sons, 1982. After-the-fact reminiscences of the White House chief of staff, written in diary form, in which he attempts to shed light on the many misfortunes that befell the Carter administration in 1980.
Ranney, Austin, ed. The American Elections of 1980. Washington, D.C.: American Enterprise Institute for Public Policy Research, 1981. Fine, detailed series of essays on nearly every facet of the elections. Contains excellent, relevant statistical tables.
Strober, Deborah Hart, and Gerald S. Strober. The Reagan Presidency: An Oral History of the Era. Rev. ed. Washington, D.C.: Brassey's, 2003. Utilizes interviews with family members, campaign aides, and other participants to paint a picture of the circumstances behind the nomination struggle, the selection of George H. W. Bush as vice presidential running mate, and the actual head-to-head campaign against Jimmy Carter.
Raymond Pierre Hylton
See also
Bush, George H. W.; Congress, U.S.; Conservatism in U.S. politics; Elections in the United States, midterm; Elections in the United States, 1984; Elections in the United States, 1988; Iranian hostage crisis; Liberalism in U.S. politics; Mondale, Walter; Reagan, Ronald; Reagan Democrats; Reagan Revolution; Reaganomics.
■ Elections in the United States, 1984
The Event  American politicians run for office
Date  November 6, 1984
Buoyed by a well-liked and superlatively charismatic incumbent president and by a generally upbeat economic climate in the nation, the Republican Party won its most one-sided presidential election victory of the twentieth century, easily reelecting Ronald Reagan over former vice president Walter Mondale. President Reagan's coattails, however, did not invariably extend to GOP congressional candidates: The margin of the Republicans' Senate majority was reduced by two, and the Democrats retained their dominant position in the House of Representatives, although their majority was reduced by sixteen.

Walter Mondale had received his political "baptism of fire" in 1948, as a twenty-year-old operative for Minnesotan Hubert Humphrey's successful senatorial campaign. He then went on to become a protégé of Humphrey, and both men embraced the liberal ideology of Franklin D. Roosevelt's New Deal, which was particularly popular in traditional Minnesota politics. By 1984, Mondale saw himself as the inheritor of Humphrey's political mantle, and he had an impressive résumé to back up that claim: He had served as Minnesota's attorney general from 1960 to 1964 and in the U.S. Senate, taking Humphrey's old seat, from 1964 to 1976. As vice president in the Jimmy Carter administration from 1977 to 1981, he had seen his party go down in flames during the 1980 election to the Republican ticket of Ronald Reagan and George H. W. Bush. This emphatic electoral rebuke and the success of the early Reagan years made Mondale all the more bent on securing the vindication of his political principles by avenging the Democratic Party's 1980 debacle.

A Republican "Coronation"
There seemed very little doubt that, as a supremely popular incumbent, President Reagan could have his party's nomination for the asking. In 1983, however, it was not clear that he would choose to seek it. Already the oldest serving president and having barely survived John Hinckley's 1981 assassination attempt, Reagan was under pressure from some family members to step down while he could still do so with dignity. His age was a potential campaign issue as well. Reagan believed, however, that the Reagan Revolution was not yet complete, and he had lingering reservations about Vice President Bush, who had been his rival for the 1980 Republican nomination and who was significantly more moderate than Reagan. As a result, the president decided to seek a second term. There was no serious challenge to his decision at the 1984 Republican National Convention in Dallas, Texas, on August 20-23, which more closely resembled a pageant than a convention. Reagan was unanimously renominated, and, although slight opposition to Bush was expressed, the vice president was renominated as well, with support from all but four of the delegates.

Mondale's Challenge
Without an incumbent seeking the nomination, the selection process for the Democrats was much more complicated. Although Mondale was considered the front-runner, he had to compete with several other candidates for the nomination, including John Glenn, the former astronaut then serving as senator from Ohio; former South Dakota senator and 1972 Democratic presidential candidate George McGovern; civil rights leader Jesse Jackson; Senator Gary Hart of Colorado; Senator Fritz Hollings of South Carolina; Senator Alan Cranston of California; and former Florida governor Reubin Askew. None except Jackson or perhaps Glenn seemed likely to enter the race already commanding a substantial base of voters.

In the Iowa caucuses, all went predictably: Mondale garnered 45 percent of the vote and emerged the clear winner. In the New Hampshire primary, however, Mondale was stunned by Gary Hart, who came out of nowhere to defeat him by 10 percentage points. Glenn finished a distant third, and Jackson a more distant fourth. The rest of the field achieved negligible results and was soon out of the picture. Glenn, who had been tabbed by some pollsters as Mondale's strongest rival, faltered and effectively withdrew after "Super Tuesday," a day when several important primary elections were held simultaneously. A rather subdued speaker, Glenn experienced persistent difficulties in communicating his message to the voters and was thus unable to motivate a substantial swing in his favor. Jackson waged a hard-hitting campaign; his revivalist style secured 485 delegates from the South. However, his surge was hampered when anti-Semitic remarks he had made in private were reported in the press. Jackson referred to Jews as "Hymies" and
1984 U.S. Presidential Election Results

Presidential       Vice Presidential    Political      Popular       % of Popular   Electoral   % of Electoral
Candidate          Candidate            Party          Vote          Vote           Vote        Vote
Ronald Reagan      George H. W. Bush    Republican     54,455,472    58.77          525         97.58
Walter Mondale     Geraldine Ferraro    Democratic     37,577,352    40.56           13          2.42
Other                                                     620,409     0.67            0          0

Source: Dave Leip's Atlas of U.S. Presidential Elections.
to New York City as "Hymietown." Because the cornerstone of Jackson's campaign was tolerance for diversity, advocated by his organization, the Rainbow Coalition, his perceived anti-Semitism seemed to represent the height of hypocrisy, and it cost him votes he could otherwise have won from the party's more progressive members. Moreover, the Democratic National Convention in San Francisco adopted a set of procedural rules that Jackson believed stacked the deck in Mondale's favor, eliminating what little chance he had of winning new delegates on the convention floor.

Hart, by contrast, proved a tenacious opponent for Mondale, because he was able to identify himself as representing a new, centrist movement within the party. He claimed that his ability to resist conventional leftist wisdom would strengthen the Democrats' ability to win the votes of moderates who had deserted them in 1980 to elect Reagan. Hart affected the Kennedy style and appearance and, indeed, presented himself as an updated version of John F. Kennedy, the youngest person to be elected president. However, the Mondale campaign was able to project an image of greater stability and substance in its candidate's program, and he rallied in the later primaries. At the Democratic National Convention, Mondale prevailed over Hart with 2,191 delegates to the latter's 1,200. Jackson retained his 485 southern delegates but could win no more.

Mondale's choice of a vice presidential running mate was a calculated gamble. Conventional political wisdom dictated that it should be someone who commanded a significant regional constituency that Mondale lacked, such as Colorado's Hart, respected Texas senator Lloyd Bentsen, or Governor Michael Dukakis of Massachusetts. Mondale, however, determined upon a more daring course of action: He considered running mates who could deliver a different kind of constituency based on their race, gender, or ethnicity, as well as sending a message of change, diversity, and inclusiveness that would appeal to a broad range of voters throughout the nation. Contenders for the vice presidential nomination therefore included Henry Cisneros, Hispanic mayor of San Antonio, Texas; San Francisco mayor Dianne Feinstein; Representative Geraldine Ferraro of New York; and Los Angeles's African American mayor, Tom Bradley. (Jackson seems never to have been considered.) Win or lose, Mondale believed, his choice would set a historical precedent and score a major coup against the Republicans, who were beginning to be criticized as the party of old white men.

Mondale ultimately chose Ferraro, who became the first female nominee for vice president of a major political party in the United States. She delivered an effective acceptance speech, and public-opinion polls indicated a surge in support for the Democratic candidates in the wake of their convention. The Mondale-Ferraro ticket polled at 48 percent, while Reagan and Bush trailed at 46 percent.

Debates and Disaster
The slight advantage enjoyed by the Democrats soon began to unravel. Mondale's attempt at candor, admitting that he intended to raise taxes while accusing the Republicans of hypocrisy on the issue, backfired, and the polls quickly turned against him. Mondale and Ferraro were almost immediately placed on the defensive by a persistent, negative attack campaign orchestrated by Republican political operative Lee Atwater. A rumor, which allegedly originated with Atwater, linked the Ferraro family to
organized crime, specifically the numbers racket. The vice presidential candidate was forced to spend the better part of a month responding to recurrent insinuations that she and her husband, John Zaccaro, had Mafia ties and that there were irregularities in their financial records. She devoted a press conference to refuting these allegations on August 21.

In the first of the televised presidential debates, held in Louisville, Kentucky, on October 7, 1984, Reagan, uncharacteristically, stumbled badly. At times he seemed lost, confused, and hesitant. Mondale, though far from overwhelmingly brilliant, was nonetheless the clear winner, and doubts were rekindled concerning Reagan's age. The vice presidential contenders emerged from their debate on October 11 with a virtual draw, assisting neither campaign. Bush came forward as a bland, even whiny, figure, while Ferraro accused Bush of patronizing her. Her belligerent response to the vice president caused some viewers to see her as defensive and abrasive.

The second presidential debate occurred on October 21, 1984, in Kansas City, Missouri. Reagan managed to recapture his stride. He expressed himself well and smoothly, and he even humorously defused worries about his age, saying, "I am not going to exploit, for political purposes, my opponent's youth and inexperience." Whatever chances the Mondale campaign might have had were scuttled.

Only the final tallies revealed the full extent to which the disaster predicted for the Democrats had been underestimated. Mondale lost every state in the union, with the sole exception of his home state of Minnesota, and he won that state only by a razor's-edge margin of 3,761 votes. The District of Columbia, with its predominantly African American population, remained firmly supportive of the Democratic Party. The district's three electoral votes, however,
President Ronald Reagan is sworn in for a second term of office by Chief Justice Warren Burger on January 21, 1985. (Courtesy, Ronald Reagan Library/NARA)
coupled with Minnesota’s ten, gave Mondale a bare 13 votes in the electoral college. Reagan had 525. The popular vote was 54,455,472 to 37,577,352 in Reagan’s favor; Libertarian Party standard-bearer David Bergland captured 228,111. The Mondale ticket lost among nearly every voting bloc, with the exception of African Americans, who held more tenaciously to Democratic candidates than ever before. Congressional Elections
The congressional elections were significantly less disastrous for the Democrats. They did lose a net sixteen House seats to the Republicans, but the Republicans had hoped that the landslide of support for the president would translate into a revolution in Congress as well, and that revolution failed to occur. The Democrats, despite their losses, maintained a 253-182 majority and continued to dominate all important committees and key chairs in the institution. They also managed to gain two seats in the Senate, narrowing the Republican majority in that chamber. Veteran Republican Charles Percy of Illinois, once considered a serious contender for the presidency, suffered a narrow, two-point defeat at the hands of Paul Simon. Another former presidential aspirant, Howard Baker of Tennessee, retired. In the contest to fill his empty seat, Democrat Al Gore, Jr., outpolled Republican Victor Ashe and independent Ed McAteer by 16 percentage points. Otherwise, the Senate's incumbents were nearly all reelected.
Impact Reagan’s campaign slogan, “Stay the Course,” turned the election into a referendum on the current state of the nation, and his landslide victory was represented as proof that Americans were happy with their leadership and the direction in which the country was headed. Mondale’s defeat, meanwhile, represented the defeat of New Deal politics and of the coalition built by Franklin D. Roosevelt in the 1930’s, which made the Democrats the dominant party for much of the period between 1932 and 1968. However, the New Deal was defeated as much by its acceptance as by its resistance. Certainly, Reagan never attempted to do away with Social Security, and in his speeches referencing Roosevelt, he claimed not to be the latter’s opponent but rather the authentic heir to his legacy. From another perspective, the 1984 election seems to have represented the high-water mark of the Reagan Revolution, when the message of na-
Elections in the United States, 1984
■
329
tional optimism and prosperity, articulated by a particularly effective spokesman, received an emphatic mandate from an unprecedented proportion of the electorate. It was also the year of greatest strength of the so-called Reagan Democrats, traditionally Democratic blue-collar and lower-middle-class voters who defected in large numbers from their traditional allegiance in order to support Republican candidates. Reagan’s second term would be marked by a decline in popularity and power in the wake of the administration’s embroilment in the Iran-Contra affair, and although Reagan’s ideological investments would continue to define his party and the nation for years to come, they would not in subsequent years be embraced by so sizable a majority of the electorate. Further Reading
Cannon, Lou. President Reagan: The Role of a Lifetime. New York: Simon & Schuster, 1991. Places the 1984 landslide in the context of the events surrounding it and, taking a generally critical view of the Reagan administration, sees the election (especially the Louisville and Kansas City debates) as the zenith of Reaganism, before a decline set in.
Ferraro, Geraldine, with Linda Bird Francke. Ferraro: My Story. New York: Bantam Books, 1985. Unique political testament, written shortly after the events and offering invaluable personal insight into the trials of campaigning and the traumatic impact that candidates' families often endure.
Goldman, Peter, et al. The Quest for the Presidency, 1984. New York: Bantam Books, 1985. Written by Newsweek journalists, this account attempts to ferret out some of the behind-the-scenes maneuvering and personalities that affected the election in 1983 and 1984.
Pemberton, William E. Exit with Honor: The Life and Presidency of Ronald Reagan. Armonk, N.Y.: M. E. Sharpe, 1997. Stresses the overcoming of Reagan's disastrous performance in the first debate as the key event of the 1984 presidential campaign.
Raymond Pierre Hylton
See also
Atwater, Lee; Bush, George H. W.; Congress, U.S.; Conservatism in U.S. politics; Dukakis, Michael; Elections in the United States, midterm; Elections in the United States, 1980; Elections in the
United States, 1988; Ferraro, Geraldine; Hart, Gary; Jackson, Jesse; Liberalism in U.S. politics; Mondale, Walter; Reagan, Ronald; Reagan Democrats; Reagan Revolution.
■ Elections in the United States, 1988
The Event  American politicians run for office
Date  November 8, 1988
Riding on the tide of outgoing president Ronald Reagan's legacy, and conducting a cutthroat campaign, Vice President George H. W. Bush led the Republican Party to an unexpectedly large margin of victory over Democratic candidate Michael Dukakis. However, Democratic candidates for Congress fared much better, and their party actually increased its numerical margin of control in both houses.

In the spring of 1987, the Democrats believed that there was reason for optimism about the following year. President Ronald Reagan, though still immensely popular with the American public, was presiding over a somewhat tarnished administration. The Iran-Contra affair had sent shockwaves through Washington; economic prosperity appeared to be threatened; and, as a two-term president, Reagan was barred from running again. None of the prospective successors, chief among them Vice President George H. W. Bush, Senator Bob Dole, former Delaware governor Pete du Pont, and Congressman Jack Kemp, had the affability, the ease in dealing with the public, or the sheer charismatic presence of the "Great Communicator."

The front-runner for the Democratic Party nomination, Senator Gary Hart of Colorado, had many of these "Reaganesque" qualities, and he consciously sought to tap into the "Camelot" aura of the Kennedys. He had presented himself to the voters in 1984 as a fresh wind of change, as opposed to the New Deal liberalism of former vice president Walter Mondale and other opponents. Though Hart had fallen shy of capturing the Democratic nomination, a number of political pundits believed that he could have given Reagan a run for his money had he, rather than Mondale, secured the party's nomination in 1984. In the spring of 1987, probably pitted against Bush (certainly a lesser opponent than Reagan), Hart was even projected as the most likely individual to become the forty-first
president. As expected, Hart declared his candidacy in April. His prospects for the presidency would change dramatically the following month. Rumors of Hart's marital infidelity had long been circulating, and on May 3, 1987, the Miami Herald broke a story about an affair he was allegedly carrying on at the time. A few days later, the newspaper published photographs of Hart on board the yacht Monkey Business with an attractive twenty-nine-year-old named Donna Rice seated on his lap. His support rapidly plunging, Hart withdrew from the running on May 8. The Donna Rice scandal ended whatever chances he had. Even though he put himself back in contention in December, he polled dismally in the New Hampshire primary. Unable to overcome the onus he had placed on himself, Hart permanently quit the field on March 8, 1988.

Battle of the "Seven Dwarfs"
Hart’s abrupt downfall left a crowded field of (except for one) littleknown and nondescript rivals who would be sardonically labeled the “seven dwarfs” by the news media: Massachusetts governor Michael Dukakis, former Arizona governor Bruce Babbitt, Senator Paul Simon from Illinois, Congressman Dick Gephardt from Missouri, Senator Joe Biden from Delaware, Senator Al Gore of Tennessee, and the Reverend Jesse Jackson. Many people found it difficult to see what rationale the media employed for placing the dynamic African American civil rights leader Jackson in this “seven dwarfs” grouping. Popular New York governor Mario Cuomo, whose name was constantly bandied around as a dark horse candidate, steadfastly refused to enter the race. Biden suffered a self-inflicted wound in the early going when it was alleged that he had plagiarized part of a speech by British Labour Party leader Neil Kinnock. In his defense, Biden’s supporters stated that he had merely done nothing more than inadvertently neglect to credit Kinnock on this particular occasion. However, Dukakis campaign strategist John Sasso and cohort Paul Tully spliced together a devastatingly embarrassing video of the incident and then saw that this was leaked to the press. Thus, the damage done to Biden’s campaign proved irreparable, and the Delaware senator stepped down. When the source of the video was ultimately revealed (the Gephardt camp had originally drawn the most suspicion), Dukakis was forced to drop Sasso and Tully from his campaign staff. (Sasso would quietly return to
the Dukakis fold in September.) The Iowa caucuses revealed that the initiative had passed, at least temporarily, to Gephardt, who outpolled his closest opponents, Simon and Dukakis. Gephardt's status as front-runner made him a target for his pursuers, who, joined by Gore, blasted his record in the media, effectively depicting the Missouri representative as "flip-flopping" in his opinions and voting record on major issues.

The New Hampshire primary marked a turning point for Dukakis, who campaigned on familiar ground and touted his record as the "Massachusetts miracle worker," claiming to have secured economic prosperity for his home state. "The Duke" finished first by a substantial margin, followed by Gephardt and Simon. From then on, Dukakis pressed his advantage, winning in Minnesota and South Dakota and then scoring decisively in what came to be termed the "Super Tuesday" primaries of March 8, 1988, where he secured the delegations of six states. Dukakis's momentum continued, and having prevailed in California and the larger midwestern and eastern states (with the exceptions of Illinois, which went to Simon, and Michigan, which threw its support to Jackson), he had locked up the nomination by the time the Democratic National Convention assembled in Atlanta on July 18-21. Gore had withdrawn; Simon, Gephardt, and Babbitt had virtually conceded, leaving Jackson, who had run very strongly in the South, as the only effective opponent. The convention nominated Dukakis with 2,687 votes to Jackson's 1,218.

Dukakis's selection of Texas senator Lloyd Bentsen as his vice presidential running mate stirred expressions of anger from Jackson's supporters, who had hoped that, as the second-place finisher, their candidate would become America's first African American vice presidential nominee for a major party. Charges of racism were even leveled against the Massachusetts governor, though the motivating factor, which paralleled the 1960 Kennedy-Johnson ticket, was the balancing of an eastern liberal candidate with a southern moderate, thereby increasing the chances of capturing Texas's substantial electoral votes.

The Republican Nomination
Vice President George H. W. Bush had to struggle to prove himself the heir to Reagan's mantle. Also in the running for the Republican Party nomination were Senator Bob Dole of Kansas, himself a vice presidential candidate on the losing Ford ticket in 1976; Congressman Jack Kemp of New York; evangelist Pat Robertson; former secretary of state Alexander M. Haig; and former Delaware governor Pete du Pont. Former Nevada senator Paul Laxalt, former secretary of defense Donald Rumsfeld, and White House chief of staff Howard Baker, after expressing mild interest, had speedily withdrawn.

Dole upset front-runner Bush in the Iowa caucuses, with Robertson taking second place ahead of the vice president. However, a strong campaign in New Hampshire (amply aided by the efforts of New Hampshire governor John H. Sununu) enabled the Bush campaign to shift into high gear and to do spectacularly well in the Super Tuesday primaries, carrying sixteen states, mainly in the South. Kemp shortly withdrew, and though Dole, who had become bitter over what he perceived as Bush's distortion of his Senate voting record on taxes, persevered almost to the end, Bush was unstoppable by the time the Republicans convened in New Orleans on August 15-18. Bush raised some eyebrows over the selection of Indiana senator Dan Quayle, whom many considered to be a
1988 U.S. Presidential Election Results

Presidential        Vice Presidential   Political      Popular       % of Popular   Electoral   % of Electoral
Candidate           Candidate           Party          Vote          Vote           Vote        Vote
George H. W. Bush   Dan Quayle          Republican     48,886,597    53.37          426         79.18
Michael Dukakis     Lloyd Bentsen       Democratic     41,809,476    45.65          111         20.63
Ron Paul            Andre Marrou        Libertarian       431,750     0.47            0          0
Other                                                     466,863     0.51            1          0.19

Source: Dave Leip's Atlas of U.S. Presidential Elections.
political "lightweight" and who would later become the butt of jokes belittling his intellectual capacities. Committing himself to the continuation of Reagan's conservative agenda with the long-remembered phrase "Read my lips: no new taxes," Bush began steadily to erode Dukakis's onetime 17 percent edge in the polls.

The Campaign
The Bush election team, directed by James Baker, with Lee Atwater as the tactical "point man," went on the offensive from the beginning and maintained an aggressive posture throughout the campaign. A relentless barrage of negative advertising, "press leaks," and rumor-mongering was leveled at Dukakis, in large part orchestrated by the hard-nosed operative Atwater. Early innuendos about Dukakis having had mental problems and about his wife, Katharine, having burned an American flag were successfully countered. More serious were the allegations (translated into "attack ads" in the media) that Dukakis was a reckless, free-spending ultraliberal; that he was "soft" on crime and opposed the death penalty; that he had vetoed a measure that would have mandated that the Pledge of Allegiance be recited in classrooms; and, in the most notorious and damaging of the ads, that the prison furlough program in Massachusetts had allowed a felon named William Horton, who had been convicted of murder, to escape to Maryland, where he committed assault and rape.

Dukakis's speaking style hurt his cause. To much of the public he came across as cold, unsympathetic, and uninspiring, never really mounting an effective counterattack and making his opponent seem less standoffish in comparison. The televised debates, which have often turned elections around, only strengthened the Republican ticket. Dukakis was given an edge over Bush in the first debate, and Bentsen embarrassed Quayle in the vice presidential debate with a memorable put-down over a putative comparison between the Indiana senator and the late president John F. Kennedy ("Senator, I served with Jack Kennedy . . . Jack Kennedy was a friend of mine. Senator, you're no Jack Kennedy"). The second Bush-Dukakis debate, however, spelled disaster for the Democrats. Asked a rather impertinent hypothetical question about whether he would favor the death penalty for a man who raped and murdered his wife, Dukakis answered in a clinical, unemotional manner that reinforced his characterization as aloof and unfeeling.
"Read My Lips"

In the following excerpt from George H. W. Bush's acceptance speech at the Republican National Convention, in August, 1988, the presidential nominee made a promise he would later come to regret:

I'm the one who will not raise taxes. My opponent says he'll raise them as a last resort, or a third resort. But when a politician talks like that, you know that's one resort he'll be checking into. My opponent won't rule out raising taxes. But I will. And the Congress will push me to raise taxes and I'll say no. And they'll push, and I'll say no, and they'll push again, and all I can say to them is "Read my lips: No new taxes."
The final balloting on November 8, 1988, revealed a crushing victory for the Republican presidential ticket: Bush and Quayle had prevailed by 426 to 111 votes in the electoral college, with a popular vote margin of 48,886,597 to 41,809,476. The most substantial third-party totals were compiled by the Libertarian Party (Ron Paul) at 431,750 and the New Alliance Party (Lenora Fulani) with 217,221.

Congressional Races
Bush's victory did not translate into Republican gains on the congressional level. In fact, in the races for seats in the House of Representatives, the Republicans registered a loss of two seats, and the Democrats strengthened their existing majority to 260-175. In the Senate, the Republicans dropped one seat, and the Democratic edge grew to 55-45. In a significant change, the long-serving and highly influential conservative southern Democrat John C. Stennis of Mississippi declined to run for another term, and Republican Trent Lott won the seat with 54.1 percent of the vote over Democrat Wayne Dowdy. Lawton Chiles, Florida's veteran Democratic senator, retired, and Republican Connie Mack III defeated Democrat Buddy MacKay for the vacated seat. Connecticut's Lowell P. Weicker, Jr., who had gained some note for his role in the 1973-1974 Watergate investigations, was bumped from the Senate by Democratic candidate Joseph Lieberman. Other incumbent Republicans who lost their seats included Chic Hecht of Nevada and David Karnes of Nebraska. Bentsen, simultaneously running to retain his Senate seat in Texas, easily defeated Republican Beau Boulter. The most one-sided contest was the race in Maine, where incumbent Democratic senator George J. Mitchell amassed nearly 82 percent of the vote over his Republican adversary, Jasper S. Wyman.

Impact
The elections evinced a continuing conservative slant among the American electorate, though there were certainly signs that the confidence the people had placed in Ronald Reagan and his agenda translated less effectively into support for anyone other than Reagan himself. It remains open to question how great a role Reagan's residual popularity, the effectiveness of the Bush-Atwater attack strategy, and Dukakis's inability to present himself as a realistic and effective alternative to Reaganism each played in the presidential election; certainly, all of these factors have to be taken into account. Bush's pledge to continue the Reagan program, and his subsequent failure to live up to some of its principles (particularly his campaign promise of "no new taxes"), would come back to haunt him in his unsuccessful 1992 bid for reelection against Bill Clinton.

Further Reading
Germond, Jack W., and Jules Witcover. Whose Broad Stripes and Bright Stars? The Trivial Pursuit of the Presidency, 1988. New York: Warner Books, 1989. Offers a reasonably impartial narrative based on the paraphrased interviews of one hundred participants in the 1988 campaigns.
Goldman, Peter, et al. The Quest for the Presidency: 1988. New York: Simon & Schuster, 1989. A detailed and often sarcastically witty account of the scandals, strategies, blunders, and personal traits of the various presidential contenders and the staffers who ran their campaigns.
Sheehy, Gail. Character: America's Search for Leadership. New York: William Morrow, 1988. Though written in 1988, before the election had been decided, the book probes the characters, motives, strengths, and weaknesses of the major candidates and ventures observations that later seemed prophetic.
Raymond Pierre Hylton
See also
Atwater, Lee; Bush, George H. W.; Congress, U.S.; Conservatism in U.S. politics; Dukakis, Michael; Elections in the United States, midterm;
Elections in the United States, 1980; Elections in the United States, 1984; Hart, Gary; Horton, William; Jackson, Jesse; Liberalism in U.S. politics; Quayle, Dan; Robertson, Pat; Sununu, John H.
■ Elway, John
Identification  Hall of Fame professional football player
Born  June 28, 1960; Port Angeles, Washington

A dominant force in football beginning in the 1980's, Elway is recognized as one of the game's greatest players.

John Elway played National Collegiate Athletic Association football as the quarterback for the Stanford Cardinal from 1979 to 1982. During that time, he passed for 9,349 yards and 77 touchdowns. Elway was a consensus college All-American in 1982. He was also an excellent baseball player and batted .361 for Stanford in 1982. Drafted by the New York Yankees in 1981, Elway played a season of minor-league ball in the Yankee farm system.

Elway was drafted into the National Football League (NFL) by the Baltimore Colts in 1983. Soon thereafter, he was traded to the Denver Broncos. During the 1986-1987 season, he led the Broncos to the Super Bowl. Although the Broncos lost, Elway threw for 304 yards and a touchdown. In the 1987-1988 campaign, Elway received the NFL Most Valuable Player (MVP) Award and again led the Broncos to the Super Bowl, where they were defeated by the Washington Redskins. Elway led the Broncos back to the Super Bowl again during the 1989-1990 season, where they were soundly defeated by the San Francisco 49ers. After three failed attempts, Elway would finally lead the Broncos to victory in the Super Bowls following the 1997 and 1998 seasons.

During his career with the Broncos from 1983 to 1998, Elway completed 4,123 passes, threw for 51,475 yards and 300 touchdowns, had a completion percentage of .569, and rushed for 3,407 yards and thirty-three touchdowns during regular-season play. During the playoffs, including five Super Bowls, he completed 355 passes, threw for 4,964 yards and twenty-seven touchdowns, had a completion percentage of .545, and rushed for 461 yards and six touchdowns. He was selected to play in the NFL Pro Bowl nine times.

Impact
John Elway is the only player in NFL history to throw for over 3,000 yards and rush for over 200
Denver Bronco John Elway looks for an open receiver downfield during the 1987 AFC Championship game against the Cleveland Browns. (AP/Wide World Photos)
yards in seven straight seasons (1985-1991). Known for his fourth-quarter heroics, he led the Broncos to forty-seven comeback victories and to 148 wins, the most by any quarterback. Elway is the only quarterback in the history of the NFL to start in five Super Bowls. He threw for 1,128 yards in those games, second only to Joe Montana's 1,142 yards in Super Bowl competition. Elway was inducted into the College Football Hall of Fame in 2000 and into the Pro Football Hall of Fame in 2004. In 2005, he was selected as one of the fifty greatest quarterbacks to play in the NFL, ranking third behind Johnny Unitas and Montana.

Further Reading
Latimer, Clay. John Elway: Armed and Dangerous. Lenexa, Kans.: Addax, 2002.
Rosato, Bob, and Clay Latimer. John Elway. Dallas: Beckett, 1999.
Alvin K. Benson
See also
Football; Jackson, Bo; Sports.
■ Empire Strikes Back, The
Identification  Science-fiction film
Director  Irvin Kershner (1923-    )
Authors  Story by George Lucas (1944-    ); screenplay by Lawrence Kasdan (1949-    ) and Leigh Brackett (1915-1978)
Date  Released on May 21, 1980
The Empire Strikes Back continued the massive success of the Star Wars franchise, establishing that the first film’s popularity had not been a fluke. It reinforced Hollywood trends that were just beginning to be established in reaction to Star Wars’s success, including a drive to create expensive, effects-driven spectacles and to exploit the merchandising opportunities such spectacles could generate. The Empire Strikes Back (1980) was the middle film of the first Star Wars trilogy (1977-1983). The first film, initially released as Star Wars and later retitled A New Hope, had become a massive hit in 1977-1978. It had remained in American first-run theaters for almost a year, eventually earning more than $215 million domestically and more than one-half billion dollars
worldwide in its initial release. Fans were well aware that Star Wars was the first film in a projected trilogy and that the trilogy itself was part of a larger saga that could encompass multiple trilogies. They awaited the next film eagerly.

The Star Wars films, conceived by filmmaker George Lucas, told of a repressive galactic empire and a group of rebels fighting to free the galaxy from tyranny and restore a democratic republic. They were shot in a style that intentionally alluded to the movie serials of the 1930's. Each episode of this "serial," however, was feature-length, and they were released years apart, rather than weekly. Lucas admitted to being influenced by Universal's Flash Gordon serials, even including a floating city in The Empire Strikes Back like the one depicted in one of those cliffhangers.

The Empire Strikes Back was another blockbuster, although it did not have the remarkable longevity of the first film. In addition to being a box-office success, the film won the 1981 Academy Award for Best Sound and the People's Choice Award for favorite movie. John Williams's score won the BAFTA Film, Golden Globe, and Grammy awards. The script won a Hugo Award at the annual World Science Fiction Convention for best dramatic presentation. As had been the case with the first Star Wars film, a radio drama featuring an expanded story line was heard for weeks on public radio in 1983. The screenplay was written by Leigh Brackett, who died of cancer while working on it, and modified and completed by Lawrence Kasdan. Brackett had been publishing science-fiction stories and novels since 1940, many of them "space operas" in the mold of Lucas's movie series.

Impact
Many critics found The Empire Strikes Back to be the strongest of all the Star Wars movies. Its predecessor in the series had set a new standard for movie special effects, and The Empire Strikes Back, released just three years later, represented another palpable advance. When the trilogy's final film, Return of the Jedi, was released in 1983, it too featured effects that dwarfed the achievement of the original. The three films together, then, not only transformed special effects but also gave the impression that the field was entering a phase of constant technical improvement, creating an audience expectation and demand for ever-more-impressive effects in each new major fantasy or science-fiction motion picture.

Before Star Wars, science-fiction films had not generally required large budgets, nor had they been
driven by the sort of hyper-realism such budgets could achieve. One notable exception had been Stanley Kubrick’s 1968 film 2001: A Space Odyssey, but the Star Wars films incorporated effects that were more dazzling (if less pointedly realist) than those of Kubrick’s work. Later science-fiction films could not achieve major success unless they too spent the money required to create top-quality special effects. The Star Wars films helped rejuvenate another science-fiction franchise. After enjoying an unexpected level of popularity in syndication, the Star Trek television series (1966-1969) had remained in limbo. The success of the Star Wars movies increased the television franchise’s popularity as well, creating the impetus for both feature films and new television series set in the Star Trek universe. Thus, even though the remaining Star Wars movies generated less enthusiasm than the first two, the success of The Empire Strikes Back as a sequel helped make science fiction a highly marketable genre, both in film and in popular culture generally. Further Reading
Arnold, Alan. Once Upon a Galaxy: A Journal of the Making of "The Empire Strikes Back." London: Sphere Books, 1980.
Bouzereau, Laurent, ed. Star Wars—The Annotated Screenplays: "Star Wars, a New Hope," "The Empire Strikes Back," "Return of the Jedi." New York: Ballantine Books, 1997.
Clute, John, and Peter Nicholls, eds. The Encyclopedia of Science Fiction. London: Little, Brown, 1993.
Paul Dellinger
See also
Action films; Blade Runner; E.T.: The Extra-Terrestrial; Film in the United States; Ford, Harrison; Science-fiction films; Sequels; Special effects.
■ Environmental movement
Definition  Activism on the part of various groups and individuals dedicated to protecting the environment
During the 1980’s, environmentalists continued to argue that Americans needed to adopt more responsible, sustainable lifestyles. However, the economic optimism and growth of the decade combined with the perception at the time that environmentalism was antibusiness and antiprosperity to weaken the mass appeal of the movement.
A 1980 Earth Day celebration in Lafayette Park, Washington, D.C. (AP/Wide World Photos)
Environmentalism had gotten off to a fast start in the 1970's, and environmental groups had registered several notable successes when Congress passed legislation to protect air and water quality and to deal with hazardous wastes. Gradually, however, a backlash had developed as the costs of environmental legislation began to be felt. Meanwhile, Ronald Reagan came to the presidency in 1981 determined to reduce governmental regulation, including environmental regulation.

The environmental movement thus faced several challenges during the 1980's. Some of the initial public enthusiasm for environmentalism had waned. The decade's two Republican administrations often opposed further environmental efforts and, at times, even sought to roll back some of the environmental legislation of the previous decade. Divisions concerning goals and tactics that had previously existed among environmentalists came into the open, making it difficult for various groups to maintain a united front. On the other hand, the open attack on the movement fostered by some political leaders led to a renewed sense of urgency among activists, who continued their efforts to preserve the environment.

The Election of Ronald Reagan
Overall, environmentalists could claim several successes during the 1970's, as Congress passed legislation to deal with various environmental problems. Although many Americans considered themselves environmentalists, most did not belong to organized environmental groups at the beginning of the decade. Nonetheless, the ten major environmental groups had gained members during the 1970's. This growth in membership began to pose problems for some organizations. As they gained members and financial resources, both from member dues and from foundation and corporate sponsorship, they also added
layers of bureaucracy, and in some cases the leadership became distant from the membership. The major groups each had specific agendas, such as land conservation for the Nature Conservancy or hiking-trail preservation for the Sierra Club, but they often shared issues in common as well. During most of the decade, the leadership of the major groups met to discuss those common issues.

President Reagan came to Washington mistrustful of environmentalists and their goals. He staffed his administration with similarly suspicious officials, including Secretary of the Interior James G. Watt. Watt had been tied to the Sagebrush Rebellion in the West, a movement that opposed federal ownership of western lands. During the 1970's, representatives from environmentalist groups had enjoyed easy access to executive agencies, but during the 1980's, it was corporate representatives who enjoyed this access, leaving environmentalists on the outside. Many Americans had also come to think that some federal environmental standards had gone too far, so the major environmental groups often faced difficulties in convincing people of the necessity of further environmental legislation.

Environmental groups spent much of the 1980's struggling to find a niche in Reagan's America. Often shut out from the corridors of power, they tried to find new means of influencing public policy. Ironically, many Americans still considered themselves environmentalists in theory, so they continued to join environmental groups. In practice, however, they were swayed by alarmist tales of businesses losing millions of dollars as a result of regulations designed to protect obscure species of insect. This struggle between a general commitment to the environment and specific opposition to what were portrayed as extremist environmental bills continued throughout the decade. However, enough people became disillusioned with what they saw as the antienvironmental stance of many Reagan officials to motivate significant continued support for proenvironment lobbyists.

Shortly after Reagan's inauguration, the leaders of nine of the major environmental groups—including the Sierra Club, the Wilderness Society, the Natural Resources Defense Council, and the National Wildlife Federation, among others—met in Washington. They were joined by Robert Allen of the Henry P. Kendall Foundation. The nine environmental organizations plus the foundation, known
collectively as the Group of Ten, had somewhat diverging agendas, but they all actively lobbied Congress on behalf of the environment, working through Washington's legislative and judicial machinery. As a result, groups that refused to engage in direct lobbying efforts did not join the Group of Ten. These included groups such as the Nature Conservancy, which was committed to maintaining an apolitical identity, as well as groups such as Greenpeace, which simply distrusted the federal government and therefore chose direct action over appeals to legislators. Although the formal structure of the Group of Ten was initially limited to the chief executives of each member organization, subordinates from within each group later met together. Over time, the umbrella group agreed on a common agenda acceptable to all; it was published in 1985 as An Environmental Agenda for the Future. Nevertheless, there remained some sharp policy differences among the leaders of the Group of Ten. For example, Jay Hair of the National Wildlife Federation was more positively inclined toward the Reagan administration than were the other leaders and often forced modifications in the group's policy statements. These differences continued to trouble the Group of Ten throughout the 1980's, and the group had broken up by the end of the decade, as some leaders wished to distance themselves from the more confrontational stances of the Sierra Club and Friends of the Earth.

The Institutionalization of Environmentalism
During the 1980’s, most of the major environmental groups expended a good deal of effort to expand their membership base. This effort was intended to build on the enthusiasm that many Americans had displayed for environmental causes in the 1970’s and to strengthen the groups’ lobbying clout in order to counteract the increasingly hostile political climate in Washington. In order to expand their base, environmental organizations utilized direct mail campaigns to acquire members and raise funds. Direct mail “prospecting” was expensive, so environmental groups were caught in a cycle of fund-raising, needing to solicit funds in order to maintain the very mailing apparatus that they used to solicit the funds. Several environmental groups also turned to corporate foundations to fund their programs. Money provided by such foundations often came with explicit or implicit strings attached, resulting in some
groups lessening their public criticism of corporate polluters. Some corporations with poor environmental records also saw gifts to environmental groups as a means to improve their tarnished reputations. For example, by the end of the decade, Waste Management, a company with numerous violations of Environmental Protection Agency regulations on its record, not only was paying the resulting fines but also was making substantial contributions to several environmental organizations. The company's president, Philip Rooney, would join the Audubon Society's board of directors in 1991. Other corporations, such as Exxon, also saw opportunities to improve their images by contributing to environmental organizations or by having executives join organizational boards. They purchased advertising to inform the public of their largesse and to tout their environmental credentials. These contributions did not turn environmental organizations into puppets of their corporate sponsors, but they did result in some organizations lessening their criticism of those sponsors.

With an increased emphasis on fund-raising and membership acquisition, some environmental organizations became more bureaucratic, employing larger staffs and requiring more money to pay their salaries. Some of the new employees were devoted to membership services, such as publishing slick magazines for members. Other staffers became part of the congressional lobbying effort that the environmental groups increasingly turned to in order to counter Reagan administration efforts to weaken existing environmental regulations. Some environmental groups also engaged in substantial public relations campaigns designed to maintain the environmental awareness of the American people. Some of these campaigns were devoted to improving the groups' own reputations in the face of conservative critics, who portrayed the groups as composed of wild-eyed radicals. The groups therefore sought to represent themselves as responsible critics of dangerous administration policies. An ideological tug-of-war existed, in which each side depicted itself as representing normal, mainstream attitudes and depicted its opponent as wildly out of touch with authentic American values. Although litigation had been used in the past by some environmental groups, such as the Sierra Club, an increasing number of organizations employed this tactic during the 1980's, as they tried to stop the Reagan administration's rollback of environmental regulations or to force the White House to take appropriate actions, such as cleaning up hazardous waste sites.

In many ways, in fact, the tactics of the major environmental organizations during the 1980's were those of mature political actors. These tactics emphasized working within the system, rather than engaging in direct action or making pointed criticisms of industry. Not all environmental groups chose this approach. Greenpeace, for example, continued its use of demonstrations and grassroots activism to try to influence public opinion and policy. During the decade, some grassroots movements also sprang up to emphasize racial or gendered aspects of environmental policies. In some areas, activists began to protest against the placement of waste dumps in African American or Native American communities. The mainstream environmental groups initially did little to help groups protesting environmental racism for fear of detracting from their project to secure the enforcement of existing legislation. By the end of the decade, however, some of the major groups with a significant history of grassroots activism—such as the Sierra Club—began to support the local groups and to adopt aspects of their agendas.

Mass Movement or Organized Institution?
During the 1970’s, there had been a significant grassroots environmental mass movement. It had not always been organized effectively, but it had involved a great many people, who had participated in an active and enthusiastic fashion. During the 1980’s, some of this enthusiasm was lost, as environmentalism became more institutionalized. Students who had supported environmental change in the 1970’s often turned their attention to striving for economic success in the 1980’s. In some cases, environmental groups expanded but at the cost of their sense of grassroots involvement, as decisions came to be made in their Washington offices, often with a look over the shoulder at their financial backers. Indeed, by the end of the decade, all but four of the Group of Ten had moved their headquarters to Washington, and two of the remainder were contemplating such a move. Although retaining some of their earlier missions, some of the mainstream environmental groups appeared little different from other trade associations centered in Washington. Also during the 1980’s, however, a philosophical movement with roots earlier in history began to become more influential. The movement, known as “deep ecology,” asserted that all species of life have
equal intrinsic value, and this view influenced the thinking of some environmental activists. Although deep ecology resonated with some would-be environmental activists, it also alienated many lower-class Americans who cared about the environment but not at the expense of their livelihood. When taken to logical extremes, the precepts of deep ecology could lead to massive plant closings and job losses for working-class people. Many working-class Americans saw deep ecology as an elitist movement. Considered alongside a 1980's intellectual movement that proposed to protect the environment by privatizing it, the development of deep ecology helps illustrate the intellectual fragmentation of environmentalism. This fragmentation in theory occurred at the same time some environmental organizations were experiencing practical failures in the face of White House opposition to their goals, as well as the complications attending the organizations' increasing bureaucratization. What had been a clearly definable movement at the beginning of the 1980's had become fragmented and, as some critics opined, seemed to have lost its way by the end of the decade. As a result, some supporters had become disillusioned by decade's end.

In spite of these problems, though, there were also some positive signs at the end of the 1980's that the environmental movement was expanding its focus and preparing to move forward. Some of the major environmental groups began to reach out to local grassroots groups that were dedicated to achieving environmental justice, thereby expanding both their focus and their support base. In some cases, these outreach efforts helped revive the spirit of grassroots activism that had permeated the environmental movement during the 1970's. Both the Sierra Club and Friends of the Earth appeared to be turning back to their earlier activist roots by the end of the decade. The rise of new issues such as global warming also helped revitalize the environmental movement. Finally, American environmentalists began to forge alliances with environmentalists elsewhere in the world, laying the groundwork for a genuinely international movement.

Impact
The environmental movement underwent several transformations during the 1980’s, largely in response to the economic and political climate of
the decade. The movement had become focused on federal regulation in the 1970's in the wake of its success in convincing President Richard M. Nixon to create the Environmental Protection Agency. The White House of the 1980's, however, was committed to rolling back federal regulation, so lobbying organizations were forced to fight not to improve the situation but simply to maintain the gains they had already made. Moreover, environmental organizations found that they were not immune to the increasing corporatization and bureaucratization of all aspects of American society, and they struggled to negotiate the consequences of those trends. The movement also confronted the same issues of diversity operative elsewhere in 1980's American culture, as it was accused of being too white, too male, and too middle-class in its orientation. By the end of the decade, environmentalism was struggling to recapture some of the public's enthusiasm, but it seemed ready to turn a corner as it entered the 1990's.

Further Reading
Dowie, Mark. Losing Ground. Boston: MIT Press, 1996. Argues that the major environmental groups lost their focus beginning in the 1980's.
Gottlieb, Robert. Forcing the Spring. Rev. ed. Washington, D.C.: Island Press, 2005. The leading history of the environmental movement, with a focus on events between 1960 and 1990.
Lewis, Martin W. Green Delusions. Durham, N.C.: Duke University Press, 1992. Critique of radical environmentalism.
Sale, Kirkpatrick. Green Revolution: The American Environmental Movement, 1962-1992. New York: Hill and Wang, 1993. Another history of the movement, offering some information different from that in Gottlieb's work.
Snow, Donald, ed. Inside the Environmental Movement: Meeting the Leadership Challenge. Washington, D.C.: Island Press, 1992. Essays on some of the leadership issues faced by the environmental movement in the 1980's.
John M. Theilmann
See also Air pollution; Business and the economy in the United States; Conservatism in U.S. politics; Multiculturalism in education; Reagan, Ronald; Water pollution; Watt, James G.
■ Epic films
Definition: Film genre defined by its broad narrative scope and corresponding stylistic grandeur
Historical epics became a relatively minor genre during the 1980's. The lavish epics of Hollywood's golden era became cost-prohibitive, as the major studios shifted their emphasis from film production to distribution, and independent production companies ushered in an era of successful teen-oriented dramas and comedies. Science-fiction epics, however, proved to be an exception to this trend.

The fate of Heaven's Gate (1980), an epic Western set in nineteenth century Montana, helped convince Hollywood to abandon the epic genre in the 1980's. Directed by Michael Cimino—whose Vietnam War epic, The Deer Hunter (1978), had been widely acclaimed—the film was set during the Johnson County Wars between cattle barons and poverty-stricken immigrants. Originally running over five hours, the film was roundly panned by critics and did little to recoup its unprecedented $40 million in costs. The film's production studio, United Artists, never recovered financially and was eventually absorbed by rival Metro-Goldwyn-Mayer (MGM). After such a debacle, few Hollywood studios—most of which were owned by larger companies such as Gulf & Western and Transamerica Corporation—were willing to mount expensive productions entailing such financial risk. Relatively inexpensive teenage dramas and comedies, as well as more expensive but more reliable "franchise" action pictures (including the Die Hard, Rambo, and Indiana Jones films), consistently proved the most successful films financially, if not always critically, during the decade.

The Science-Fiction Epic
The exceptions to the rule were the sequels to George Lucas's Star Wars (1977), as well as other science-fiction films such as Back to the Future (1985) and E.T.: The Extra-Terrestrial (1982), the latter of which was the single most successful film of the decade. Lucas's incredibly successful space epic allowed him to establish his own company to produce The Empire Strikes Back (1980) and Return of the Jedi (1983). The initial Star Wars trilogy qualifies as epic for its breadth of vision, both literal and figurative. Lucas's effects team skillfully created a galaxy of exotic creatures and locales. The narrative, borrowing motifs from the Hollywood Western, Japanese samurai films, and archetypal myth, follows the rise of prototypical cultural hero Luke Skywalker from obscurity to triumph, as he eventually destroys the evil Empire. Adjusted for inflation, the original Star Wars trilogy is easily one of the most financially successful film series of all time. Moreover, in true epic tradition, the films' popularity created a modern mythology: terms such as "hyperdrive" and "the Force" and characters such as Luke Skywalker, Princess Leia, Yoda, and Darth Vader attained global currency.

Some Hollywood films reappraising the Vietnam War, such as Oliver Stone's Platoon (1986) and Stanley Kubrick's Full Metal Jacket (1987), were loosely labeled "epic" by a few critics, but both films were in fact smaller in scope and more dramatically intimate than, for instance, Francis Ford Coppola's mythopoetic Vietnam epic Apocalypse Now (1979). Similarly, Martin Scorsese's The Last Temptation of Christ (1988), with Jesus as an unwilling martyr, was not an epic so much as an ironic inversion of the biblical epic tradition of Hollywood's Golden Age.

Epics Abroad
The historical epic remained alive in the 1980's largely through the work of non-American producers and directors. Great Britain's Sir Richard Attenborough produced and directed one of the most widely celebrated films of the decade, his epic film biography of Indian nonviolent revolutionary Mahatma Gandhi. Gandhi (1982) chronicled not only the life of its individual subject but also his pivotal role in bringing about the end of the British colonial system following World War II. The film won the Academy Awards for Best Picture, Best Director, and Best Actor (Ben Kingsley, in the lead role). A few years later, working with an international cast and international financing, Italian Bernardo Bertolucci directed The Last Emperor (1987). Also an epic retelling of the end of an imperial system, the film followed the last emperor's life from his privileged childhood in the Forbidden City to his final days as a humble gardener under Communist rule. The film won nine Academy Awards, including Best Picture and Best Director. Perhaps the greatest epic film of the decade was Japanese master Akira Kurosawa's Ran (1985; confusion). A loose retelling of Shakespeare's King Lear (pr. c. 1605-1606), Ran tells a tale of personal and national disintegration in which the Lear-like lead figure's inner chaos culminates in a panoramic battle in which there are no
winners. The film has been interpreted both as an exercise in self-analysis, in which Kurosawa worked through his own battles with depression, and as symbolic of Japan's involvement in World War II.

Impact
The bottom-line mentality of the corporations owning the major Hollywood studios led to a decline in the production of traditional epic spectacles, even as science-fiction spectacles became increasingly popular. The relative lack of competition from Hollywood allowed foreign film companies to gamble on historical epic productions that not only provided spectacle but also offered interpretations of history for their contemporary audiences, both in the United States and abroad. After the disillusionment of the post-Vietnam War era of the 1970's, the anti-imperialist messages of epics such as Gandhi and The Last Emperor found an enthusiastic audience among American filmgoers. The prestige and seriousness of such films, along with their healthy box-office receipts, would lead Hollywood to rediscover the historical epic in the 1990's in productions ranging from Dances with Wolves (1990) to Braveheart (1995).

Further Reading
Prince, Stephen. The New Pot of Gold: Hollywood Under the Electronic Rainbow, 1980-1989. Berkeley: University of California Press, 2002. Historical analysis of the transition of the Hollywood film industry from production to distribution. Also examines the role of new technologies such as the videotape and cable television. Part of the prestigious History of the American Cinema series.
Sklarew, Bruce, et al., eds. Bertolucci's "The Last Emperor": Multiple Takes. Detroit: Wayne State University Press, 1998. Developed from a scholarly symposium, this text collects essays that reconsider Bertolucci's epic from a variety of academic disciplines and perspectives.
Sobchack, Vivian. "'Surge and Splendor': A Phenomenology of the Hollywood Historical Epic." Representations 29 (Winter, 1990): 24-49. Provides an overview of the epic genre, including an analysis of its diminution during the 1970's and 1980's.
Wright, Will. "The Empire Bites the Dust." Social Text 6 (Autumn, 1982): 120-125. Argues that the Star Wars trilogy is essentially an epic Western set in space: The Empire is a technology-driven modern society, which the individualistic hero Luke must combat with his humanizing "Force."
Luke Powers
See also Academy Awards; Back to the Future; Empire Strikes Back, The; Film in the United States; Ford, Harrison; Full Metal Jacket; Heaven's Gate; Last Temptation of Christ, The; Platoon; Scorsese, Martin; Stone, Oliver.
■ Erdrich, Louise
Identification: Native American author
Born: June 7, 1954; Little Falls, Minnesota
An Ojibwe, French, and German American writer, Erdrich became an important voice in an increasingly multicultural American literary culture.

Louise Erdrich gained prominence in 1984, when she won the National Book Critics Circle Award for her novel Love Medicine (1984; revised and expanded, 1993). This innovative tribal saga presents its story through multiple narrators. The author used this technique to represent Native American traditions of storytelling and egalitarianism. Erdrich, a member of the Turtle Mountain Band of Chippewa of North Dakota, was raised by parents who worked at a North Dakota boarding school for Native Americans. Her mother's French Ojibwe culture and her German American father's traditions came to inform her writing. Erdrich was inspired as a teen by Kiowa author N. Scott Momaday, who won the Pulitzer Prize in 1969 for his novel House Made of Dawn (1968). His example inspired a literary movement that came to be known as the Native American Renaissance. By the 1980's, greater numbers of young Native American writers like Erdrich pursued college educations rather than training for vocations. Erdrich attended Dartmouth College and Johns Hopkins University, where she earned an M.A. in creative writing in 1979. She married writer Michael Dorris in 1981 and wrote collaboratively with him until his death in 1997.

Erdrich's novels challenge stereotypes of Ojibwe people. Her Native American characters embody a pantheon of spiritual beings that includes Deer Woman, Windigo, and Missepeshu (Water Monster). Humor and other Ojibwe sensibilities appear in her works. She weaves French Catholic traditions alongside these. Her work also often represents a Native American sense of time. For example, Love Medicine's narrative begins in 1980, proceeds backward in time to 1930, and then returns to 1980. This rearrangement of chronology connects the past to the
present and emphasizes the present importance of history. Erdrich once told a reviewer: "I'll follow an inner thread of a plot and find that I am actually retelling a very old story, often in a contemporary setting." Rather than celebrate the colonizer's conquest of the wilderness, Erdrich challenges the right of the United States to control North America. Her books depict the devastation wrought by European colonizers upon the original inhabitants of the continent's upper midwestern forests and prairies during the nineteenth century. Erdrich followed Love Medicine with three related books, The Beet Queen (1986), Tracks (1988), and The Bingo Palace (1994). The Bingo Palace addresses the impact of casinos and the Indian Gaming Regulatory Act. She also published books of poems, Jacklight (1984) and Baptism of Desire (1989). Other books published into the twenty-first century include poetry, memoir, children's literature, essays, and further novels.

Louise Erdrich. (Michael Dorris)

Impact
Erdrich was one of a group of extremely significant women of color who together changed the landscape of U.S. literary production in the 1980's. Others included Maxine Hong Kingston, Toni Morrison, and Gloria Anzaldúa, each of whom added her voice to a rapidly diversifying marketplace of ideas during the decade. Erdrich's 1980's writings in particular challenged the American myth of the open frontier. She gave voice to the dynamic communities that originally resided in the upper Midwest, and she also portrayed nuanced interactions between them and various European settler groups. These works redefined racial categories as well as introducing Ojibwe histories and narrative structures into American literary traditions.

Further Reading
Beidler, Peter G., and Gay Barton, eds. A Reader's Guide to the Novels of Louise Erdrich. Columbia: University of Missouri Press, 1999.
Chavkin, Allan, ed. The Chippewa Landscape of Louise Erdrich. Tuscaloosa: University of Alabama Press, 1999.
Wong, Hertha Dawn Sweet. Louise Erdrich's "Love Medicine": A Casebook. New York: Oxford University Press, 2000.
Denise Low
See also
Beloved; Indian Gaming Regulatory Act of 1988; Literature in the United States; Multiculturalism in education; Native Americans; Poetry; Religion and spirituality in the United States.
■ E.T.: The Extra-Terrestrial
Identification: Science-fiction film
Director: Steven Spielberg (1946-    )
Date: Released June 11, 1982

E.T. was the most successful film of the 1980's, earning more than three-quarters of a billion dollars worldwide. By incorporating special effects and science fiction into a story that was nonetheless driven primarily by sentiment and human situations, Spielberg captured the imaginations of people throughout the globe and altered the course of his own career.

E.T.: The Extra-Terrestrial (1982) tells the story of a boy who befriends a stranded alien, protects it from government captivity, and helps it contact its people
so it can be rescued. Nine-year-old Henry Thomas played the boy, Elliott, in a performance critic Roger Ebert called the best little-boy screen performance he had ever seen. Robert MacNaughton and Drew Barrymore played Elliott's siblings, and Dee Wallace played his mother. It was an intensely personal story for director Steven Spielberg, whose parents' divorce led to a lonely childhood much like the one he gave to Elliott. By combining a science-fiction premise with realistic familial situations and interactions, Spielberg transformed the former genre, using it to tell a type of story that before had had no place in mainstream science fiction.

The movie was the highest-grossing film to that time and the most successful film of the decade. It took in more than $435 million domestically and almost $793 million worldwide. It was nominated for nine Academy Awards and won four, for Best Sound Effects Editing, Best Sound, Best Visual Effects, and Best Original Score. The majestic and sensitive score by John Williams also won Golden Globe, Grammy, and BAFTA Film awards.

Spielberg had been developing the idea of the story for years, and when he pitched it to screenwriter Melissa Mathison, she produced a first draft in eight weeks. E.T. juxtaposed its family drama not only with science fiction but also with religious and mythic imagery (the film's poster featured a human finger reaching out to touch a finger of the alien, a conscious allusion to Michelangelo's painting The Creation of Adam). The movie also incorporated self-conscious references to previous science-fiction tales. E.T. gets his idea for communicating with his departed ship from a Buck Rogers newspaper comic strip. He watches on television a flying saucer sequence from the 1955 movie This Island Earth. The little alien's "hand" resembles that of the Martian invaders from the 1953 film adaptation of H. G. Wells's War of the Worlds (1898), produced by George Pal. Spielberg even re-created the sequence from that film in which a Martian hand reached out and touched a character's shoulder. In the 1953 film, it was a moment of horror, but when E.T. touches Elliott's shoulder, it is an affectionate and reassuring gesture.

Impact
Before E.T., science-fiction films set on Earth were most commonly akin to horror films. The image of aliens in such films was malignant. They came to drain humans’ blood (The Thing,
1951), wipe them out (1953's Invaders from Mars and War of the Worlds), steal Earth's scientists (This Island Earth, 1955), replace humanity (Invasion of the Body Snatchers, 1956), or worse. Even the relatively benign aliens in The Day the Earth Stood Still (1951) and It Came from Outer Space (1953) threatened, respectively, to "reduce this Earth of yours to a burned-out cinder" if humanity did not abandon its warlike ways or to wipe out a hostile party of men who failed to understand that the aliens merely wanted to repair their spacecraft and leave. Spielberg's Close Encounters of the Third Kind (1977) came closest to presenting harmless aliens, but even those had kidnapped humans over the years for undisclosed reasons.

With E.T., Spielberg gave audiences cinema's first cuddly alien. Equally important, the antagonists in the film were representatives of the bureaucratic federal government, who posed a danger to the alien and the boy who protected him. Other movies with friendly aliens, such as Starman (1984) and Cocoon (1985), quickly followed, showing that humankind could indeed get along with visitors from other worlds. (Director John Carpenter's Starman even pays tribute to It Came from Outer Space by reproducing its opening sequence.) These films all featured people coming to understand the unknown and to discover that it was not frightening after all. They seemed to resonate with audiences in the 1980's that were tired of the fear and xenophobia of the Cold War.

The original poster for E.T. highlighted the film's religious themes by referencing Michelangelo's famous painting, The Creation of Adam, from the ceiling of the Sistine Chapel. (Hulton Archive/Getty Images)
Bouzereau, Laurent, and Linda Sunshine, eds. "E.T.: The Extra-Terrestrial"—From Concept to Classic. New York: Newmarket Press, 2002. Collection of essays exploring the making of the film and its impact on American culture.
Freer, Ian. The Complete Spielberg. London: Virgin Books, 2001. Exploration of the filmmaker's life and career.
Friedman, Lester D., and Brent Notbohm, eds. Steven Spielberg: Interviews. Jackson: University Press of Mississippi, 2000. Compilation of major interviews given by Spielberg that gives a coherent picture of his approach to filmmaking.
Powers, Tom, and Martha Cosgrove (contributor). Steven Spielberg. Minneapolis: Lerner, 2005. Useful biography of the director. Part of the Just the Facts biographical series.
Paul Dellinger
See also
Blade Runner; Empire Strikes Back, The; Film in the United States; Science-fiction films; Special effects; Spielberg, Steven.
■ Europe and North America
Definition: Diplomatic and economic relations between European states and the United States and Canada
Relations between North America and Europe in the 1980’s were dominated by Cold War politics and by the agenda of the American president, Ronald Reagan, who sought support from Western Europe in his aggressive resistance to communism.
As the 1980's opened, the hope that détente could be achieved between Eastern Europe and the West was all but dead. Ronald Reagan, the conservative Republican president elected in 1980, appeared to put the final nails in the coffin of diplomatic understanding when he took office: He denounced the Soviet Union, calling it the "Evil Empire," and he determined that the proper strategy for dealing with the Soviets was not to forge arms control agreements but rather to expand the U.S. military budget, hoping to bring Moscow to its knees by outspending it. As the decade progressed, however, the situation changed dramatically. Soviet premier Mikhail Gorbachev instituted a program of liberalization and reform that—together with Soviet worries over the Chernobyl nuclear disaster of 1986 and the nation's domestic economic problems—eventually led to a new era of U.S.-Soviet cooperation and, indeed, to the collapse of the Soviet Union at the end of the decade.

Reagan and Europe
The attitudes toward Reagan's policies in the other countries of Europe were mixed, although Washington's leadership remained firm. In the United Kingdom, Conservative Party leader Margaret Thatcher had become prime minister in 1979 and stood firmly with Reagan in his anti-Soviet attitude, as well as his conservative economic policies. French President François Mitterrand had less respect for Reagan. He once asked Canadian prime minister Pierre Trudeau, "What planet is he living on?" Even Margaret Thatcher once remarked of Reagan, "Poor dear, he has nothing between his ears." Despite her condescension, Thatcher backed Reagan's foreign policy initiatives, especially in Eastern Europe. One of the major Eastern European concerns among Western leaders was Poland, whose communist government was encountering organized domestic resistance from the members of the Solidarity trade union. Most North Atlantic Treaty Organization (NATO) members aggressively supported Solidarity against the communists. When Polish prime minister Wojciech Jaruzelski sought to retain control of his country by imposing martial law in 1981, NATO protested vigorously. Meanwhile, North American and Western European bankers grappled with the question of how to treat outstanding Western loans to the Polish government, as Warsaw considered declaring bankruptcy over the crisis.
Reagan also sought to strengthen the U.S. strategic position by pressuring Western European nations to allow him to deploy medium-range Pershing and cruise missiles equipped with nuclear warheads within their borders, where they would be within striking distance of Eastern European targets. He wanted Western nations to increase their military budgets, as the United States had done, and to adopt all of his anti-Soviet policies. Many European leaders resisted, believing that a softer approach to Moscow was more prudent. They viewed Reagan as a dangerous "cowboy," shooting from the hip, although many of his policies had been employed by President Jimmy Carter's administration as well. The German government in particular resisted missile deployment, because German public opinion was so strongly against it. Reagan knew this, but he hoped that his hard-line public statements might help the government to resist the will of its people. Arms reduction talks continued in a haphazard manner throughout the decade, until Gorbachev came to power and a treaty was worked out in 1987.

Reagan objected to contracts between the European Community and the Soviet Union to build natural gas pipelines. As a result, in 1982, Washington banned the use of any U.S. technology in the project. The ban was lifted later in the same year, however, once the administration reached an agreement with the European Community on trade policies. By the end of 1982, many European member-states of NATO had decided to ignore the objections of antinuclear and nationalist movements within their borders and were moving toward agreement with Washington to deploy intermediate-range nuclear missiles. The American proposal was called the zero option solution, and it linked reduction of U.S. missiles in Western Europe to a corresponding reduction of Soviet intermediate-range missiles and mobile launchers in Eastern Europe. Vice President George H. W. Bush traveled to Europe to lobby for the American plan. He also visited Romania, showing support for President Nicolae Ceauşescu, who publicly opposed Moscow on a number of issues.

In the Western Hemisphere, Washington gave some support to London during the Falkland Islands War of 1982, but it chose to remain officially neutral during the conflict. The British reacted with bitterness to this decision by a supposed ally. Meanwhile, Greece under Andreas Papandreou took issue with American bases and nuclear missiles being located within the country but ultimately relented, because it needed U.S. support against Turkey. Spain also renewed its treaties with the United States, allowing it to maintain bases in Spanish territory until at least 1988. Spain was accepted into NATO in 1982.

Canadian-European Relations
Prime Minister Trudeau disagreed with Reagan about linking disarmament to Soviet behavior. He also chastised Thatcher when she visited Canada, accusing her of engaging in "megaphone diplomacy." However, beginning in 1984, when Conservative Brian Mulroney became prime minister, Canada supported Reagan's policies as well. In 1984, Reagan and Trudeau traveled to Normandy to celebrate the fortieth anniversary of the Allied invasion of France in World War II. Reagan took the opportunity to denounce the Soviets once more for remaining in control of Eastern Europe even after the war was over.

Canada in 1982 severed its last legal ties to the United Kingdom with the passage of the Canada Act of 1982, and it became a fully sovereign nation. The cultural ties of most of the nation to the former home country and to its royal family still remained, however, causing continued tension between British Canadians and French Canadians. In fact, when Elizabeth II visited Canada to proclaim the constitution, Quebec sovereignists led by provincial premier René Lévesque refused to attend the ceremonies.

Impact
The conservative political shift and economic expansion witnessed by the United States in the 1980's was mirrored in many respects in the United Kingdom, and the Americans and British often allied to set the agenda for the rest of NATO during the decade. There were tensions within the alliance, especially regarding trade and the extent to which the governments of Europe should follow the dictates of Washington rather than those of their own people. On the whole, however, U.S.-European relations were defined by Cold War politics during what would prove to be that war's last decade. If Europeans were divided over Reagan's policies during the Cold War, moreover, they would continue in its aftermath to debate the extent to which those policies deserved credit for the revolutionary events in Eastern Europe that brought the decade to a close.

Further Reading
Goldstein, Walter, ed. Reagan’s Leadership and the Atlantic Alliance: Views from Europe and America.
Washington, D.C.: Pergamon-Brassey, 1986. Compiles essays on Reagan's policies written by American and European scholars in the middle of the decade. The majority are favorable toward the president.
Granatstein, J. L., and Robert Bothwell. Pirouette: Pierre Trudeau and Canadian Foreign Policy. Toronto: University of Toronto Press, 1990. Analysis of the Liberal prime minister's foreign policy by two of Canada's most prolific historians.
Michaud, Nelson, and Kim Richard Nossal, eds. Diplomatic Departures: The Conservative Era in Canadian Foreign Policy, 1984-1993. Vancouver: UBC Press, 2001. An analysis of Conservative prime minister Brian Mulroney's policies in the crucial years of the mid- and late 1980's by two distinguished political scientists.
Morley, Morris H., ed. Crisis and Confrontation: Ronald Reagan's Foreign Policy. Totowa, N.J.: Rowman and Littlefield, 1988. A collection of academic essays on U.S. foreign affairs in the 1980's, including two on Europe.
Nossal, Kim Richard. The Politics of Canadian Foreign Policy. Englewood Cliffs, N.J.: Prentice-Hall, 1985. Analysis of Canadian political institutions and the making of foreign policy by a distinguished scholar.
Smith, Steven K., and Douglas A. Wertman. U.S.-West European Relations During the Reagan Years: The Perspective of West European Publics. New York: St. Martin's Press, 1992. Account by two distinguished political scientists of Western Europeans' perceptions of their nations' diplomatic relations with the United States. Contains bibliographical material.
Vanhoonacker, Sophie. The Bush Administration, 1989-1993, and the Development of a European Security Identity. Burlington, Vt.: Ashgate, 2001. Examines and analyzes the Bush administration's policies regarding European security during the post-Cold War era.
Frederick B. Chary
See also
Berlin Wall; Foreign policy of Canada; Foreign policy of the United States; Haig, Alexander; Intermediate-Range Nuclear Forces (INF) Treaty; Lévesque, René; Mulroney, Brian; Pan Am Flight 103 bombing; Quebec referendum of 1980; Reagan, Ronald; Trudeau, Pierre; United Nations; West Berlin discotheque bombing.
■ Evangelical Lutheran Church in America
Definition: The largest Lutheran church in North America, formed by the merger of several Lutheran bodies
Date: Created January 1, 1988

After years of discussions and smaller mergers, the majority of North American Lutherans came together in one church headquartered in Chicago, the ELCA.

As a result of the mass northern European immigration to North America during the nineteenth century, Lutheranism grew quickly on the continent, as more than fifty Lutheran organizational groups were formed. Over time, these groups merged or disbanded until, by the 1960's, three large churches had emerged: the Lutheran Church in America (LCA), the American Lutheran Church (ALC), and the Lutheran Church-Missouri Synod (LCMS). Although the ALC and LCMS had declared altar and pulpit fellowship in the 1960's, the ALC's decision to ordain women pastors later led the LCMS to withdraw from the fellowship. At the same time, the LCMS began removing professors at Concordia Seminary in St. Louis who used critical-scientific methods to interpret the Bible. These professors established a seminary in exile (Seminex) in 1974 that continued to provide pastors to congregations until four LCMS district presidents were removed for ordaining those pastors. In 1976, three hundred LCMS congregations left the organization to form the Association of Evangelical Lutheran Churches (AELC), which immediately issued the "Call to Lutheran Union" and began meeting with the LCA and ALC to discuss merging the three groups.

On September 8, 1982, the LCA, the ALC, and the AELC each met in convention, simultaneously communicating with one another via telephone hookup. The three church bodies voted to merge into one. The seventy-member Commission for a New Lutheran Church, intentionally selected to represent a broad spectrum of church members, then met ten times to study the theological understandings of each predecessor body and to discuss ecclesiastical principles. The formal process to bring the Evangelical Lutheran Church in America (ELCA) into being began at a constituting convention in Columbus, Ohio, in May, 1987. The three previous bishops—James Crumley, David Preus, and Will
Herzfeld—stepped down with the election of the ELCA's first presiding bishop, Herbert Chilstrom. On January 1, 1988, the ELCA officially came into being, with headquarters in Chicago and sixty-five regional synods. The ELCA is considered a Protestant denomination, recognizing the two sacraments of Baptism and Communion and following the Old and New Testaments as its source of doctrine, as expressed in the sixteenth century Augsburg Confession. The name Evangelical Lutheran Church in America was carefully chosen with a traditional understanding of "evangelical," meaning "rooted in the gospel." Its place on the American scene was to be inclusive, with a goal of establishing 20 percent of new congregations in ethnic minority communities and providing 20 percent minority representation on all churchwide boards and committees. Outreach was centered on mission and service, seeking cooperation with other North American churches and the global community.

Impact
The formation of the ELCA established theological principles and organizational structures held in common by the majority of North American Lutherans. The five million Lutherans in the ELCA represent two-thirds of all Lutherans in the United States and compose the second largest Lutheran Church in the world and the fifth largest church body in North America.

Further Reading
Almen, Lowell G. One Great Cloud of Witnesses. Minneapolis: Augsburg Fortress, 2001.
Lagerquist, L. DeAne. The Lutherans: Student Edition. Denominations in America 9. Westport, Conn.: Praeger, 1999.
Skrade, Kristofer, ed. The Lutheran Handbook. Minneapolis: Augsburg Fortress, 2005.
Fred Strickert
See also
Religion and spirituality in Canada; Religion and spirituality in the United States.
■ Exxon Valdez oil spill
The Event: Environmentally devastating oil spill off the coast of Alaska
Date: March 24, 1989
Place: Prince William Sound, Alaska

The Exxon Valdez oil spill created an ecological and economic disaster, severely damaging a fragile ecosystem. In its aftermath, new laws were passed to attempt to prevent future disasters and to ensure that those events that could not be prevented would be cleaned up more effectively and punished more harshly.

At 12:04 a.m. on March 24, 1989, the Exxon Valdez oil tanker struck a reef in Prince William Sound, Alaska. The ship had left the Trans-Alaska Pipeline terminal in Valdez, Alaska, at 9:12 p.m. on March 23, 1989, loaded with 53,094,510 gallons of oil from Alaska's Prudhoe Bay production facilities. As the ship tried to negotiate the entrance to the harbor, called the Valdez Narrows, it struck Bligh Reef. At the time of impact, Captain Joe Hazelwood had left Third Mate Gregory Cousins in control of the wheelhouse, with Helmsman Robert Kagan steering the 986-foot ship.

The Ship Runs Aground
Captain Hazelwood had left the bridge at 11:52 p.m. At the time of impact, he was in his private quarters. Before leaving, he had alerted the U.S. Coast Guard that he was changing course because of some small ice calves that had drifted into the channel from the nearby Columbia Glacier. Shortly after departing from the normal shipping channel, the Exxon Valdez ran onto the reef. Once he was notified of the problem, Hazelwood contacted the Vessel Traffic Center at 12:27 a.m. and told officials that the ship had run aground on Bligh Reef.

As a result of the accident, nearly eleven million gallons of oil spilled out into Prince William Sound and dealt a devastating blow to the region's fragile environment. Threatened by the spill were animals including sea otters, whales, sea lions, and porpoises, while millions of birds and numerous species of plants also were negatively affected. The food chain of the fragile ecosystem was at stake because of the pervasive penetration of the spill into so much of the area. Isolated shorelines were coated with oil. At the time, neither the U.S. government nor the state of Alaska had a disaster-management system in place to address an oil spill with the scope and
severity of the Exxon Valdez spill, which was the largest in U.S. history. The magnitude of the spill and its potential impact overwhelmed existing systems. Thus, the response to the environmental disaster was poorly coordinated, slow, and incomplete. The immediate response was also hampered by poor weather conditions. After a few days, the winds had carried the spill forty miles away from Bligh Reef to the south and southwest. Eventually, the spill reached distant locations on Kodiak Island and the Kenai Peninsula of Alaska. Block Island, Smith Island, Cordova, and Green Island were all severely affected by the spill, as the city of Valdez quickly became the center of primary response activities.

Impact
The need to clean up the oil spill and to mitigate its effects on the environment as effectively as possible presented a problem that had few simple solutions. Many different methods were used in attempts to cleanse the affected seas and shorelines of
oil. Primary among the methods used to clean sea water were burning the oil, using dispersants to break up the oil, using ship-towed booms to corral oil, and skimming oil-contaminated water off the ocean's surface by mechanical means. Shoreline and beach clean-up efforts employed hot water under pressure, chemical cleaners, mechanical cleaning devices such as front-end loaders, manual efforts, and bioremediation, which attempted to use microbes to devour the oil. Even with extensive private and public efforts, however, the region sustained significant environmental damage, damage that lasted into the twenty-first century.

Numerous investigations were conducted to determine responsibility both for the spill and for the problems that occurred afterward. The responsibilities of Exxon and Captain Hazelwood were later determined by the courts. Hazelwood had been witnessed drinking alcohol in Valdez before the ship departed. He was later criminally convicted for the negligent discharging of oil and was fired by Exxon. Exxon was sued and in 1994 was found negligent by a jury in Anchorage, Alaska. The company was ordered to pay $287 million in actual damages and $5 billion in punitive damages. (These damages were later reduced to $2.5 billion by the Ninth Circuit Court of Appeals in December, 2006.) Beyond these financial damages, the spill was the ultimate public-relations disaster for the corporation, which was vilified for years for its perceived indifference to the dangers posed to the environment by supertankers.

A tugboat guides the crippled Exxon Valdez through the Prince William Sound after the tanker was extricated from Bligh Reef on April 5, 1989, nearly two weeks after it ran aground. (AP/Wide World Photos)

Subsequent Events
Additional measures were undertaken by the state and federal governments to address the poor response to the spill. The U.S. Congress passed the Oil Pollution Act in 1990, ordering new prevention efforts, creating higher levels of corporate liability, and establishing a response fund for the federal government to prepare for future disasters. Alaska's legislature also responded quickly, passing new laws in the year after the spill. These laws increased the state's oil-disaster response fund and established measures to prevent future spills, as well as new oversight processes. Increased penalties for polluters were also instituted, along with the creation of a new state department specifically tasked with overseeing oil spills in Alaska.
Further Reading
Bryan, Nichol. Exxon Valdez: Oil Spill. Milwaukee: World Almanac Library, 2004. Youth reference that includes excellent photos and timelines.
Keeble, John. Out of the Channel: The Exxon Valdez Oil Spill in Prince William Sound. Cheney: Eastern Washington University Press, 1999. Balanced account of the spill and clean-up efforts.
Owen, Bruce M., et al. The Economics of a Disaster: The Exxon Valdez Oil Spill. Westport, Conn.: Quorum Books, 1995. Assesses the economic impact of the spill.
Picou, J. Steven. Exxon Valdez Disaster. Dubuque, Iowa: Kendall/Hunt, 1999. Investigates the Exxon Valdez disaster and related problems connected to corporate and governmental failures.
United States Congress. Senate Committee on Commerce, Science, and Transportation. Exxon Oil Spill: Hearing Before the Committee on Commerce, Science, and Transportation. Washington, D.C.: Government Printing Office, 1989. Detailed investigation of the spill by the congressional committee.
Douglas A. Phillips
See also
Environmental movement; Water pollution.
F

■ Facts of Life, The
Identification: Television comedy series
Date: Aired from August 24, 1979, to May 7, 1988
The Facts of Life centered on the adolescent experiences of a group of girls attending a private boarding school. It offered archetypes and role models for female teens of the decade.

The Facts of Life began as a summer offering at the National Broadcasting Company (NBC) in 1979, and in early 1980 the network decided to bring the program back as a regular series. It was a spin-off of the hit Diff'rent Strokes (1978-1986) and featured that show's housekeeper, Edna Garrett (Charlotte Rae), as the housemother at Eastland Academy, a private boarding school for girls in upstate New York. The first season's regular cast included seven students, but the cast was too cluttered for a half-hour show, and in the second season only four students were regular characters. (One of the girls written out of the show at the beginning of the second season was Molly Ringwald, who went on to become a top teen film star of the decade.) The sitcom's setting also shifted slightly in the second season, from the dormitory living room to the campus cafeteria and student lounge, as the focus narrowed to the four Eastland Academy students who lived with and worked for Mrs. Garrett, who became the school's dietician. The four featured student roles included Blair, the spoiled rich girl; Natalie, the perky heavy-set girl; Tootie, the naïve younger African American girl; and a new character, Jo, the tough working-class girl on a scholarship. The seasons rolled by, and as the four girls aged and graduated, they continued to live and work with Mrs. Garrett, running a bakery that then became a gift shop. George Clooney, at the time unknown, joined the cast in a supporting role in 1985; Rae left the series in 1986, replaced by Cloris Leachman as Mrs. Garrett's sister Beverly Ann. The infectious music of the theme song was lighthearted and welcoming, while the lyrics reminded viewers that "the facts of life are all about you."
Positive self-worth and self-realization were regularly stressed in the characters' experiences, but the students were also allowed material comforts and some glamorous travel. The Facts of Life girls had international adventures in two feature-length made-for-television movies: The Facts of Life Goes to Paris (1982) and The Facts of Life Down Under (1987). As the series went on, the young women blossomed: Blair found compassion, Jo developed confidence, Tootie grew shapely, and the bookish Natalie—in a ratings ploy—was the first to lose her virginity.

Impact
The Facts of Life created four memorable characters in Blair, Jo, Natalie, and Tootie. Adolescent and preteen girls in the 1980's could select any one of the four as the one they most identified with, and through regular viewing of the series, they would see their favorite triumph in positive ways, while learning to live in harmony with other young women under the guidance of an older, wiser matriarch.

Further Reading
Dalton, Mary M., and Laura R. Linder, eds. The Sitcom Reader: America Viewed and Skewed. Albany: State University of New York Press, 2005.
Winzenburg, Stephen M. TV's Greatest Sitcoms. Frederick, Md.: PublishAmerica, 2004.
Scot M. Guenter
See also Brat Pack in acting; Golden Girls, The; Sitcoms; Television.
■ Fads
Definition: Widely popular but short-lived fashions, entertainments, and products
The material culture of the 1980’s produced many fads, as consumers willing to spend money on new products, diets, and fashions found themselves chasing the “next big thing” throughout the decade.
The 1980's was known as a decade of consumerism. Yuppies (young upwardly mobile professionals or young urban professionals) enjoyed high disposable incomes and had a propensity to spend them. Reacting to the perceived selfishness driving this conspicuous consumption, novelist Tom Wolfe, author of The Bonfire of the Vanities (1987), labeled the baby-boom generation the "Me generation." The self-focus of the Me generation gave rise not only to self-indulgence but also to a desire for self-improvement and increased health consciousness. All three factors created significant marketing opportunities, and mass-marketed fad products proliferated throughout the decade.

Toys and Games
Toys and Games New technology led to advances in video games and to a surge in the popularity of both video arcades and home video-game systems such as those manufactured by Atari and Nintendo. One of the most popular and well-known of the video games that sparked 1980's fads featured the little, round, yellow, dot-gobbling character known as Pac-Man. Pac-Man achieved worldwide fame in the early 1980's as an arcade game produced by the Japanese company Namco. Ms. Pac-Man would follow shortly thereafter. Even the briefly popular television character Max Headroom was representative of the growing emphasis on video technology. Children throughout the United States sought out Cabbage Patch Kids, dolls that came with their own names, personalities, and birthdays. The immense popularity of Cabbage Patch Kids spawned rivals, as well as collectible trading cards that featured the Garbage Pail Kids. Other popular toys included Strawberry Shortcake, Rainbow Brite, and the Care Bears, who came in a variety of pastel colors and names. Children in the 1980's were also enamored of little blue creatures called Smurfs and four crime fighters known as the Teenage Mutant Ninja Turtles. Many of these characters spawned mass merchandising campaigns, which often included movies, television shows, accessory lines, and video and board games. Adolescents and adults purchased the hand-held mechanical puzzle known as Rubik's Cube, released worldwide at the beginning of the decade. The surface of the cube was covered with fifty-four variously colored squares, and the object was to manipulate them so that each of the cube's six faces contained nine squares of the same color. Popular games included hacky sack and tetherball. The board game Trivial Pursuit, in which players answered trivia questions in a variety of categories, also enjoyed a brief period of immense popularity. While all these games remained on the market after the end of the 1980's, they achieved the height of their popularity in that decade.

Entertainment During the 1980's, horror movies were staples at the box office. Many original horror films spawned franchises during the decade, as multiple sequels were produced to such movies as Poltergeist (1982), Halloween (1978), Friday the 13th (1980), A Nightmare on Elm Street (1984), Hellraiser (1987), and Child's Play (1988). Teen movies were also popular, especially those starring members of the so-called Brat Pack—including Emilio Estevez, Anthony Michael Hall, Rob Lowe, Andrew McCarthy, Demi Moore, Judd Nelson, Molly Ringwald, and Ally Sheedy—and directed or written by John Hughes. Well-known movies of this type included The Breakfast Club (1985), Fast Times at Ridgemont High (1982), Pretty in Pink (1986), Sixteen Candles (1984), St. Elmo's Fire (1985), and Weird Science (1985). Later in the decade, films such as Heathers (1989) and Less than Zero (1987) attempted to reach the same audience in a different fashion. Recorded music became portable entertainment as people carried large radios known as "boom boxes" or small personal stereos with headphones, such as the popular Sony Walkman. Popular music such as New Wave, punk, hip-hop, and hair metal gave rise to new fad fashions and new popular dances, such as break dancing. Comedian Gallagher achieved fame by entertaining plastic-draped audiences with his Sledge-O-Matic, which smashed watermelons and other objects.

Fashion and Language Successful yuppies wore power suits and shoulder pads to the office, while "preppies" favored Izod clothing. The hit television series Miami Vice sparked a brief surge in the popularity of pastel suits, worn with T-shirts rather than dress shirts and ties, as well as white shoes. The 1980's health craze made workout clothes popular attire, both inside and outside the gym. Popular 1980's hairstyles included so-called big hair, maintained by can after can of hairspray, and the mullet, a style in which hair was worn shorter on the sides and longer in the back. Adolescents, influenced by fashionable music
superstars such as Madonna and Michael Jackson, favored denim jackets, piles of chunky jewelry, leg warmers, lingerie worn outside of clothing rather than underneath, and either fingerless lace gloves or a single sequined glove on one hand. They also wore slap bracelets, which conformed to the wearer's wrist when slapped on, and friendship bracelets. Tennis shoes, or sneakers, such as those made by Converse and Vans, and plastic "jelly shoes" were popular footwear choices. Popular 1980's fashions also featured neon-bright colors. Clara Peller, in her eighties at the time, created a widely used 1980's catchphrase when she demanded, "Where's the beef?" in well-known television commercials for the Wendy's fast-food restaurant chain. Nancy Reagan's participation in the federal government's so-called war on drugs, meanwhile, gave rise to the catchphrase "Just Say No." Popular yellow signs displayed in many 1980's car windows reminded other drivers that there was a "Baby on Board." Valley girls spoke with distinctive accents, while adolescents everywhere adopted the slang expressions "Not!" and "psych" into their everyday vocabularies.
Impact The popularity of fads in the 1980's helps explain why the decade has become known for its excesses and extremes. Many fads, while enjoying only short-lived periods of widespread popularity, helped launch other products or media that went on to enjoy more durable fame. For example, Pac-Man helped bring widespread popularity to video games, thereby creating a mass market for both coin-operated machines and the home-based game systems that eventually replaced them. Fads also often become nostalgic symbols of the time in which they were popular, and many 1980's products, such as Cabbage Patch Kids, still have small but loyal fan clubs.

Further Reading
Batchelor, Bob, and Scott Stoddart. The 1980’s. Westport, Conn.: Greenwood Press, 2006. Explores the decadence of the Me Generation and its impact on popular culture; includes a timeline and a bibliography. Berger, Arthur Asa. Ads, Fads, and Consumer Culture: Advertising’s Impact on American Character and Society. 2d ed. Lanham, Md.: Rowman and Littlefield, 2003. Overview of advertising and consumer culture in the United States.
Best, Joel. Flavor of the Month: Why Smart People Fall for Fads. Berkeley: University of California Press, 2006. Sociological analysis of the life cycles of fads and the conditions needed to create them. Panati, Charles. Panati’s Parade of Fads, Follies, and Manias: The Origins of Our Most Cherished Possessions. New York: Harper Perennial, 1991. Covers a variety of fads from the period from 1890 to 1990. Smith, Martin J., and Patrick J. Kiger. Poplorica: A Popular History of the Fads, Mavericks, Inventions, and Lore That Shaped Modern America. New York: HarperCollins, 2004. Places the development of various fads in historical context. Marcella Bush Trevino See also Advertising; Cabbage Patch Kids; Consumerism; Dance, popular; Fashions and clothing; Gallagher; Hairstyles; Hobbies and recreation; Horror films; Leg warmers; Max Headroom; Pac-Man; Slang and slogans; Teen films; Toys and games; Video games and arcades.
■ Falwell, Jerry
Identification Baptist minister, television personality, and conservative political activist
Born August 11, 1933; Lynchburg, Virginia
Died May 15, 2007; Lynchburg, Virginia

As founder of the conservative political action group the Moral Majority, Jerry Falwell played a leading role in the efforts of conservative Christians to influence the political process during the 1980's.

Jerry Falwell was the pastor of the Thomas Road Baptist Church in Lynchburg, Virginia, a forerunner to the megachurches of later decades; host of a popular television program, The Old-Time Gospel Hour; and founder of Liberty Baptist College (later Liberty University). He was one of the most famous preachers in the United States during the late 1970's and the 1980's. In the 1960's, he had resisted engagement in the Civil Rights movement, arguing that preachers ought to concentrate on evangelism. He preached a sermon in 1965 titled "Ministers and Marches" that argued this position. By the late 1970's and thereafter, however, Falwell had become a proponent of active involvement by conservative Christians in the political process. Falwell came to national prominence at the end
of the 1970’s by using his influence as a so-called televangelist to form a political action group called the Moral Majority in the summer of 1979. This group has sometimes been credited with helping elect Ronald Reagan as president the following year. Led by Falwell, the Moral Majority condemned abortion, homosexuality, and the Equal Rights Amendment and supported a balanced budget and increased military spending. The group’s public prominence was short-lived, however, since Falwell dissolved it in 1989, although the Christian Coalition, founded by Pat Robertson in 1989, carried forward significant aspects of the Moral Majority’s agenda. Thereafter, the influence of the “Religious Right,” as Falwell and other conservative Christians came to be known, was still a prominent force in American politics. The extent of this political influence, though, was widely debated. Falwell remained a regular commentator on public events into the twenty-first century.
Impact The early twentieth century witnessed a decisive split between conservative and liberal Christians. After this division, conservative Christians tended to retreat into their own cultural enclaves, distant from public affairs, while it was the liberal Christians who were more likely to engage in activism. In the last half of the twentieth century, though, conservative Christians reacted to the U.S. Supreme Court's decisions removing government-sponsored prayers from public schools and, still later, to its opinion in Roe v. Wade (1973) granting constitutional protection to abortion rights. They began to take public action against these decisions and other events they found anathema to their values. Falwell helped leverage this public opposition into a more widespread engagement by conservative Christians in the political process, turning them into a significant lobbying force in American politics.

Further Reading
D’Souza, Dinesh. Falwell, Before the Millennium: A Critical Biography. Chicago: Regnery Gateway, 1984. Falwell, Jerry. Falwell: An Autobiography. Lynchburg, Va.: Liberty House, 1997. Harding, Susan Friend. The Book of Jerry Falwell: Fundamentalist Language and Politics. Princeton, N.J.: Princeton University Press, 2000. Timothy L. Hall See also Bakker, Jim and Tammy Faye; Conservatism in U.S. politics; Elections in the United States, 1980; Hustler Magazine v. Falwell; Moral Majority; Reagan, Ronald; Religion and spirituality in the United States; Robertson, Pat; Televangelism.
■ Family Ties
Identification Television comedy series
Date Aired from September 22, 1982, to May 14, 1989

A successful, long-lived situation comedy, Family Ties tackled family issues, social ideologies, and politics and helped make a star of Michael J. Fox.
Jerry Falwell, right, with Phyllis Schlafly at a Moral Majority news conference in 1984. (AP/Wide World Photos)
A distinctively 1980’s situation comedy, Family Ties was a long-running series that fused personal issues affecting the family with the societal conventions and politics particular to the decade. The show ran for seven seasons on the National Broadcasting
Company (NBC), debuting on September 22, 1982, and ending on May 14, 1989. Its initial ratings were disappointing, but the show's popularity skyrocketed by its second season, when it was strategically placed between the number-one-rated The Cosby Show and another popular sitcom, Cheers, on NBC's dominant Thursday-night lineup. At the height of its popularity, from 1985 through 1987, Family Ties was number two in the Nielsen ratings, receiving a 33 percent share of the U.S. viewing audience. Set in Columbus, Ohio, the show offered a unique spin on traditional family sitcoms: hip, liberal parents rebelling against their conservative, consumerist children. Flower children of the 1960's, Elyse (Meredith Baxter-Birney) and Steven Keaton (Michael Gross) were the parents of Alex P. (Michael J. Fox), Mallory (Justine Bateman), Jennifer (Tina Yothers), and a mid-series new addition, baby Andrew (Brian Bonsall). Steven ran public television station WKS, and Elyse was an architect. The couple tried to instill their liberal, democratic views along with their sense of civic duty and humanitarianism in their children. These views existed in humorous tension with Alex's conservatism, love of Ronald Reagan, and idolization of Richard M. Nixon and with Mallory's obsession with shopping and superficial appearances. Young Jennifer and Andy were caught in the middle of the two factions and often provided a voice of reason. The ideological battle provided comedic relief when the show tackled difficult and emotional issues, such as running away from home, addiction, suicide, teen pregnancy, and mental illness. In addition to launching Michael J. Fox's career, Family Ties featured an impressive cast of secondary characters and cameo appearances, including Tom Hanks, Geena Davis, and Courteney Cox, who all appeared on the show prior to becoming superstars.

The cast of Family Ties around 1982. Clockwise from bottom left: Justine Bateman, Meredith Baxter-Birney, Michael Gross, Tina Yothers, and Michael J. Fox. (Hulton Archive/Getty Images)
Impact Family Ties experienced incredible popularity because the show's premise was well suited to audiences of the 1980's. The show depicted a conflict in the values of successive generations that was very much in the mind of many Americans. By juxtaposing Steven and Elyse's commitment to their 1960's community-oriented principles against Alex's self-centered allegiance to Reaganomics and Mallory's commitment to spending money, Family Ties offered a humorous but recognizable account of the radically diverse—and at times diametrically opposed—political, social, and economic views held by Americans during the 1980's.

Further Reading
Dalton, Mary M., and Laura R. Linder, eds. The Sitcom Reader: America Viewed and Skewed. Albany: State University of New York Press, 2005. Marc, David. Comic Visions: Television Comedy and American Culture. Malden, Mass.: Blackwell, 1998. Mills, Brett. Television Sitcom. London: British Film Institute, 2005. Sara Vidar See also Back to the Future; Cheers; Cosby Show, The; Designing Women; Facts of Life, The; Fox, Michael J.; Golden Girls, The; Married . . . with Children; M*A*S*H series finale; Reaganomics; Sitcoms; Television; Wall Street; Wonder Years, The.
■ Farm Aid
Identification Nonprofit organization and benefit concerts
Date Organization established August, 1985; concerts held September 22, 1985, July 4, 1986, and September 19, 1987
Place Champaign, Illinois; Austin, Texas; and Lincoln, Nebraska

Farm Aid raised money on behalf of struggling U.S. farmers. It provided those farmers with information, access to services, and financial support during the farm crisis.

Inspired by the July, 1985, Live Aid concerts, singer Willie Nelson contacted Neil Young and John Cougar Mellencamp to establish the Farm Aid organization and plan a benefit concert to help farmers suffering during the agricultural crisis of the 1980's. Nelson aspired for music to bring urban and rural people together, educating them about farmers' problems. On September 22, 1985, the first Farm Aid concert was held. Nelson and Young opened the show, performing a song in tribute to farmers. Approximately eighty thousand people gathered in Champaign, at the University of Illinois's Memorial Stadium, to hear more than fifty musicians, most incorporating farm themes in their songs. Farmers attended in groups, some riding the Farm Aid Express train from Iowa to Champaign. Some performers told audience members to ask their congressional representatives to promote legislation supporting family farmers. The Nashville Network's live broadcast of the concert reached an audience of 24 million people on cable television. Approximately three hundred television stations and four hundred radio stations aired portions of the Farm Aid concert. Within a week, Farm Aid distributed $100,000 from the concert to groups and farmers in seventeen states for food and other uses. Farm Aid received a total of $7 million from the concert and post-concert donations. Nelson stated that some money would be used to establish a hotline, provide legal advice, and assist farmers stripped of land to secure new employment. Nelson had believed only one Farm Aid concert would be necessary, but the farm crisis continued.
He welcomed forty-five thousand concertgoers to Farm Aid II on July 4, 1986, at Manor Downs racetrack outside Austin, Texas. The following year, approximately seventy thousand people attended Farm Aid III on September 19, 1987, at the University of Nebraska's Memorial Stadium in Lincoln. Nelson told the audience that two hundred U.S. farms were lost daily, stressing the urgency of continuing to help farmers. Instead of staging another centralized concert in 1989, Nelson included Farm Aid as a component of sixteen of his regular concerts throughout the United States, arranging for press conferences with farmers to increase awareness of the farm crisis and each area's concerns.

Impact During the 1980's, Farm Aid raised over eleven million dollars to help farmers, providing resources for disaster assistance, medical care, and other relief. Farm Aid gave a $250,000 grant to fund the United Farmer and Rancher Congress at St. Louis, Missouri, in September, 1986, where agriculturists discussed improving farm policies. In June, 1987, Nelson and Mellencamp were called before Congress to describe Farm Aid's impact on the nation's farmers. Farm Aid helped finance such groups as the National Save the Family Farm Coalition, which worked to seek agricultural reforms. As the agricultural economy slowly improved, Farm Aid focused on enhancing the quality of farm products and methods.

Further Reading
George-Warren, Holly, ed. Farm Aid: A Song for America. Introduction by Eric Schlosser. Music essays by Dave Hoekstra. Concert photography by Paul Natkin and Ebet Roberts. Emmaus, Pa.: Rodale, 2005. Greenhouse, Steven. “Musicians Give Concert to Aid Nation’s Farmers.” The New York Times, September 23, 1985, p. A16. “Harvest Song: Willie Plans a Benefit.” Time 126 (September 23, 1985): 32. Elizabeth D. Schafer See also
Agriculture in the United States; Cable television; Country music; Farm crisis; Food Security Act of 1985; Live Aid; Mellencamp, John Cougar; Music.
■ Farm crisis
Definition Foreclosure on thousands of American farms, particularly in the Midwest
Date Took place between 1981 and 1987

The farm crisis restructured American farming and revealed new realities of U.S. agriculture. In the wake of the crisis, it became clear that the farm lobby's political clout had been significantly reduced from its former levels; that U.S. agriculture had become dependent upon international markets and was simultaneously threatened by international competition at home; that former client nations were themselves growing enough food to export, increasing that competition; and that large government subsidies had become necessary for U.S. farms to survive.

The United States entered the international grain market in the 1970's, at a time when there was a world food shortage. Prices for grain tripled between 1972 and 1974, as American farmers dominated the markets. Federal price supports increased their profits even more. To increase profits, farmers planted more acres and bought more land and equipment, going into debt. Farmland prices, which averaged $196 per acre in 1970, had increased to $796 per acre by 1981. Some farmers became millionaires when land values quadrupled. Large bank loans to farmers were based upon inflated appraisals, as farmers assumed their prosperity would continue. In 1979, Paul Volcker, head of the Federal Reserve, tightened the money supply in an effort to reduce double-digit inflation rates. Banks had less money to lend, so interest rates rose. The annual interest rates on some farm loans increased to more than 20 percent. As the 1980's began, however, farmers were still reaping profits from the international grain market. In 1981, at the trend's peak, the United States sold $44 billion worth of grain overseas.

The Crisis International markets had become extremely important to American farmers' profits; farmers sold one-third of their yield overseas. In December, 1979, the Soviet Union invaded Afghanistan, and in January, 1980, President Jimmy Carter responded by imposing a grain embargo against the Soviets. Overnight, America's farmers lost one of their most important grain markets. In 1981, when Ronald Reagan became president, the rate of inflation was 13.5 percent. More than three hundred farms folded; Secretary of Agriculture John Block predicted more would follow.
In 1982, the appraised worth of America's farmland crashed as the dollar devalued. Suddenly, tens of thousands of farmers had more debt than assets. A recession began, as the economy deflated. Many industrial jobs vanished. At the same time, there was a record world food harvest, so U.S. grain sales shrank. American harvests were so large there was a silo shortage. Grain prices hit bottom. The collective debt of the nation's farmers reached $21.5 billion. Farm income dropped by one-third, with farmers making $1 billion less than they had in 1929, at the beginning of the Great Depression. Inflation had decreased to 5.1 percent, but national unemployment had risen to 10.8 percent. Over 30,000 farms closed in 1982. Of the 270,209 farms borrowing money from the U.S. Department of Agriculture's Farmers Home Administration (FmHA), 66,470 were in arrears, including one-half of all borrowers in Florida. The number of foreclosed farms owned by the FmHA doubled in the first eight months of 1983. Secretary Block proposed the Payment in Kind program: Farmers who agreed not to plant one-half of their acreage would be allowed to take the amount they would otherwise have grown from government grain stores. They would then be able either to sell the government grain as if it were their own or to feed it to their livestock. By the middle of 1983, the national economy had begun to recover—but not on the farm. In 1984, 31 percent of FmHA loans were in arrears. Roughly 10 percent of Iowa's farms disappeared. Some sixty-four thousand farms owed a total of $30 billion, with a collective debt-to-asset ratio of 70 percent, meaning they were practically insolvent. This group constituted 11 percent of all mid-sized farms in the United States. In 1985, the average annual income of the members of the Kansas State Farm Management Association (who were presumably the state's best farmers) dropped to $4,822. Only three years earlier, it had been $11,053. Nearly 40 percent of farms asking for emergency aid in 1985 could not show a positive cash flow. They were judged not to be viable concerns, and their aid requests were rejected. A spokesman for the FmHA announced that the administration would work to save only those farms that had a chance of surviving. However, a moratorium was declared on federal land seizures, and a new federal program promised an extra $650 million in loan guarantees. President Reagan's 1985 farm bill phased out
price supports in order to allow the marketplace to determine prices. The FmHA would no longer loan money to farmers but would work through banks. Reagan said taxpayers should not be required to bail out every farmer in difficulty. The administration still gave $26 billion in subsidies to farmers. In March, 1985, Reagan joked to an audience, "I think we should keep the grain and export the farmers." During 1985, there was an increase in militant farm activism, shootings, and suicides. Rural extremist organizations, such as the Order, Posse Comitatus, and the Farmer's Liberation Army, gained members. The year 1986 witnessed the disappearance of 2.7 percent of the nation's remaining farms—60,310 farms, or about 165 farms per day, representing the greatest average daily loss since 1941-1945. International grain sales totaled only $23.6 billion. By the middle of 1987, however, newspapers were asking if the farm crisis might be over. About 15 percent of the remaining farms still had debt problems, down from 30 percent two years earlier. Farmers' debt was down by 35 percent from its 1982 level. Government subsidies totaled $5.6 billion, and farmers, although still in debt, were earning more money than ever. The federal government accounted for 50 percent of wheat growers' profits and 40 percent of corn growers' profits. By late 1987, the farm crisis was no longer front-page news. Many farmers left farming voluntarily.
Impact Many casual observers assumed that small farms went out of business because of their size. The main cause, however, was overwhelming debt. Mid-sized farms with small debt-to-earnings ratios did best in the crisis. Older farmers avoided much trouble because they did not borrow heavily. An Iowa State University study revealed that the average age of a midwestern farmer with a low debt-to-asset ratio was sixty-one. The average age of farmers with debt-to-asset ratios between 11 and 40 percent was fifty-three. The average age of farmers with ratios between 41 and 70 percent was forty-eight. The average age of farmers with ratios over 70 percent was forty-six. Predictions that the farm crisis would spread to white-collar workers proved incorrect. By the middle of the 1980's, the general economy outside the agricultural sector was booming. By the 1990's, parts of the Midwest had switched to successful service economies. The federal government continued to provide billions of dollars in farming subsidies annually, and
the choice of which crops to subsidize largely determined which crops would be produced on American farms.

Further Reading
Barlett, Peggy F. American Dreams, Rural Realities. Chapel Hill: University of North Carolina Press, 1993. Study of the experiences of residents of Dodge County, Georgia, during the farm crisis. Dudley, Kathryn M. Debt and Dispossession. Chicago: University of Chicago Press, 2004. Narrative account and oral history of the farm crisis in Minnesota. Dyer, Joel. Harvest of Rage. Boulder, Colo.: Westview Press, 1997. Examination of rural militancy; Dyer connects the farm crisis of the 1980’s to the Oklahoma City bombing of 1995. James Pauff See also Agriculture in the United States; Black Monday stock market crash; Business and the economy in the United States; Conservatism in U.S. politics; Demographics of the United States; Economic Recovery Tax Act of 1981; Farm Aid; Food Security Act of 1985; Globalization; Income and wages in the United States; Reagan, Ronald; Reaganomics; Recessions; Unemployment in the United States.
■ Fashions and clothing
Definition Articles of dress and accessories
Significance Fashion in the 1980's exhibited a sharp departure from the steady march of modernism, which had promoted the use of new synthetic fibers, wild prints, and bare styles such as hot pants. Instead, fashions of the decade incorporated past trends from different times and places, such as classical Egypt, the Victorian era, and 1970's "retro" styles. The emergence of vintage clothing as a favorite style of dress for the young demonstrated the importance of nostalgia to the decade, in clothing no less than in other facets of American culture.
As in many modern periods, youth fashions of the 1980's were distinct from the styles of dress of adults. The former tend to metamorphose rapidly, embracing constant change and diversity to express personal style. People over forty, on the other hand, are often more conservative in their clothing choices. Youth fashions during the 1980's were both
influenced by and contributed to American popular culture. Pop star Madonna and the movie Flashdance (1983), for example, both inspired distinctive styles of dress. Meanwhile, an increase in disposable income drove the national appetite for high fashion, as designers such as Donna Karan and Ralph Lauren marketed sophisticated clothing lines for women. Such lines were designed to be wearable in the workplace, an increasingly important part of women's lives.

Youth Fashion Madonna had a strong influence on youthful styles, from her wildly teased and colorful hair to her lace bodices and fishnet stockings. "Big hair," in fact, was one of the most noted styles of the 1980's, and the decade opened with strange hair colors and cuts. From 1985 to the end of the 1980's, asymmetrical cuts were the rage, with hair worn at different lengths on opposite sides of the head, or cut a different length in front than it was in back. Earrings became fashionable for male teenagers, and they added yet another symbol to the developing alternate sexuality counterculture. Some teens pierced their right ears to indicate that they were gay. A left-ear piercing thus came, to some, to indicate heterosexuality. Leg warmers also came into fashion early in the decade. Designed initially for dancers to wear over their thin tights between performances or during rehearsals, black leg warmers had been popular with amateur and professional performers for years. After Jennifer Beals wore them in Flashdance, however, they caught on in wider circles and became available in wild colors and weaves. They were worn over jeans and under dresses and were even layered in different colors. Nightclub patrons wore them over evening sandals or pumps. However, leg warmers enjoyed only a brief period of popularity, and the craze was over by 1985. A more sustained innovation of the 1980's was acid-washed denim. By 1987, young men and women discovered acid-washed jeans, manufactured utilizing a chemical process that stripped off the top layer of the denim to reveal a white surface with blue undertones. This style came in black as well, and a denim jacket could be worn to match the jeans. Acid-washed denim, or a fashion designed to imitate the original process, was still being worn a decade later. Another fashion was adopted by the youth
subculture that defined itself as "gothic," or "goth," which emerged from the punk culture of the late 1970's and early 1980's. Punks had distinctive clothing, dyed hair, and pierced body parts, and their general attitude was one of anger. In the mid-1980's, Anthony H. Wilson, who managed the rock band Joy Division, tagged the band "gothic," and the term was appropriated to summarize a lifestyle. Goths dressed in black, often dyeing their hair to match, and they shared a dark and brooding mind-set associated with gothic novels. Young people often became goths to express alienation from traditional society, and while the goth subculture emphasized mysticism and the dark side of life, it also embraced tolerance, freethinking, and mixing traditional gender roles.

Women's Fashions One of the most striking women's fashions was shoulder pads, which were ubiquitous in the 1980's and remained popular in the first few years of the 1990's. The style's popularity was influenced by a prime-time soap opera, Dynasty, whose star, Linda Evans, had naturally broad shoulders enhanced with pads. Suits with shoulder pads soon came to be seen as a way to "power dress" in the workplace. This was a subtle way to indicate women were the equals of men in the office, and many women's outfits had Velcro stitched to the inside shoulder to allow the attachment of differently sized pads. Late in the decade, a new trend toward wearing casual clothes in the workplace began, with "casual Fridays" adopted widely among companies. Lingerie worn as outerwear was another fashion trend in the 1980's, when, by the middle of the decade, it was not unusual to see slip or bra straps deliberately showing. Madonna's fashion choices were influential here as well. Lacy camisoles worn under jackets or sweaters became popular, as did camisoles with built-in bras. Slip dresses and skirts, made of shiny or flimsy fabrics and trimmed with lace, were featured in fashion magazines. Women began to prefer satin for lingerie, and the Jockey Company, known as a manufacturer of men's underwear, started to produce French cuts trimmed with lace for more conservative women. The teddy, a one-piece camisole attached to tap pants, was a stylish undergarment. Historicism, or looking to the past for new styles, played a significant role in the world of fashion in the 1980's. John Galliano, an influential designer, created Empire dresses in 1986 as a distinct nod to
another era. Karl Lagerfeld, a Paris designer, used a variety of historical sources for his 1980's designs and was especially influenced by clothes portrayed in the paintings of Antoine Watteau. Norma Kamali, once considered a revolutionary designer, changed her style from modernism in the 1970's to historicism in the 1980's, an appropriate reflection of the past in the context of contemporary design. President Ronald Reagan, who was elected in 1980, and his wife, Nancy, were important fashion influences. They wore tasteful, expensive clothes and held lavish parties at the White House, making it fashionable to be elegant again. Ball gowns and sophisticated styles for women came into fashion, including stylish clothes that could be worn to work. Brand names became increasingly important. Middle-class people could afford to buy clothes by high-fashion designers such as Ralph Lauren and Calvin Klein, who obliged their clientele by offering lines of casual clothes. The "preppy" look was also inspired by the Reagans. The word "preppy" had been around for a long time and was used to describe a person who had attended a private preparatory school. The style and manner associated with the wealthy, old-money students who attended such schools were both common and recognizable enough by 1980 to inspire a best-selling satire. The Official Preppy Handbook (1980), edited by Lisa Birnbach, purported to explain the lifestyles and illustrate the fashions of preppies, including their polo shirts, white bucks, A-line skirts, and cardigans. As the decade progressed, these styles entered the American mainstream: Izod polo shirts, sporting a distinctive alligator, were one of the most popular clothing items of the decade. Although muted colors were fashionable for most of the decade, bright red, called "Reagan red," made a splash, and exercise clothes, which boomed during the 1980's, were popular in bright, primary colors.

Youth fashions were among the most distinctive of the 1980's. (Hulton Archive/Getty Images)
Men’s Fashions Pinstriped business suits for men came back in fashion for the first time since the 1930’s and 1940’s, although the pinstripes of the 1980’s were narrower than those of previous decades. Jacket lapels and neckties were also narrower, with the total effect being a slimmer, more tailored silhouette. Button-down collars made a return, a notable example of the preppy style. Although pastel colors dominated fashion in the early 1980’s, by 1984 there was a return to conservative colors in men’s clothes. The exception was Hawaiian shirts with colorful flower prints, which became fashionable for men as the decade wore on. There were other influences, however, and the popularity of the Miami Vice television show encouraged young men to wear T-shirts under expensive suit jackets. Athletic shoes became acceptable for casual dressing, with Air Jordan basketball shoes making their debut in 1985. Although athletic shoes had been worn casually before, they became high-priced, high-fashion items in the 1980’s. Other manufacturers introduced expensive athletic shoes, and Adidas sneakers soared in popularity among young men. Nike also had a hefty share of the market, driven by
their Air Max brand. Leather high-tops became fashionable as well. Trendy underwear was worn by men as well as women. With baseball star Jim Palmer as the new Jockey promoter, new focus was directed at skimpy bikinis and bold prints, which were modeled by star athletes in magazine ads. Oakland Raiders defensive lineman Howie Long appeared in ads for Hanes bikinis and colorful briefs, creating a demand for those items. Colored and patterned bikinis, or low-rises, which ensured a trim pants silhouette, became wildly popular among men of all ages. Another trend in the 1980's was the outdoor look, an ensemble that featured huge hiking boots, jeans, and flannel shirts for city wear. Leather jackets, popularized by Michael Jackson, became trendy, especially if they were oversized and worn with a slouch. Late in the 1980's, brown leather aviator jackets, styled after World War II fighter pilot jackets, made a comeback.

Impact While some fashion trends of the 1980's, such as leg warmers, faded quickly, others became part of the mainstream and lasted. Although gay men have often been thought of as trendsetters in the fashion world, in the 1980's, elements of gay fashion exploded into the mainstream. Tattooing and piercing entered the mainstream as well. Women in the workplace attempted to demonstrate their professionalism by adopting power suits, but soon casual clothes became more common in the office. Fashion models started to become celebrities in the 1980's, and the most famous of them were termed "supermodels." Individual models became associated with the brands of makeup or perfume that they represented, and they enjoyed wide public recognition. Moreover, as the country recovered from recession and the energy crisis, glamour regained its fashionable status in the 1980's.

Further Reading
Acker, Kerry. Everything You Need to Know About the Goth Scene. New York: Rosen Publishing Group, 2000. An analysis of the phenomenon of the goth style and attitude that was so popular during the 1980's. Austin, Stephanie. The Preppy Problem. New York: Fawcett, 1984. Explores the pros and cons of the new popularity of preppy fashions. Breward, Christopher. Fashion. New York: Oxford University Press, 2003. Examines the relationship between the important fashion magazines, such
as Vogue and Elle, and the high fashion designers. Martin, Richard, and Harold Koda. The Historical Mode: Fashion and Art in the 1980’s. New York: Rizzoli, 1989. A beautifully illustrated book that traces the roots of 1980’s fashion to their historical sources. Sheila Golburgh Johnson See also
Androgyny; Business and the economy in the United States; Designing Women; Fads; Flashdance; Hip-hop and rap; Jackson, Michael; Madonna; Miami Vice; Power dressing; Preppies; Reagan, Nancy; Reagan, Ronald; Valley girls; Women in the workforce.
■ Fast Times at Ridgemont High
Definition Teen comedy film
Director Amy Heckerling (b. 1954)
Author Book and screenplay by Cameron Crowe (b. 1957)
Date Released August 13, 1982
Fast Times at Ridgemont High was one of the best 1980’s films documenting the realities of high school from a teenage perspective. It included frank but humorous discussions of sex, drug use, family relationships, work, and the difficult transitions to adulthood. A number of budding stars and future Academy Award winners appeared in the film as struggling teens. Directed by Amy Heckerling and based on the undercover experiences of Cameron Crowe in a Southern California high school, Fast Times at Ridgemont High (1982) broke new ground in the mainstream portrayal of youth culture in the 1980’s. Crowe adapted his best-selling book into a screenplay that followed the lives of about a dozen characters through a year at Ridgemont High School. Though the film is set in Southern California, its larger theme of loss of innocence and becoming an adult had much wider appeal. The experiences of Brad Hamilton (played by Judge Reinhold), his sister Stacy Hamilton (Jennifer Jason Leigh), and their mutual friend Linda Barrett (Phoebe Cates) form the emotional center of the film. Brad begins his senior year with a bright outlook, but his fortunes quickly change. By the end of his senior year, he has lost his girlfriend and gone through a string of fast-food jobs. His sister, Stacy, is
the film’s transitional character. The freshman has lost her virginity, maintains a part-time job at the mall, and is close friends with coworker and selfproclaimed sexual expert Linda. Stacy’s attempt to have fun and be promiscuous brings difficult consequences. After being courted by the shy Mark Ratner (Brian Backer), Stacy becomes infatuated with ticketscalper Mike Damone (Robert Romanus). After an awkward fling, Stacy becomes pregnant, and with no support from Mike, she decides to have an abortion. At the same time, Linda becomes engaged to her college boyfriend and questions her own life choices. The film’s other characters are by turns equally dramatic, engrossing, and humorous. Sean Penn’s performance as surfer Jeff Spicoli set the standard for depicting California stoner culture. Forest Whitaker played Charles Jefferson, the school’s African American football star who allegedly “only flew in for games.” Notable tertiary roles include Anthony Edwards and Eric Stoltz as Spicoli’s stoner friends and the seventeen-year-old Nicholas Coppola (later Nicolas Cage) as Brad Hamilton’s friend. A short-lived television show followed, but it could not equal the power of the film. Impact With a careful balance of humor and reality, Fast Times at Ridgemont High examined the attitudes of 1980’s adolescents. The film introduced middle America to West Coast culture, while revealing the national youth trends of socialization at the shopping mall, promiscuous sex, drug use, and a relaxed attitude toward education. More sophisticated than many of the decade’s teen films, Fast Times at Ridgemont High was still entertaining; it was a modest success at the box office, and it put both Heckerling and Crowe on the map as talents to watch. Further Reading
Crowe, Cameron. Fast Times at Ridgemont High. New York: Simon and Schuster, 1981. Kepnes, Caroline. “Higher Learning.” Entertainment Weekly, nos. 612/613 (September 7, 2001): 168. Maslin, Janet. “The Screen: ‘Ridgemont High.’” The New York Times, September 3, 1982, p. C6. Aaron D. Purcell See also
Abortion; Education in the United States; Film in the United States; Generation X; Teen films; Valley girls.
■ Fatal Attraction
Identification American film
Director Adrian Lyne (b. 1941)
Date Released September 18, 1987
Fatal Attraction was the second-highest-grossing film released in 1987. A psychological thriller about the consequences of infidelity, the film led to much discussion both of fidelity in marriage and of the representation of femininity in motion pictures.

Fatal Attraction begins with a married man (Dan, portrayed by Michael Douglas) having a brief fling with a business associate (Alex, played by Glenn Close) while his wife Beth (Anne Archer) is out of town. The affair turns dangerous when Alex both wants to continue the relationship and proves to be mentally unstable. Alex exhibits symptoms of several psychological disorders, including impulsive actions, suicidal tendencies, and extreme mood swings between intense anger and adoration. While Dan thinks the relationship is over, Alex begins to stalk him; she comes to his office but is rebuffed. Alex then begins calling Dan's office, and when Dan stops taking her calls there, she begins calling his home. In one of the film's most memorable moments, Alex kills the pet rabbit belonging to Dan's daughter and leaves it boiling in a pot on the kitchen stove for the family to find. She kidnaps the daughter from school but returns her home after Beth is injured in a car wreck. Eventually, the scorned woman decides to eliminate her competition—the wife. Alex sneaks into the house and tries to kill Beth with a knife. Dan hears the screaming and comes to his wife's aid. Dan fights Alex in the bathtub until she seemingly drowns. As Dan and the audience relax after the struggle, Alex emerges from the tub, still swinging the knife. Beth had gone to get a gun and arrives just in time to save her husband by shooting Alex. This time, Alex stays dead. The film received six Academy Award nominations: Best Actress (Close), Best Supporting Actress (Archer), Best Director, Best Film Editing, Best Adapted Screenplay, and Best Picture. Cultural discussions of the film referred often to its gender politics. Fatal Attraction is a classic example of a film that turns a potential villain (the cheating husband) into a victim. Moreover, the representation of out-of-control female desire clearly exploited cultural fears of aggressive and sexually frank women. This was
made most clear by the decision to change the ending of the film. In the original ending, Alex commits suicide in such a way as to make it look as though Dan killed her, and Dan is arrested for her murder. Test audiences reacted negatively to an ending in which the male protagonist was punished too severely and the female antagonist seemed to succeed, so the ending was reshot.
Impact Fatal Attraction triggered cultural conversations on a variety of levels. Some discussions questioned whether Dan or Alex was the true victim, or whether it was actually the innocent wife who had been forced to kill to save her husband. Many men saw the film as a cautionary tale about the dangers of infidelity. Students of culture saw it as indicative of the way Hollywood portrayed female power and sexuality in the 1980's. So powerful was that portrayal that Glenn Close became typecast for years. Just the thought of her role could make some men cringe in horror.
Further Reading
Dougan, Andy. Michael Douglas: Out of the Shadows. London: Robson, 2001. Parker, John. Michael Douglas: Acting on Instinct. London: Headline, 1994. Dale L. Flesher See also
Academy Awards; Action films; Close, Glenn; Douglas, Michael; Feminism; Film in the United States; Horror films; Marriage and divorce.
■ Fax machines
Identification Machines that transmit and receive facsimiles of written documents over telephone lines
The development of fax machines during the 1980’s provided an inexpensive, fast, and reliable means for electronically transmitting correspondence, contracts, handwritten notes, and illustrations. The original concept for a facsimile (fax) machine was patented by Alexander Bain in 1843. The first commercial fax system was produced in France in 1865. It was too slow to be of any practical use. Fax machines did not begin to gain practical acceptance and popularity until the 1970’s; even then, they were at first prohibitively expensive. The price
The Eighties in America
of fax machines started to decrease in the late 1970’s and the 1980’s. The stimulus that initiated the widespread use of fax machines came in 1983, when a standard protocol for sending faxes at rates of 9,600 bits per second (bps) was adopted. This became known as the Group 3 standard. In 1985, GammaLink produced the first computer fax board. As machines became faster and cheaper, fax sales took off in the 1980’s. The more were sold, the more useful they became, since it made sense to purchase a fax machine only if one’s colleagues also possessed them. In 1983, over 100,000 machines were sold. That number was doubled in 1986. Canon introduced the first plain-paper fax machine in 1987. By 1989, over 4 million fax machines were in use in the United States. During the 1980’s, fax machines became an integral part of telecommunications around the world. News services used them to send news articles and photos to news offices and television companies. Using fax machines, weather services sent weather charts, maps, and information to weather stations and television companies worldwide. Banks and financial institutions used them to send important personal information and legal documentation. Many businesses used faxes to share records and databases. Impact By the late 1980’s, fax machines had dramatically changed how communication occurred around the world. Combining the functions of a digital scanner, a modem, and a printer, the machines could copy, transmit, and reproduce handwritten or printed materials, drawings, maps, and photographs with a high degree of resolution. Images could be sent almost anywhere in the world at any time. The evolution of fax machines led to a wide variety of brands and styles and a wide range of capabilities. Consumers could eventually buy “all-in-one” printers that included a fax machine, a photocopier, a scanner, and a printer, all in one system. Adoption of the Group 3 standard of fax transmission in 1983 eventually led to Group 4 fax machines. These faxes worked with Integrated Services Digital Network (ISDN) lines and could scan with a resolution of 400 dots per inch (dpi), enabling the copying and transmission of engineering drawings. The development of fax machine technology also played an important role in the production of cellular phones that began in the 1980’s.
The Eighties in America Further Reading
Ceccarelli, Marco, ed. International Symposium on History of Machines and Mechanisms Proceedings. New York: Springer, 2000. Fishman, Daniel. The Book of FAX: An Impartial Guide to Buying and Using Facsimile Machines. Chapel Hill, N.C.: Ventana Press, 1988. Margolis, Andrew. The Fax Modem Sourcebook. New York: John Wiley & Sons, 1995. Alvin K. Benson See also
Cell phones; Computers; Globalization; Inventions; Science and technology.
■ Feminism
Definition Collection of theories and social movements whose common goal is to empower women politically, socially, and economically
Mainstream American feminism in the 1980’s continued to focus on the notion of equality between men and women, while many academic feminists argued that striving for equality effaced important differences between the sexes and prevented feminine values from being respected. Meanwhile, a growing backlash developed against feminism, as many women disavowed the label “feminist,” and the movements witnessed attacks on reproductive rights, increasing sexual violence, and continuing economic disparities between men and women. The defeat of the Equal Rights Amendment (ERA) to the U.S. Constitution was a major blow to feminism and women’s rights in the United States. The ERA stated: Equality of rights under the law shall not be denied or abridged by the United States or by any state on account of sex. The Congress shall have the power to enforce, by appropriate legislation, the provisions of this article. This amendment shall take effect two years after the date of ratification.
Passed by Congress in 1972, the ERA was sent to the states for ratification. Unlike other constitutional amendments, the ERA was given a time limit within which it had to be ratified—seven years. By the end of the seven years, only thirty-five of the required thirty-eight states had ratified the amend-
Feminism
■
363
ment. Faced with growing opposition from conservative groups, the initial move to pass the amendment slowed and five states even voted to rescind their ratifications. By the 1980’s, support for and opposition to the amendment was largely split along party lines, with Democrats in favor of its adoption and Republicans against. With the election of Ronald Reagan to the presidency, the conservative swing in the country eliminated any immediate hopes of passing the ERA. New Strategies Faced with the defeat of the ERA and a backlash against feminism, feminist advocates and activists made major changes in their organizations and strategies. For many, the 1980’s became a decade of “defensive consolidation,” as feminists found that they had to defend gains they thought were already won. A growing anti-abortion movement picketing women’s health care clinics and harassing doctors who worked in these clinics forced supporters of legal abortion to respond with organized rallies and to act as escorts for women using the clinics. Supporters also went to court at the local and federal levels to defend legal abortion. A building consensus over protecting women against violence was under attack. The trivialization of sexual harassment and even rape caused feminists to organize and work to change codes of conduct on college campuses. In 1980, the Equal Employment Opportunity Commission (EEOC) ruled that sexual harassment was a violation of the Civil Rights Act. The Supreme Court affirmed this interpretation of the law in 1986. Feminists also began to focus more on domestic violence. The National Family Violence Survey of 1985 found that over 16 percent of all couples experienced at least one act of violence each year. One television docudrama, The Burning Bed (1984), raised the national awareness of domestic violence and increased support for feminists seeking to strengthen laws relating to domestic violence against women. Economic and Poltical Issues Although women theoretically made economic gains throughout the 1970’s because of the Civil Rights Act, the EEOC, and affirmative action policies, the Justice Department did little during the 1980’s to enforce those policies. It became harder for women (and for minorities) to bring individual or class-action suits against employers for discrimination. Individual women continued to break ground: Sandra Day
364
■
The Eighties in America
Feminism
O’Connor became the first female Supreme Court justice and Geraldine Ferraro was the first woman nominated as a vice presidential candidate by a major political party. However, large numbers of women were still segregated to “pink-collar” jobs in the service sector of the economy, jobs that were often lowpaying and without benefits. This unequal division of labor served to increase the gendered wage gap, but even within the professions, women continued to earn less than men while doing the same work with the same education and years of experience. Feminists focused on changing employers’ hiring and promoting policies, arguing that gender equity made for good business. Feminist Theories Feminism is not a monolithic movement or ideology. Because of the defeat of the ERA, the backlash of the Reagan years, and internal issues in the movement, uncertainty existed as to which of a variety of feminisms entailed the most useful core values and strategies. One dominant feminist theory that formed the basis for much of the U.S. women’s movement in the 1960’s and 1970’s was liberal feminism, which resisted women’s inequality, locating it specifically in the division of labor into women’s work and men’s work in a world that still saw women as primary caregivers and homemakers. Another major strand of feminism rose to prominence in the 1970’s and greatly influenced the academic discipline of women’s studies in the 1980’s. This strand focused on issues of representation, saw fundamental differences between men and women in Western societies, and argued that patriarchal cultures improperly validated masculine values and denigrated feminine values. The academic feminist study of representation and difference was further subdivided into two camps. One camp (often called biological determinism, or essentialism) believed that some values are inherently masculine or inherently feminine. The other camp believed that there is no inherent biological connection between sex, gender, and values, but that each society creates such connections, making “masculinity” and “femininity” themselves historically and culturally specific constructions. Those feminists who focused on the representation of gender were sometimes suspicious of the call for “equality,” because they worried that “equality” meant the acceptance of male values and the opportunity for women to become just like men, rather than valuing
their own difference. During the 1980’s, however, this criticism failed seriously to shape debates outside academia. In the political arena, feminists continued to resist inequality. Since gender inequality persisted despite changes in the law, feminists began to look for other reasons for its continued existence. Socialist feminists saw a need for changes in the workplace itself, including pay equity for those with “pink-collar” jobs, salaries and benefits for homemakers, better and more affordable day care, and equal access to all government positions. Very few of these demands were met. Some radical feminists cited the increase of violence against women and children as a cause of continued inequality. They emphasized that women continued to be exploited and objectified in pornography and prostitution, as well as experiencing sexual harassment at work. Other feminists focused on the multiple oppressions of women of color and immigrant women, who were exploited not just because of their gender but also because of their race and status as immigrants. Although the 1980’s was a decade of backlash and roadblocks for feminism, then, feminists continued to work for and theorize about the transformation of society into a more genderequitable one. Further Reading
Faludi, Susan. Backlash: The Undeclared War Against American Women. New York: Crown, 1991. Close examination of the challenges women faced in the 1980’s.
Ferree, Myra Marx, and Beth B. Hess. Controversy and Coalition: The New Feminist Movement Across Four Decades of Change. New York: Routledge, 2000. A definitive account of the women’s movement over a forty-year period, with a critical analysis of the 1980’s.
French, Marilyn. The War Against Women. New York: Summit Books, 1992. Contextualizes American women’s inequality within a global perspective on the 1980’s.
Lorber, Judith. Gender Inequality: Feminist Theories and Politics. Los Angeles: Roxbury, 2005. Using selections from original sources, Lorber explains the full range of feminist theories and their strategies for change.
Mansbridge, Jane J. Why We Lost the ERA. Chicago: University of Chicago Press, 1986. Analyzes the development and defeat of the ERA.
Watkins, Susan Alice, Marisa Rueda, and Marta Rodriguez. Introducing Feminism. Cambridge, England: Icon Books, 1999. An accessible introductory text that covers the development of the movement, looking at feminism’s challenges and achievements.
Susan A. Farrell
See also
Abortion; Business and the economy in the United States; Ferraro, Geraldine; O’Connor, Sandra Day; Reagan, Ronald; Supreme Court decisions; Women’s rights.
■ Ferraro, Geraldine
Identification U.S. representative and the first female major-party nominee for vice president of the United States
Born August 26, 1935; Newburgh, New York
Through her representation of her constituents and party, Ferraro advanced the cause of women hoping to hold high political office in the United States.
The daughter of an Italian immigrant, Geraldine Ferraro worked her way through college and law school. After a brief legal career, she was elected to the House of Representatives in 1978. Ferraro served three two-year terms there before Democrat Walter Mondale selected her to be his vice presidential running mate in 1984. Mondale had served as vice president under President Jimmy Carter before he and Carter were defeated in their bid for reelection by Ronald Reagan and George H. W. Bush in 1980. Running against a reasonably popular incumbent seeking a second term, Mondale was a slight underdog and sought a dramatic gesture that might enable him to overtake the Reagan-Bush team. He decided to use his choice of running mate as a statement about the inclusiveness and progressiveness of the Democratic Party, and he sought a woman or racial minority to complete his ticket. After interviewing several candidates, Mondale settled on Ferraro, hoping that the excitement over the historic nomination of a woman to be vice president would help him win votes.

Geraldine Ferraro discusses the finances of her husband, John Zaccaro, during a news conference on August 21, 1984. Questions about Zaccaro’s financial dealings weakened Ferraro’s vice presidential candidacy. (AP/Wide World Photos)

Ferraro’s nomination was initially met with favorable publicity, but problems soon developed. Ferraro was both pro-choice and a Roman Catholic. Her decision to protect a woman’s right to terminate her pregnancy thus put her at odds with her faith, a fact discussed in the media. Immediately after her nomination, moreover, Ferraro promised that both she and her husband, John Zaccaro, would release their tax returns. A month later, she announced that she would release hers, but Zaccaro could not release his for business reasons. This provoked a media storm, forcing Ferraro to seek a compromise: She announced that Zaccaro would release a financial statement instead of a tax return. Initially, he refused to release even this statement; this long, drawn-out incident in the middle of the campaign seriously compromised Ferraro’s credibility.

Ferraro comported herself well in her vice presidential debate against George H. W. Bush, even scoring high marks when she caught Bush being patronizing. Still, if her candidacy was calculated to convince more women to vote for the Democratic ticket, it failed to do so: Post-election exit polls indicated that a majority of female voters had voted for the opposition. Women did later begin to vote for Democrats to a greater degree than for Republicans, however, creating the so-called gender gap.

Impact
Although Geraldine Ferraro was not elected, her candidacy preceded—and probably helped lead to—an explosion in the number of women who ran for and won political office in the United States. Beyond the political arena, the sight of a woman in a position of power on the national stage advanced the cause of feminism in the American workplace and in American culture generally. Further Reading
Breslin, Rosemary. Gerry! A Woman Making History. New York: Pinnacle, 1984.
Drew, Elizabeth. Campaign Journal. New York: Macmillan, 1985.
Ferraro, Geraldine. Ferraro: My Story. New York: Bantam Books, 1985.
_______. Framing a Life: A Family Memoir. New York: Scribner, 1998.
Witt, Linda, Karen M. Paget, and Glenna Matthews. Running as a Woman: Gender and Power in American Politics. New York: Free Press, 1993.
Richard L. Wilson
See also Elections in the United States, 1984; Feminism; Liberalism in U.S. politics; Mondale, Walter; Women’s rights.
■ Fetal medicine Definition
Maintenance of health and detection and treatment of diseases in unborn children
Fetal medicine gained prominence in the 1980’s as a medical specialty. Accurate and detailed assessment of the health of unborn children was made possible through the use of high-resolution ultrasound. Malformations, illnesses, and poor fetal growth became diagnosable and even treatable before birth.

In the early 1980’s, the introduction of biophysical profile (BPP) scoring greatly facilitated the assessment of fetal health. The BPP assessed fetal movement, tone, breathing, heart rate accelerations, and amniotic fluid volume. A high BPP score indicated a healthy fetus, while a low score reflected a fetus in trouble. Early recognition of a harmful uterine environment allowed timely delivery of the fetus and circumvented complications such as stillbirth or brain damage due to lack of oxygen.

Diagnostic Technologies
Advances in fetal medicine also made it possible to detect genetic disorders in an unborn child by screening an expectant mother’s blood for specific biochemical markers. The first routinely used marker was alpha-fetoprotein (αFP). Elevated αFP levels required detailed evaluation of the fetus for malformations such as spina bifida and anencephaly. Tests for other biochemical markers—human chorionic gonadotropin (hCG) and unconjugated estriol (uE3)—arrived in the latter part of the decade. Testing the mother’s blood for these three markers helped identify approximately 60 percent of fetuses with Down syndrome.

Improved ultrasound imaging techniques made the fetus itself directly accessible for diagnostic testing. Under ultrasound visualization, a needle could be guided through the mother’s abdomen into an umbilical cord vessel and fetal blood could then be drawn for laboratory examination. This procedure, known as percutaneous umbilical blood sampling (PUBS) or cordocentesis, was also used to give the unborn child medications or blood products. A mother whose blood is Rhesus negative may form antibodies against the red blood cells of her baby if it is Rhesus positive. These maternal antibodies destroy the fetus’s red blood cells and cause severe, life-threatening anemia. With the help of PUBS, these fetuses could be transfused during pregnancy until it became safe to deliver them.
Fetal Surgery The ability to visualize an unborn child in great detail allowed early discovery of organ malformations that caused death or were associated with poor long-term neurodevelopment. The thought arose that correction of these anomalies before birth might increase the child’s chances of survival or improve its neurologic outcome. Normal growth of a fetus’s lungs requires the presence of amniotic fluid. Fetal urine is an important component of amniotic fluid, and fetuses with no kidneys or poorly functioning kidneys produce little to no urine and die rapidly after delivery. Some fetuses have an obstruction in their urinary tract that does not allow urine to flow freely out of the bladder into the amniotic cavity. In 1981, Michael Harrison and his colleagues performed the first fetal surgery in the United States on a fetus with urinary tract obstruction, facilitating the flow of fetal urine from the bladder into the amniotic sac.

Ultrasound imaging also facilitated antenatal diagnosis of hydrocephalus, a buildup of cerebrospinal fluid (CSF) within the skull that can lead to mental retardation. In the second fetal surgery conducted in the United States, William Clewell and his colleagues placed a drain in a fetus with hydrocephalus, thus allowing the CSF to empty into the amniotic cavity.

Congenital diaphragmatic hernia (CDH) represents a defect in a fetus’s diaphragm that leads to entry of abdominal contents such as intestine, spleen, or liver into the chest cavity. This malformation impairs good lung growth and is associated with a high mortality rate. The first repair of CDH in a fetus occurred in 1983. Other fetal surgeries performed in the 1980’s included removal of large spinal tumors (sacrococcygeal teratomas) and resection of large lung masses (congenital cystic adenomatoid malformation).

Identical twin pregnancies can be complicated by twin-twin transfusion syndrome (TTTS). In such cases, one twin will be very anemic while the other twin is overwhelmed by excessive blood flow. Such a condition can lead to the death of both fetuses. The cause of TTTS appears to be an imbalance of blood flow due to communicating blood vessels in the placenta. In 1988, Julian De Lia, with the help of laser therapy, was able to interrupt these vascular communications to treat TTTS effectively.

Impact During the 1980’s, the improved ability to detect a fetus in distress led to a dramatic decline in perinatal mortality. Routine screening of multiple biochemical markers allowed early identification of genetic disorders and brought issues such as termination versus continuation of pregnancy to the forefront of public consciousness. The ability to transfuse a fetus greatly improved the survival of Rhesus-sensitized babies. Even though the media enthusiastically reported stories about “miracle” babies, however, only about 35 percent of fetuses actually survived surgery, and little is known about the quality of life of those that lived. Moreover, although medicine’s ability to provide health care for the unborn seemed almost limitless, adequate prenatal care was not universal, and fetal intervention was limited to a select few. These rapid advances in fetal medicine allowed unborn children to acquire the status of patients in themselves.

Further Reading
Brunner, Joseph. “In Their Footsteps: A Brief History of Maternal-Fetal Surgery.” Clinics in Perinatology 30, no. 3 (September, 2003): 439-447. Introduces four pioneers of fetal surgery and their accomplishments.
Casper, Monica. “Fetal Matters.” In The Making of the Unborn Patient. New Brunswick, N.J.: Rutgers University Press, 1998. Critical analysis of fetal surgery; discusses low success rates, risks to mothers, and ethical issues.
Manning, F. A. “Reflections on Future Directions of Perinatal Medicine.” Seminars in Perinatology 13, no. 4 (August, 1989): 342-351. Reports the impact high-resolution ultrasound had on the rapid development of maternal-fetal medicine.
Scioscia, Angela. “Prenatal Genetic Diagnosis.” In Maternal-Fetal Medicine, edited by Robert Creasy and Robert Resnik. Philadelphia: W. B. Saunders, 1999. Description of biochemical markers used to detect fetal anomalies.
Elisabeth Faase
See also Abortion; Baby Fae heart transplantation; Genetics research; Health care in Canada; Health care in the United States; Health maintenance organizations (HMOs); Medicine; Transplantation; Women’s rights.
■ Film in Canada Definition
Motion pictures produced by Canadians and distributed in Canada
After the system of tax breaks for companies making films in Canada underwent changes during the early 1980’s, the Canadian film industry suffered a brief slump, and some promising directors left Canada for Hollywood. The industry made a comeback in the latter half of the decade, however: Several major independent and art-house directors filmed in Canada, and some American studio productions shot there as well in order to save money.

Beginning in the mid-1970’s, Canada offered lucrative tax breaks to production companies filming in Canada, allowing people who invested in Canadian feature films to deduct 100 percent of their investment. Such films were defined as motion pictures with running times of more than seventy-five minutes whose producer and two-thirds of whose “creative personnel” were Canadian. As a result of the tax law, many films of varying quality were made in Canada.
Seeking an Authentic Canadian Cinema Film critic Ted Magder describes the films produced during this era as belonging to one of two types—films geared toward an American audience, such as Porky’s (1982), and films specifically about Canada, such as The Grey Fox (filmed 1980, released 1983). The most famous of these films from the 1980’s were The Changeling (1980), Prom Night (1980), and Porky’s— the latter of which was the highest-grossing film in Canadian history, taking in $152 million worldwide. These films were not popular with Canadian critics, however, who pointed out that many of them were not Canadian in content. They may have been shot in Canada, but they were set in the United States. Indeed, Porky’s was not even shot in Canada: It was made in Florida by a Canadian crew. On the other hand, films appreciated by the critics for their Canadian content, such as The Grey Fox, did not make money.

Early in the 1980’s, the Canadian Film and Video Certification Office changed the definition of “Canadian film” for the purposes of the tax code. The office introduced a point system designed to narrow the previous definition and ensure that more Canadian performers and screenwriters would participate in Canadian films. The immediate result of the new restrictions was a decline in the production of original Canadian films. Several Canadian directors, such as The Grey Fox’s Philip Borsos, left Canada to make films in Hollywood. In the wake of this slump, Magder notes, the Canadian film industry was faced with two serious challenges: to maintain “cultural sovereignty,” creating authentically Canadian films despite the cultural dominance of the United States, and to address a financial disparity that existed within Canadian productions. Most of the profits reaped by Canadian films went to American companies, making it even more difficult to cultivate an authentically Canadian film industry. The Canadian establishment, moreover, became embarrassed by the content of many of the films made in Canada to capitalize on the tax breaks. Nor did Canadian filmmakers experience much financial benefit when compared to the benefits realized by foreign-owned “Canadian” productions. As a result, the government phased out the tax program. By 1987, the deduction for Canadian film investors had decreased to 30 percent of their investment.

Critical Success and “Runaway Shoots”
Canadian film rebounded in the mid- to late 1980’s as a number of films became financially and critically successful internationally. Sandy Wilson’s My American Cousin (1985) was a modest success, and Patricia Rozema’s I’ve Heard the Mermaids Singing (1987) earned raves at the Cannes Film Festival and made $6 million (having been made for only $350,000). In 1987, Denys Arcand’s Le Déclin de l’empire Américain (1986; The Decline of the American Empire) was nominated for the Academy Award for Best Foreign Language Film, as was his Jésus de Montréal (1989; Jesus of Montreal), which also won the Grand Jury Prize at Cannes. Another Québécois film, Jean-Claude Lauzon’s Un Zoo la nuit (1987), was also a success. David Cronenberg, who made several popular Canadian cult films in the 1970’s, as well as 1982’s Videodrome and the more mainstream films The Dead Zone (1983) and The Fly (1986), scored an international critical hit at the end of the decade with Dead Ringers (1988). Atom Egoyan, who would go on to more success in the following decade, had his first critical success in 1989 with Speaking Parts.

While films of true Canadian content were beginning to make their mark internationally, the shadow of American filmmaking remained. Somewhat reminiscent of the tax-break era, many U.S. productions found their way to Canada at the end of the decade. Magder notes that these “runaway shoots” made up about 50 percent of the films and television productions shot in Canada and approximately 95 percent of the ones shot in British Columbia during the late 1980’s, thanks to the lower costs involved in using Canadian crews and the weaker Canadian dollar.

Impact Canadian film in the 1980’s began and ended with the encroachment of the American film industry. Beginning with films such as Porky’s and ending with “runaway shoots” that would continue into the following decades, the Canadian film industry weathered economic challenges and came back strong, producing a number of internationally critically acclaimed films by Arcand, Rozema, Cronenberg, and Egoyan in the latter half of the decade.
Further Reading
Beard, William, and Jerry White, eds. North of Everything: English-Canadian Cinema Since 1980. Edmonton: University of Alberta Press, 2002. Provides critical essays on animation, black Canadian cinema, aboriginal films, and films about AIDS, as well as essays about directors Borsos, Cronenberg, and Rozema.
Leach, Jim. Film in Canada. Oxford, England: Oxford University Press, 2006. Academic articles covering the history of Canada’s film industry and specific filmmakers such as Cronenberg, Arcand, and Egoyan. Covers a wide range of history, but no chapters are specifically devoted to the 1980’s.
Magder, Ted. Canada’s Hollywood: The Canadian State and Feature Films. Toronto: University of Toronto Press, 1993. Provides a thorough understanding of the politics and economics behind the Canadian film industry. Chapters 9 through 12 are specific to issues affecting 1980’s cinema.
Melnyk, George. One Hundred Years of Canadian Cinema. Toronto: University of Toronto Press, 2004. Covers the first one hundred years of Canadian film history. Good chapters on the Québécois film industry and English Canadian filmmakers Cronenberg and Egoyan. Chapter 8, “The Escapist Seventies,” provides good background on the tax-break era.
Julie Elliott
See also Business and the economy in Canada; Canada and the United States; Film in the United States.
■ Film in the United States Identification
Motion pictures produced by Americans and distributed in the United States
Films in the 1980’s tended to aim for mass audience approval. Studios began to reject financial models based on creating a steady stream of medium-budget, modest successes, each of which appealed to a specific market segment. Instead, they strove to create a few expensive blockbusters that would appeal to everyone at once. As a result, the major studios produced formula films that sought not to challenge or provoke, but to entertain. During the 1980’s, the film industry expanded, as new methods of production and exhibition and the
growth of filmmakers’ technological capabilities resulted in Hollywood’s adoption of new economic strategies. It was the domination of President Ronald Reagan’s conservative ideology, however, that most dramatically affected the majority of mainstream motion pictures. Seeking to allay the chaotic mood of the previous decades, most Hollywood films reassured the American public by embracing a return to “traditional” values—meaning those thought to characterize the nation before the 1960’s. Troubling images of Vietnam, Watergate, assassinations of national figures, and the political activism of the 1970’s faded beneath fantastic new technologies that aided the major film studios in targeting moviegoers between the ages of twelve and twenty and beguiling them with futuristic heroic fantasies that seemed to make the past new.

Technological and Industrial Expansion
The development during the 1980’s of the videocassette recorder (VCR) and other methods of content distribution, including direct broadcast satellites and cable television, altered the customary production and exhibition of films. Filmmakers began to abandon the 35mm feature, as they insisted that if widescreen films were doomed to be cut for smaller screens, there was no point in creating larger images. Some producers, however, aimed to create films that would fit a smaller screen when necessary but could still exploit theatrical technologies in their initial release. For example, Top Gun (1986), directed by Tony Scott, managed to fit both large and small screens. It incorporated impressive widescreen cinematography and multitrack Dolby sound that functioned in theaters. At the same time, it was shot to ensure that important narrative information would not be lost when it was displayed on narrower screens, nor would its sound quality be muddied when played through a VCR and television. As a result, Top Gun grossed nearly $180 million. As the decade progressed, the majority of the studios began to sell in advance the rights to distribute their films nontheatrically, on videocassette and cable television, for example. As a result, premium cable stations Home Box Office (HBO) and Showtime began to finance feature films in order to gain exclusive broadcast rights to those films once they left theaters. They also created films and specials designed exclusively for cable television. As VCRs and cable subscriptions became more popular, studios began
to demand more money in return for videocassette and cable distribution rights. Revenue from VCR and cable television distribution of a film frequently exceeded the film’s box-office gross.

Film marketing in the 1980’s achieved new heights through publicity and merchandising of film-related products. Marketing departments sought to create as many “event films” as possible, frequently promoting run-of-the-mill films as spectacular ones. Studios also bought into cinema chains, effectively reestablishing a form of monopoly known as vertical integration that had been common during early classical cinema but was ruled illegal in 1948 by the U.S. Supreme Court. This practice proved profitable to studios in the 1980’s era of big business. Also, with the rise in the number of studios being taken over by conglomerates, financial decisions were no longer made by directors but were instead made by attorneys and consultants whose goals were entirely financial rather than aesthetic. As film budgets skyrocketed, many studio executives built their releases around popular film actors who were thought of as box-office draws. Such actors were secured for a given studio through long-term contracts, and their profit-generating potential was such that they could become “attached” to projects that as yet had no script or director.

Films of the decade often relied on technical innovation in the absence of challenging or thought-provoking narratives. Newly developed technologies were evident in the reemergence of animated features. Earlier cartoons had often been distinguished by low budgets and rough animation. In 1982, however, Don Bluth and his Disney-trained animators made The Secret of NIMH, and in 1986, they completed An American Tail, the first animated film produced by Steven Spielberg. Excellently animated, these films utilized the rotoscoping technique, in which the human characters were shot live and then traced onto animation cels, providing a more human look for those characters.

A New Mythology
Beginning with George Lucas’s Star Wars in 1977 and continuing into the 1980’s, film technology advanced by leaps and bounds. Dolby stereo created revolutionary sound technologies, and Lucas founded Industrial Light and Magic (ILM), a company devoted to providing special visual effects for major motion pictures. Lucas thus offered to other filmmakers the same groundbreaking
technologies he developed to complete his Star Wars trilogy. His films The Empire Strikes Back (1980) and The Return of the Jedi (1983) each advanced filmmaking technology significantly, greatly expanding the medium’s ability to create believable mythical worlds and immerse audiences within them. The trilogy thus laid the foundation for a major impulse of the 1980’s—that of providing a means for American viewers of mainstream films to escape the present and immerse themselves in older traditions of romance and mythology.

In 1981, Lucas produced and co-authored Raiders of the Lost Ark, directed by Spielberg, which grossed $282 million. The film concerned Indiana Jones, an adventurous character created by Lucas whose escapades transported audiences back to the 1930’s and 1940’s Saturday afternoon serials, and engendered two sequels, Indiana Jones and the Temple of Doom (1984) and Indiana Jones and the Last Crusade (1989). Jones’s battles to restrain evil forces became a forerunner for the heroic efforts of John Rambo (played by Sylvester Stallone) in Rambo: First Blood Part II (1985). Rambo’s single-handed rescue of American prisoners in Southeast Asia—despite the military’s and the Central Intelligence Agency’s attempts to undermine him—enacted a fantasy of the heroic American soldier symbolically winning the Vietnam War. In the Rocky series (1976, 1979, 1982, 1985, and 1990), the fierce determination of Rocky Balboa (also played by Stallone) to win against overwhelming odds similarly depicted the blue-collar worker’s victory, thereby validating the American Dream, albeit a version tainted by racism.

Teen Films Adult audiences were impressed by the striking photography and Dolby sound in such well-received films as Amadeus (1984), directed by Milos Forman; The Color Purple (1985), directed by Spielberg; Aliens (1986), directed by James Cameron; The Untouchables (1987), directed by Brian De Palma; and Out of Africa (1985), directed by Sydney Pollack. Many other directors, however, had followed the lead of Lucas and Spielberg in tapping into the lucrative teen market and blending action and adventure with science fiction to achieve box-office successes. Apparently spurred on by the phenomenal success of Spielberg’s E.T.: The Extra-Terrestrial (1982), Ivan Reitman and Joe Dante released Ghostbusters and Gremlins, respectively, in 1984, and both were box-office successes.
Mixing science fiction with nostalgia for happier days, Robert Zemeckis directed Back to the Future (1985)—an instant hit in which Michael J. Fox, as Marty McFly, time-traveled back to the 1950’s. The film was followed by two sequels (1989, 1990) in which McFly traveled to the twenty-first century and the Wild West. Adventure and fantasy films, also popular in the 1980’s, included Dragonslayer (1981), directed by Matthew Robbins; Conan the Barbarian (1982), directed by John Milius; The Sword and the Sorcerer (1982), directed by Albert Pyun; The Dark Crystal (1982), directed by Jim Henson; Ladyhawke (1985), directed by Richard Donner; and Willow (1988), directed by Ron Howard.

Films about the anxieties of teens in high school, quite prevalent in the 1980’s, included Amy Heckerling’s Fast Times at Ridgemont High (1982), Martha Coolidge’s Valley Girl (1983), Garry Marshall’s The Flamingo Kid (1984), and John G. Avildsen’s The Karate Kid (1984). Many of the teen films featured a group of actors known as the Brat Pack, a term applied to an assemblage of actors who appeared in films as teens on the verge of adulthood. The performances of Rob Lowe, Ally Sheedy, Molly Ringwald, Emilio Estevez, Matt Dillon, Tom Cruise, and others were somewhat controversial, in that they were seen by some critics to be speaking to the dilemmas of the nation’s teens. Others, however, condemned them as soon-to-be Yuppies (young upwardly mobile professionals), a derogatory term used to denote the extent to which this socioeconomic group was materialistic, superficial, and self-centered.

John Hughes addressed the class divide among teens in his early films Sixteen Candles (1984) and The Breakfast Club (1985), which featured Brat Pack actors. Hughes also directed Weird Science (1985) and Ferris Bueller’s Day Off (1986) and wrote the screenplays for Howard Deutch’s Pretty in Pink (1986) and Some Kind of Wonderful (1987). Francis Ford Coppola’s 1983 version of S. E. Hinton’s book The Outsiders, featuring Cruise, Dillon, Lowe, and Estevez, was followed by Coppola’s adaptation of another Hinton novel, Rumble Fish (1983), which featured Dillon, Mickey Rourke, and Nicolas Cage. Edward Zwick’s About Last Night (1986), a film adaptation of David Mamet’s Sexual Perversity in Chicago (pr. 1974, pb. 1977), depicted the dating and mating customs of young adults, while Joel Schumacher’s St. Elmo’s Fire (1985) focused on Brat Pack members trying to become successful in the business world. Emilio Estevez extended the Brat Pack formula into a Western, Young Guns (1988), directed by Christopher Cain.

Contrary Visions
As critic Robin Wood has noted, within any predominant ideology, there will appear fractures or countercurrents, even amid an effort to bombard viewers with feel-good films. As many films of the 1980’s strove to paper over the cracks in American society by diverting audience attention to fantasy or otherwise light entertainment, some directors refused to follow suit. One of the most outspoken and controversial directors, Oliver Stone, continued to examine the darker side of the 1960’s in Platoon (1986) and Born on the Fourth of July (1989). With Platoon, Stone, a Vietnam veteran, revived the war film represented in the late 1970’s by Hal Ashby’s Coming Home (1978) and Coppola’s Apocalypse Now (1979). Platoon grossed a healthy $138 million, more than many teen comedies and other escapist films of the decade. Other Vietnam War films followed: Coppola’s Gardens of Stone (1987), John Irvin’s Hamburger Hill (1987), Lionel Chetwynd’s Hanoi Hilton (1987), Stanley Kubrick’s Full Metal Jacket (1987), and Barry Levinson’s Good Morning, Vietnam (1988).

A great many directors other than those responding to Vietnam continued to depict society in less than flattering terms. In Wall Street (1987), Stone indicted 1980’s greed and materialism through the reptilian character of Gordon Gekko, played by Michael Douglas. Martin Scorsese’s Raging Bull (1980), considered by many critics to be the best film of the 1980’s, depicted the film’s protagonist, Jake La Motta, as a man challenged by his own doubts and confusions. Broadcast News (1987), directed by James L. Brooks, presented a criticism of the news media through the handsome, brainless anchor portrayed by William Hurt. John Landis’s Trading Places (1983) suggested that the aggressive, manipulative practices of Eddie Murphy’s and Dan Aykroyd’s destitute characters were an appropriate response to the similar tactics used to cause their destitution. Paul Brickman’s Risky Business (1983), on the other hand, wittily satirized the Reagan era’s celebration of individual enterprise by portraying Tom Cruise’s foray into the business of prostitution as a shining example for upper-class but unremarkable high school students to follow.

Independent Films
Independent films are traditionally viewed as those made without studio support or control. Despite difficulties in raising necessary funds, many directors enjoy the opportunities for experimentation and artistic self-expression. A new era of growth for independent films began in 1980, when Robert Redford established the Sundance Institute; in 1985, he initiated a platform for exhibition of independent films. Insisting that diversity was the basis of independent films, Redford began opening doors to films by women, as well as African Americans and other minorities who had little voice in mainstream cinema. The success of Steven Soderbergh’s sex, lies, and videotape, Nancy Savoca’s bitingly satirical look at an Italian American wedding in True Love, and Michael Lehmann’s dark comedy Heathers—all in 1989—began a trend toward older viewers gravitating to smaller films that continued beyond the decade.

Numerous talented directors prefer to be independents. John Sayles, who began his career with The Return of the Secaucus 7 in 1980 and completed seven other independent films before the end of the decade, remained a first-rate director of films about communities and outsiders. David Lynch, whose eccentric avant-garde film Eraserhead appeared in 1977, went on to release The Elephant Man (1980), Dune (1984), and Blue Velvet (1986). The latter film, whose main character, Jeffrey Beaumont, unexpectedly journeys through the dark underside of American culture, was widely interpreted as a criticism of the false nostalgia being purveyed by many mainstream Hollywood films. John Waters, whose underground, “trashy” films of the 1970’s had a cult following, began to make more mainstream films in the 1980’s, including Polyester (1981) and Hairspray (1988). Jim Jarmusch began his career directing strange but innovative films in 1984 with Stranger Than Paradise.

Independent films provided opportunities for women directors previously denied voices. Susan Seidelman directed four films in the 1980’s: Smithereens (1981), Desperately Seeking Susan (1985), Making Mr. Right (1987), and She-Devil (1989), all stories dealing with personal identity and relationships. Lizzie Borden made two radically feminist films in the 1980’s: Born in Flames (1983), a militant film in which women form an army and attempt to take over the media, and Working Girls (1986), a day in the life of a prostitute working in a Manhattan brothel. Kathryn Bigelow began her genre- and gender-bending films in the 1980’s with The Loveless (1982),
a return to the biker movies of the 1950’s, and Near Dark (1987), a vampire Western of cinematic and technical brilliance.

Neo-Noir During the 1980’s, the genre film noir, having originally defined the post-World War II cinematic landscape as a murky universe plagued with mystery, violence, and betrayal, made its comeback as an expression of the moral confusion of the 1980’s. Although utilizing color, portable cameras, and occasionally striking special effects, neo-noir films—so named for their capacity to reappropriate past noir forms for contemporary purposes—began in 1981 with Lawrence Kasdan’s Body Heat. Despite its reworking of the dark, black-and-white 1940’s noir in color, the film proved no less concerned with exploring the underside of the American Dream. It was essentially a loose remake of Double Indemnity (1944), one of the most influential noirs of the 1940’s, in which a man, seduced by a woman into killing her husband for insurance money, is betrayed by her, leading to the deaths of both. In Body Heat, Matty, a modern femme fatale, seduces a seedy lawyer into killing her wealthy husband; she takes the money, escapes to an exotic land, and leaves him to face life in prison.

While other 1980’s neo-noirs depicted men who were seemingly at the mercy of women—for example, Adrian Lyne’s Fatal Attraction (1987), wherein a married man’s secret one-night stand backfires when the scorned woman terrorizes his family—several films pursued the figure of the successful, independent woman. In Betrayed (1988), Debra Winger plays an agent of the Federal Bureau of Investigation (FBI) infiltrating a white supremacist organization. In Bob Rafelson’s Black Widow (1987), Winger again portrays an FBI agent, pursuing a gorgeous femme fatale (Theresa Russell) who killed her husbands for their money. The film explores the agent’s sexual attraction to the glamorous murderer. In David Mamet’s independent directorial debut, House of Games (1987), a female psychoanalyst who has been fleeced by a male con artist turns the tables on him. One notable variation of this pattern is the figure of the “homme fatal,” appearing in Richard Marquand’s Jagged Edge (1985), in which Glenn Close plays a lawyer hired to defend a charming, wealthy man accused of murdering his wife; she is convinced of his innocence until he attempts to kill her.
A great many neo-noirs were made by independent directors during the 1980’s, because this genre can often be cheaply made. Blood Simple (1984), the first film written, directed, and produced by brothers Joel and Ethan Coen, was made for a little over $1 million. Its complicated plot of grisly murder and betrayal was similar to John Dahl’s first feature, Kill Me Again (1989), a “cowboy noir” that detailed the exploits of a woman who set two men against each other in the Nevada desert. In the diverse body of films that composed neo-noir, viewers could see remakes of 1940’s noirs, such as Rafelson’s The Postman Always Rings Twice (1981), wherein the original Lana Turner and John Garfield characters are played by Jessica Lange and Jack Nicholson. They could also encounter generic twists and hybrids, such as Ridley Scott’s science-fiction noir Blade Runner (1982), or parodies, such as Carl Reiner’s Dead Men Don’t Wear Plaid (1982).

Horror Films In the years that followed John Carpenter’s phenomenally successful Halloween (1978), “slasher” films flooded the market. Sean Cunningham’s Friday the Thirteenth (1980) created an often-imitated slasher formula in which teenagers are serially murdered in a superabundance of unwarranted sex and violence. The tendency of such films to follow gratuitous nudity with gratuitous gore spurred much speculation by cultural critics regarding both the ideology and the practical consequences of these films. Psychotic killers—who had emerged in Alfred Hitchcock’s Psycho (1960) and Tobe Hooper’s The Texas Chainsaw Massacre (1974)—had by 1981 proliferated into a huge number of slashers that composed nearly 60 percent of all films released in the United States. Sam Raimi’s The Evil Dead (1983) spawned two sequels, Hooper’s Poltergeist (1982) was followed by two sequels and a television program, and Wes Craven’s Nightmare on Elm Street (1984) gave rise to ten sequels.

Horror features appeared regularly throughout the 1980’s with an alarming violence quotient. Mainstream horror films included Tony Scott’s The Hunger (1983), Tom Holland’s Fright Night (1985), Craven’s The Serpent and the Rainbow (1987), and Mary Lambert’s Pet Sematary (1989). One of the most popular and influential horror films of the decade was Kubrick’s 1980 film version of Stephen King’s novel The Shining (1977). Other popular horror thrillers were De Palma’s Body Double (1984),
a pornographically reworked combination of Alfred Hitchcock’s Rear Window (1954) and Vertigo (1958); Jagged Edge; and John Schlesinger’s The Believers (1987).

Perhaps in response to the rise of horror films, horror spoofs, as well as genuine horror films that nonetheless embraced black comedy, also became popular. Films like Motel Hell (1980) appeared to satirize the previous decade’s preoccupation with zombies. Stuart Gordon filmed Re-Animator (1985), based on a classic H. P. Lovecraft horror story, with tongue firmly in cheek, and The Toxic Avenger (1985), a cult midnight movie created by Michael Herz and Lloyd Kaufman, engendered three sequels. However, as box-office receipts began to dwindle, affecting films such as Carpenter’s The Thing (1982), horror films became regular marketable items for home video and cable television. Critics became alarmed by the resultant easy access to gore by children, but the number of horror films produced continued to rise.

Impact During the 1980’s, the resources of the studios began to be consolidated and focused on producing films with extremely large budgets that were designed to reap even larger rewards at the box office. Such films were designed to offer audiences the sorts of experiences they could not obtain at home with small screens and inexpensive sound systems. They often featured reworked and repackaged narratives and conventions from earlier decades, using new technologies to build exciting myths and immersive experiences. Lucas and Spielberg were the masters of this sort of filmmaking, and they became as famous as any of their stars.

At the same time the major studios were consolidating their resources, however, independent productions and small studios sprang up to fulfill the demand for high-quality, niche entertainment that the blockbuster model ignored. A generation of important new American voices developed on the fringes of Hollywood, driving innovation in style and narrative technique, even as the mainstream blockbuster drove innovation in technology and special effects. Eventually, the success of these smaller films led each of the major studios to create divisions devoted to “independent” productions. Some directors worked within this structure, while others insisted on maintaining true independence, valuing their freedom of creative expression more than the studios’ capital.
Further Reading
Belton, John. American Cinema/American Culture. 2d ed. Boston: McGraw-Hill, 2005. An introduction to American cinema and its relationship to American national identity.
Giannetti, Louis, and Scott Eyman. Flashback: A Brief History of Film. 4th ed. Upper Saddle River, N.J.: Prentice Hall, 2001. Contains an excellent chapter on American cinema in the 1980’s, specifically with regard to the influence of President Reagan.
Levy, Emanuel. Cinema of Outsiders: The Rise of American Independent Film. New York: New York University Press, 1999. A thorough and excellent discussion of independent film in the United States.
Mary Hurd
See also Academy Awards; Action films; Back to the Future; Blade Runner; Blue Velvet; Brat Pack in acting; Empire Strikes Back, The; E.T.: The Extra-Terrestrial; Epic films; Fast Times at Ridgemont High; Fatal Attraction; Film in Canada; Ford, Harrison; Full Metal Jacket; Ghostbusters; Heaven’s Gate; Horror films; Hughes, John; Little Mermaid, The; Multiplex theaters; PG-13 rating; Platoon; Raging Bull; Raiders of the Lost Ark; Rambo; Science-fiction films; Scorsese, Martin; Sequels; sex, lies, and videotape; Special effects; Spielberg, Steven; Stone, Oliver; Teen films; Tron; Wall Street.
■ Flag burning
Gregory Lee Johnson, the respondent in the Texas v. Johnson flag-burning case, holds an American flag sent to him by an anonymous supporter. (AP/Wide World Photos)
Definition
Form of protest using symbolic speech by desecrating the national banner
A controversial demonstration outside a political convention in 1984 triggered a five-year legal battle in the courts over whether or not the desecration of the American flag was protected under the First Amendment.

On August 23, 1984, Gregory Lee Johnson, a member of the Revolutionary Communist Party, was arrested for violating a Texas law forbidding the mistreatment of a state or national flag in a manner intended to offend those who might witness the act. The day before, he had set a stolen American flag on fire outside the Republican National Convention in Dallas, as one of approximately one hundred protesters angry over the nomination of Ronald Reagan as the Republican candidate for president of the United States. The first court that heard the case found Johnson guilty, fined him two thousand dollars, and sentenced him to a year in jail. However, the Texas Court of Appeals overturned this ruling, arguing that flag burning is a type of symbolic speech and is thus protected under the First Amendment.

Whether or not a person has the right to burn the American flag was an issue that divided many Americans, and it was widely debated. The case made its way to the U.S. Supreme Court, where a famous ruling was eventually made on June 21, 1989: In Texas v. Johnson, by a margin of 5 to 4, the Supreme Court ruled that the freedom of speech does extend to such symbolic speech as flag burning and thus made all flag desecration laws then extant invalid. The swing vote in the case was Associate Justice Antonin Scalia, who joined the majority opinion.

Congress disagreed and reacted swiftly. In an attempt to craft a constitutionally permissible law,
Congress passed the Flag Protection Act of 1989, which removed the proviso of the Texas law that the act of desecration had to be intended to offend someone. It sought instead to craft a law that was more content neutral, and President George H. W. Bush signed the act into law in late October. The passage of the Flag Protection Act—which made it a crime not only to burn a flag but also to “maintain [it] on the floor or ground”—set off a spate of U.S. flag-burning incidents. Protesters were arrested, and a new set of court hearings worked their way up through the legal system. On June 11, 1990, in United States v. Eichman, the Supreme Court upheld its earlier rationale and declared the Flag Protection Act of 1989 invalid, again by a vote of 5 to 4. The majority observed that, while the law attempted some content neutrality, it included an exemption for burning “worn or soiled” flags in a respectful ceremony, thereby confirming that it was designed to encourage patriotism and discourage dissent. Impact
After the Eichman ruling, the brief outbreak of flag-burning incidents abated, and media attention turned elsewhere. However, some Americans and many members of Congress remained upset by the ruling. As a result, a movement began to amend the U.S. Constitution explicitly to prohibit the desecration of the flag. The congressional vote on a measure to pass a constitutional amendment and send it to the states for ratification became an annual event in Washington, and often the vote was extremely close.
Further Reading
Goldstein, Robert Justin. Burning the Flag: The Great 1989-1990 American Flag Desecration Controversy. Kent, Ohio: Kent State University Press, 1996.
Leepson, Marc. Flag: An American Biography. New York: St. Martin’s Press, 2005.
Scot M. Guenter
See also Congress, U.S.; Conservatism in U.S. politics; Elections in the United States, 1984; Liberalism in U.S. politics; Reagan, Ronald; Rehnquist, William H.; Supreme Court decisions.
■ Flashdance
Identification American film
Director Adrian Lyne (1941-    )
Producers Don Simpson (1943-1996) and Jerry Bruckheimer (1945-    )
Date Released April 15, 1983
Flashdance combined elements of the sexually suggestive music video, traditional romance, and 1980’s mainstream feminism to appeal to men and women alike. It was the second-highest-grossing film released in 1983, narrowly edging out Trading Places for that honor.

Flashdance is set in Pittsburgh, Pennsylvania, “a cold world of steel,” where Alex Owens (Jennifer Beals) works as a welder by day and as a nightclub dancer by night, while entertaining the dream of becoming a legitimate ballet dancer. The film depicts her clash with a world of competition, envy, and strenuous work, in which only a privileged few can make it. Visually, the film plays heavily on the juxtaposition of the harsh reality of a steel mill job and the ephemeral and majestic world of classical ballet.

The film was conceived as a mixture of music video and drama. It did not fall within the category of traditional musical, because it had no vocal score. Much of the talent was the work of voice-over and stand-in performers, including French actress Marine Jahan, break-dancer Crazy Legs, and professional gymnast Sharon Shapiro. The music sound track by Phil Ramone and the choreography by Jeffrey Hornaday featured high-tech effects and eroticism to increase audience appeal. The sound track also gained fame for its hit songs, including “Flashdance . . . What a Feeling,” composed by Giorgio Moroder with lyrics by Keith Forsey and sung by Irene Cara, which won an Academy Award for Best Original Song, and “Maniac,” which was also nominated in that category. The film’s lighting and staging effects, including a famous scene in which water splashed down upon Beals at the end of a dance number, won it success among MTV watchers.

Despite—or perhaps because of—all the flashy costumes and the sexual overtones for which it received an “R” rating from the Motion Picture Association of America (MPAA), the movie received bad reviews. Roger Ebert, columnist for the Chicago Sun-Times, pointed out its artificial contrivances, flashy production numbers, undeveloped characters, and improbable plot. He even accused it of “grabbing a
piece of Saturday Night Fever, a slice of Urban Cowboy, a quart of Marty and a 2-pound box of Archie Bunker’s Place.” This negative assessment did not harm the movie at the box office: It took in almost $95 million. Among the films of 1983, only the juggernaut Star Wars sequel The Return of the Jedi surpassed it.
Jennifer Beals in Flashdance. In addition to leg warmers, the film helped popularize the off-the-shoulder sweater fashion modeled here. (dpa/Landov)

Impact For many aspiring dancers in their teens, however, Flashdance was the movie that changed their lives. It also affected fashion, beginning the trend of wearing off-the-shoulder sweatshirts and leg warmers, which soon flooded the market. Most significant, the movie became a stepping-stone for a number of filmmakers in the 1980’s. Director Adrian Lyne would later direct Fatal Attraction (1987) and Indecent Proposal (1993). The producers, Don Simpson and Jerry Bruckheimer, went on to produce Top Gun (1986) and Beverly Hills Cop (1984), and the screenwriter, Joe Eszterhas, would later write one of Hollywood’s most famous spec scripts, the screenplay for Basic Instinct (1992).
Further Reading
Ebert, Roger. “Flashdance.” Chicago Sun-Times, April 19, 1983.
Grugg, Kevin. “Broadway and Beyond.” Dance Magazine 57 (July, 1983): 104.
McRobbie, Angela. “Fame, Flashdance, and Fantasies of Achievement.” In Fabrications: Costume and the Female Body, edited by Jane Gaines and Charlotte Herzog. New York: Routledge, 1990.
Sylvia P. Baeza
See also
Ballet; Dance, popular; Fashions and clothing; Fatal Attraction; Film in the United States; Leg warmers; MTV; Music; Music videos; Pop music; Special effects.
■ Flynt, Larry
Identification American pornographer and First Amendment advocate
Born November 1, 1942; Magoffin County, Kentucky
Flynt outraged religious leaders and critics of pornography because he featured in his magazine, Hustler, more sexually explicit photographs than other mainstream pornographic magazines at that time. Feminists criticized Flynt for including in his magazine graphic acts of violence and demeaning depictions of women.
By the 1980’s, Larry Flynt had transformed the monthly newsletter he used to promote his chain of go-go clubs in Ohio into the nationally known pornographic magazine Hustler. During the decade, however, Flynt suffered from a variety of personal setbacks. A 1976 assassination attempt had left him in a wheelchair, paralyzed from the waist down. He had become addicted to the prescription pain medication he took to control the chronic pain he suffered from his bullet wounds. In 1987, his wife Althea died. As an unwavering—but outrageous—defender of the First Amendment, Flynt went to extreme lengths to show his contempt for any effort to infringe on his
free speech rights. He served more than five months in a federal prison for desecrating the U.S. flag after he appeared at a court hearing wearing the flag as a diaper. As a publicity stunt in 1984, Flynt briefly ran for U.S. president. Flynt also sent a free Hustler subscription to every member of Congress and the U.S. Supreme Court.

Flynt’s crusade for the First Amendment also resulted in two appearances before the U.S. Supreme Court. In 1983, Flynt hurled obscenities at the U.S. Supreme Court justices after they ruled against him in a libel suit filed by the girlfriend of the publisher of Penthouse magazine, one of Flynt’s competitors. He was arrested for contempt of court, but the charges were later dismissed. In 1988, however, Flynt won an important victory when the Court handed down its landmark Hustler Magazine v. Falwell decision. Jerry Falwell, a nationally known televangelist from Lynchburg, Virginia, had sued Flynt for libel over a satirical advertisement published in Hustler that portrayed the minister as an incestuous drunk. The case was appealed all the way to the Supreme Court, which decided in Flynt’s favor, ruling that, regardless of how tasteless the speech may be, humor and satire are protected by the First Amendment.

Larry Flynt speaks with reporters outside the U.S. Supreme Court building in December, 1987. (AP/Wide World Photos)

Impact Larry Flynt has been described as a vanguard defender of the First Amendment by his supporters and as an attention-grabbing smut peddler by his critics. His tenacity in fighting for his free speech rights, however, resulted in Hustler Magazine v. Falwell, a decision that has been hailed as an important First Amendment victory not only for Larry Flynt but for the mainstream media as well.

Further Reading
Flynt, Larry. Sex, Lies, and Politics: The Naked Truth. New York: Kensington, 2004.
Flynt, Larry, and Kenneth Ross. An Unseemly Man: My Life as a Pornographer, Pundit, and Social Outcast. Los Angeles: Dove Books, 1996.
Smolla, Rodney A. Jerry Falwell v. Larry Flynt: The First Amendment on Trial. New York: St. Martin’s Press, 1988.
Eddith A. Dashiell
Dworkin, Andrea; Falwell, Jerry; Pornog-
raphy.
■ Food Security Act of 1985 Identification Federal agricultural legislation Date Signed on December 23, 1985
By uniting the goals of the environmental movement and farm support groups, the Food Security Act became the first legislation to enact both conservation measures for farmlands and economic incentives for farm production. In the decades prior to the 1980’s, important legislation existed to regulate the prices of agricultural products, which could fluctuate widely, to insure farm income, to provide loans to farmers, to set guidelines for the conservation of farmland by preventing erosion, and to adjust the total productive acreage of the nation. After World War II, attention focused also on the distribution of surplus products, as well as restricting cultivated land through the cre-
378
■
Food Security Act of 1985
ation of the soil bank in 1956. Before 1985, however, government programs designed to protect production and prices and those directed toward conservation were independent of each other. By the beginning of the 1980’s, lessening foreign demand for U.S. agricultural products and a corresponding decrease in market prices resulted in significant surpluses and a costly increase in governmental price supports for agricultural products. Moreover, conservation programs had not kept abreast of agricultural developments in terms of cost-benefit analysis, nor had the environmental impact of conservation programs received adequate attention. In fact, programs encouraging production could actually conflict with those that promoted conservation. The Food Security Act of 1985 attempted to address these developments. The act comprised seventeen titles, the first ten of which regulated prices and established quotas for dairy, wool and mohair, wheat, feed grains, cotton, rice, soybeans, and sugar. Title 11 focused on trade and Title 12 on conservation. The act created the Conservation Reserve Program, with sodbusting, swampbusting, and compliance programs. Thus, the Food Security Act was the first law to regulate both economic and environmental aspects of American agriculture. Conservation Measures
Government soil conservation programs had a history dating back to 1935, when the Soil Conservation Service was established to protect farmland for production purposes. Part of the impetus for including a conservation title in the Food Security Act came from environmentalists, who recognized that it was easier to achieve environmental legislation within a farm bill than separately. In order to reduce the surplus of commodities, as well as to protect water and soil resources on U.S. farms, the Conservation Reserve Program, once created by the 1985 law, revived the soil bank. By 1990, it aimed to remove from production 40 to 45 million acres of highly erodable land for periods of ten to fifteen years per parcel. Participants in the program were to receive payments from government agencies in the form of rents and cost-sharing for planting ground cover. The program also sought to conserve habitats for fish and other wildlife and to ensure that American fiber and food production was sustainable. The Food Security Act’s sodbusting provision required farmers wishing to convert erodable land into cropland to devise specific plans and submit
them for approval. Such plans had to be implemented by 1990. This rule denied financial support to farmers who continued to farm erodible or otherwise environmentally fragile land without putting conservation measures in place. Those farmers not complying with the act lost their eligibility for aid programs sponsored by the U.S. Department of Agriculture. Unlike soil conservation initiatives, measures to protect wetlands were relatively recent in 1985. Indeed, in the nineteenth century, the federal government had given the states incentives to drain wetlands rather than protect them. As such lands’ ecological value became understood, however, legislators became receptive to passing protective laws, starting with the Water Pollution Control Act of 1972. The Food Security Act increased such protections. Its swampbuster provisions, like the sodbuster rules, penalized farmers who converted wetlands by denying them the financial aid available through the various federal farm programs. Farmers could voluntarily drop out of such programs. The wording of these provisions suggested that Congress’s philosophy of conservation had changed over the previous decade. Rather than conserving land simply in order to ensure its future agricultural productivity, the new law was written to protect the environment as a good in itself.

As the first attempt to unite economic and environmental protections, the Food Security Act had some weaknesses. For example, wetlands could not be drained for crops, but a farmer could drain wetlands for other purposes, such as constructing buildings or roads. Furthermore, only one penalty existed; there was no consideration of the extent of the infraction. The sodbuster rule made it difficult for some farmers to increase their acreage, forcing them to withdraw from governmental support to remain competitive. A major farm bill in 1990 addressed the shortcomings of the 1985 act; it also gave farmers more flexibility in planting.

Impact
By the end of 1989, nearly 34 million acres of land had been retired from agricultural production, and surplus stocks had fallen, aided by a drought in 1988. Soil erosion dropped substantially during the decade following the act’s passage. The rate of wetland conversion for agricultural use had been falling even before the act passed. Between 1982 and 1992, that conversion rate averaged only twenty-five thousand acres per year, a number representing
an 80 percent decrease from the period thirty years prior. The agricultural conversion of wetlands during the decade accounted for about 20 percent of the nation’s total annual loss of wetlands. The precise role played by the swampbuster provisions in the decrease in wetland conversion is unclear, however, because extraneous economic factors, such as a decrease in the cost-effectiveness of wetlands draining, may also have played a role.

Further Reading
Allen, Kristen, ed. Agricultural Policies in a New Decade. Washington, D.C.: Resources for the Future and National Planning Association, 1990. Collection of articles assessing the national agricultural situation five years after the Food Security Act. Includes economic, policy, and conservation topics.
Miranowski, John A., and Katherine H. Reichelderfer. “Resource Conservation Programs in the Farm Policy Arena.” In Agricultural-Food Policy Review: Commodity Program Perspectives. Agricultural Economic Report 530. Washington, D.C.: U.S. Department of Agriculture, 1985. Details the conflict between economic programs supporting agriculture and those focusing on conservation, especially in response to the heightened awareness of environmental issues.
Thurman, Walter N. Assessing the Environmental Impact of Farm Policies. Washington, D.C.: The AEI Press, 1995. Short, readable book that discusses the problem of externalities, that is, costs not directly involved in production, in this case, environmental costs. While it focuses on the situation in 1994, it provides information about the impact and relevance of the Food Security Act.
Kristen L. Zacharias

See also
Agriculture in the United States; Environmental movement; Farm Aid; Farm crisis.
■ Food trends
Definition Patterns and tendencies within American eating habits, restaurants, and domestic food preparation
During the 1980’s, food’s value as a status symbol increased dramatically, even as Americans’ consumption of prepared foods increased and home cooking decreased. The combination of nouvelle cuisine and a proliferation of gourmet regional restaurants changed the nature of fine dining, which came to emphasize local ingredients and ingredients distinctive to authentic regional and international cuisines, instead of universal luxury foods such as foie gras.

During the 1980’s, food became a form of entertainment, as Americans became infatuated with gourmet cooking, chic restaurants, and novelty ingredients. In 1983, for example, Americans spent almost 40 percent of their food budgets in restaurants. The decade also brought a recognition that American regional cooking traditions were worthy of the gourmet label. The signature food phrase of the decade, “Real men don’t eat quiche,” the title of a 1982 humor book, indicates the degree to which the gourmet trend reached mainstream America.

Dining Out in the 1980’s
American regional cooking styles exploded in popularity in the 1980’s, as trendsetters sought out novelty. The cooking of the West Coast had a profound influence on the nation’s fine restaurants. California cuisine—which incorporated elements of French nouvelle cuisine and Japanese sensibility alongside an emphasis on locally grown, fresh produce and light sauces—established a pan-American gourmet cooking style that spread across the country. Portions became smaller, and chefs paid as much attention to a dish’s presentation as to its taste. California cuisine was epitomized by chef Wolfgang Puck, whose restaurants Spago and Chinois on Main were as celebrated for their innovative cuisine as for their celebrity clientele. Thanks in part to chef Paul Prudhomme, Cajun food also became an important trend of the decade. His signature blackened redfish, a spice-rubbed fish filet seared in a red-hot cast-iron skillet, led to the popularization of the technique of blackening, which swept American restaurants. Blackened fish, shrimp, chicken, and even steak became staples of restaurant menus. During the 1980’s, Americans also turned away from the red-sauce spaghetti and meatballs popular in Americanized Italian restaurants to embrace Italian regional cooking. Northern Italian food became very chic, with polenta and risotto becoming foodie favorites. The cuisines of the southwestern United States, Mexico, and Central America expanded beyond their borders to satisfy hungry Americans across the country. Mid-level and fast-food Mexican-style chains such as Taco Bell, Chi-Chi’s, and El Torito expanded rapidly. Salsa and tortilla chips rivaled potato chips and onion dip in snack popularity, and previously unfamiliar ingredients such as blue cornmeal and cilantro became restaurant favorites. Mid-range establishments that served a time-conscious clientele also increased in number. Casual dining restaurants such as Bennigan’s and T.G.I. Friday’s expanded rapidly into successful chains, while others, such as Olive Garden, were developed by corporations to appeal to changing consumer needs. Décor-laden theme restaurants such as the Hard Rock Café distinguished themselves from competitors not by their food but by the celebrity-autographed items, faux antiques, and tchotchkes that covered their walls and even ceilings.

Trends in Home Cooking
American palates became more sophisticated during the 1980’s, causing supermarkets to expand their range of international and gourmet products. European cheeses, exotic mushrooms, and fancy greens such as arugula and radicchio became staples in yuppie kitchens. Beverages also became upscale. Tap water no longer sufficed in the status-obsessed 1980’s, nor did a simple cup of coffee. Bottled waters such as Perrier and Pellegrino appeared on store shelves, and Americans became enamored of specialty coffees, paying high prices for whole beans from Hawaii, Jamaica, Colombia, and Kenya. It was not only ingredients that became upscale during the decade but appliances and kitchen gadgets as well. Top-of-the-line Cuisinart food processors, KitchenAid mixers, and professional-quality knives became de rigueur in middle-class households. The microwave oven went from a novelty to a household staple. By the end of the decade, microwaves could be found in 75 percent of American homes. As larger numbers of women entered or returned to the workplace, preparing dinner became a more difficult task for American families. Many Americans were not willing to settle for tasteless frozen meals and sought more upscale convenience-food options. The range of prepared-meal choices expanded, as supermarkets, gourmet food stores, and restaurants began to cater to the take-out market. By 1988, over 80 percent of Americans were regularly buying takeout meals for home consumption.

Impact
During the 1980’s, American food traditions finally achieved the same level of culinary respect that continental cuisines had long enjoyed. The American gourmet revolution celebrated regional cooking and made use of previously ignored indigenous ingredients. At the same time, increasingly frenetic lifestyles left little time to cook the elaborate dishes that sophisticated palates of the 1980’s demanded. As Americans became more harried, the food-service industry responded by presenting more choices for consumers: Mid-range chain restaurants, take-out options, and prepared gourmet groceries were all markers of the decade’s eating habits.

Further Reading
Belasco, Warren. Appetite for Change: How the Counterculture Took on the Food Industry. Ithaca, N.Y.: Cornell University Press, 1989. Includes a description of the role corporations played in 1980’s food culture.
Brenner, Leslie. American Appetite: The Coming of Age of a Cuisine. New York: Avon, 1999. Personal overview of trends and influences in the American dining scene, including many developments in the 1980’s.
Levenstein, Harvey. Paradox of Plenty: A Social History of Eating in Modern America. New York: Oxford University Press, 1993. Contains an overview of trends in 1980’s food styles.
Lovegren, Sylvia. Fashionable Food: Seven Decades of Food Fads. New York: Macmillan, 1995. Includes a chapter on food trends of the 1980’s.
Piesman, Marissa, and Marilee Hartley. The Yuppie Handbook. New York: Pocket Books, 1984. Contains several discussions of 1980’s food preferences, including typical restaurant menus and reviews, kitchen appliances, and popular ingredients.
Shelly McKenzie

See also
Agriculture in Canada; Agriculture in the United States; Diets; New Coke; Starbucks; Women in the workforce; Yuppies.
■ Football
Definition Team sport
Professional and college football gained increasing popularity during the 1980’s, replacing baseball as America’s most significant sports obsession.

In 1981, a CBS-New York Times poll reported that 48 percent of U.S. sports fans named football as their favorite spectator sport, while 31 percent chose baseball. By 1984, football’s popularity had skyrocketed, and a similar survey found that American sports fans enjoyed watching football more than baseball by a margin of almost three to one.

Professional Football
During the 1980’s, the National Football League (NFL) solidified its dominance of both professional American football and American sports in general, as well as establishing itself as a corporate entertainment giant. Monday Night Football, an American Broadcasting Company (ABC) television program, was typically one of the highest-rated shows among male viewers. Most of the league’s games were played during the day on Sundays, and the two teams chosen to compete on a given Monday night came to relish their moment in the prime-time spotlight. The 1980’s also saw an increase in female viewership, and the NFL’s annual championship, the Super Bowl, became a family event, attracting huge audiences of both sexes and all ages. Three Super Bowl broadcasts during the 1980’s are among the most-watched television programs of all time. In 1982, for example, nearly half of all U.S. households tuned in to Super Bowl XVI, and the game was viewed by more than 110 million fans. As a result, the Super Bowl became the premier showcase for new and innovative advertisements, such as the famous 1984 commercial introducing the Apple Macintosh computer. During much of the first half of the decade, the sound of Monday Night Football was easily identified by its charismatic and often controversial announcer, Howard Cosell. Cosell teamed with a number of memorable cohosts, including Frank Gifford, Don Meredith, Alex Karras, and O. J. Simpson. In 1984, another announcer joined the show, former New York Jets quarterback Joe Namath, who eventually went on to launch a significant career in entertainment. During the 1980 season, the NFL’s regular-season
attendance of nearly 13.4 million set a record for the third year in a row. The fans’ enthusiasm for the game was well rewarded, when the Pittsburgh Steelers’ “Steel Curtain” defeated the Los Angeles Rams 31 to 19 in Super Bowl XIV, becoming the first team to win four Super Bowls. Super Bowl XV in 1981 featured two “wild-card” teams—teams that were not divisional champions but were chosen to fill extra playoff berths based on their records. In this championship game between the Oakland Raiders and the Philadelphia Eagles, Oakland bested Philadelphia 27 to 10 to become the first wild-card team to win a Super Bowl. That same year, the NFL began a campaign to recruit more black athletes by hosting representatives from predominantly black colleges. The 1982 season saw a fundamental change in the pay structure of the NFL as a result of a fifty-seven-day strike that cut the regular sixteen-game schedule down to just nine games. Minimum player salaries were established, and pay and benefits were increased. That year heralded the decade’s biggest challenge to the NFL’s supremacy, when the well-financed United States Football League (USFL) was created. Despite a lucrative television contract and several big-name players, however, the league lasted just three years. The 1983 college draft became known as the “Year of the Quarterback” because an unusually large number of quarterbacks was selected. Several proceeded to become household names, including John Elway, Jim Kelly, Tony Eason, and Dan Marino. Elway and Kelly eventually made a combined nine Super Bowl starts, and Marino started a Super Bowl in only his second season, following the 1984 campaign. Furthermore, of the twenty-eight players selected in the first round of the 1983 draft, fifteen went on to play in at least one Pro Bowl, the league’s showcase game for its most skilled and popular players. The 1984 season became a year for breaking records. The Miami Dolphins’ Dan Marino passed for 5,084 yards and 48 touchdowns; Eric Dickerson of the Los Angeles Rams broke O. J. Simpson’s single-season rushing record by gaining 2,105 yards; Washington Redskins wide receiver Art Monk caught 106 passes; and Walter Payton of the Chicago Bears broke Jim Brown’s career rushing record, ending the season with a career total of 13,309 yards gained. In Super Bowl XVIII, the Los Angeles Raiders defeated Washington by an impressive score of 38 to 9. The following year, the San Francisco Forty-Niners,
led by legendary quarterback Joe Montana, defeated Miami 38 to 16 in Super Bowl XIX. The game became the most-watched live television event up to that time, attracting 115,936,000 U.S. viewers, including President Ronald Reagan, who took his second oath of office before tossing the coin for the game. In the United Kingdom and Italy, almost 12 million additional viewers tuned in to the event. One of the more gruesome moments in NFL history occurred on November 18, 1985, when New York Giants linebacker Lawrence Taylor fell heavily on the leg of Washington Redskin Joe Theismann, causing multiple fractures that ended the prized quarterback’s career. Super Bowl XX, in January, 1986, was equally painful for many viewers, as New England Patriots quarterback Tony Eason gave a dismal performance, failing to complete a pass against the famed Chicago Bears defense. Eason was 0 for 6 in pass attempts, fumbled the ball once, and was sacked three times. He was pulled from the game
during the second quarter, but it was too late, and the Bears went on to defeat New England 46 to 10. The National Broadcasting Company (NBC) telecast of Super Bowl XX replaced the final episode of M*A*S*H as the most-viewed television program in history, with an audience of 127 million U.S. viewers. The game was also televised in fifty-nine other countries, and an estimated 300 million Chinese viewers tuned in to a delayed broadcast of the game in March, 1986. Also in 1986, the NFL began to make a concerted effort to expand American football’s popularity by holding a series of preseason exhibition games, called American Bowls, at sites outside the United States. That same year, Monday Night Football celebrated its seventeenth season, becoming the longest-running prime-time series in ABC’s history. The 1987 season was reduced from sixteen games to fifteen as a result of a twenty-four-day players’ strike. While only one game was canceled outright because of the strike, three games were played with
replacement teams. In addition, the NFL instituted a controversial anti-steroid policy in 1987 that would become the longest-running such rule in professional sports. That year also saw the premiere of the Arena Football League, which played a scaled-down indoor version of American football. Besides Washington’s defeat of the Denver Broncos 42 to 10 in Super Bowl XXII, the big football news of 1988 was of a more societal nature. On September 4, 1988, Johnny Grier became the first African American referee in NFL history, opening the door for other minorities to participate more fully in the sport. The final year of the decade saw the NFL both figuratively and literally rocked by changes. The San Francisco Forty-Niners temporarily moved their play to Stanford Stadium in Palo Alto, California, after the Bay Area earthquake of October 17, 1989, damaged their home stadium, Candlestick Park. Little more than a week later, Paul Tagliabue was chosen to succeed NFL Commissioner Pete Rozelle, who had held the position since 1960 and was known as the force behind the creation of both the Super Bowl and Monday Night Football. In addition, Art Shell became the first African American head coach of the NFL’s modern era, leading the Los Angeles Raiders after serving as a player and as an assistant coach for twenty years. That year also saw the beginnings of free agency, giving players more freedom to sign contracts with other teams following the expiration of their initial contracts. Finally, in one of the most famous draft-day trades ever, the Dallas Cowboys traded running back Herschel Walker to the Minnesota Vikings for five veteran players and six draft picks over three years. Dallas later used these picks to leverage trades for additional draft picks and veteran players. As a result, the team drafted many of the stars that would help it win three Super Bowls in the 1990’s.

San Francisco 49er Roger Craig evades the final Miami Dolphins defender and leaps into the end zone to score a touchdown during the 1985 Super Bowl. (AP/Wide World Photos)

College Football
Colleges and universities fall into three divisions under National Collegiate Athletic Association (NCAA) guidelines, and each division has several conferences. Seasonal and conference play culminates in postseason bowl games such as the Rose Bowl, the Orange Bowl, and the Sugar Bowl. Despite the growing popularity of professional football during the 1980’s, college competitions remained a staple for American sports fans—particularly in rural areas, where in-person access to NFL teams was sometimes difficult. Televised college football
matches also frequently attracted large audiences, and postseason college bowl games, which typically showcased top teams, garnered significant audience interest as post-holiday fare. Unlike NFL players, college players were not allowed to receive salaries, yet college football has played an important role in the sport, not only as a source of talent for professional teams, but also as a way of generating revenue for participating universities. By the 1980’s, televised college football was a significant source of income for the NCAA, which oversaw the sport at participating universities. A number of future big-name professional players developed in the college football conferences during the 1980’s. These players included the Georgia Bulldogs’ Herschel Walker, who earned a 1980 freshman rushing record of 1,616 yards and went on to clinch the team’s first national title in the Sugar Bowl with a 17-10 victory over Notre Dame. In 1981, Southern California’s Marcus Allen became the first running back to gain more than 2,000 yards in one season and outpolled Georgia’s Walker for the Heisman Trophy, awarded annually by the Downtown Athletic Club of New York City to the best college football player of the year. Walker, however, rebounded in 1982 to earn the coveted award. That same year saw Alabama win the Liberty Bowl, pushing coach Bear Bryant’s thirty-eight-year record to 323 wins, 85 losses, and 17 ties. He died less than a month later. In 1984, Boston College quarterback Doug Flutie, the first player to pass for 10,000 yards, won the Heisman Trophy. The Miami Hurricanes’ Vinny Testaverde won the award in 1986, and in 1988, the star of the year was Oklahoma State running back Barry Sanders, a virtual unknown who captured the limelight by shattering the NCAA single-season records for rushing yards (2,628) and touchdowns (39). By 1989, the Miami Hurricanes had cemented their claim as team of the decade by winning their third national title of the 1980’s. That same year, Houston quarterback Andre Ware became the first African American quarterback to win the Heisman Trophy.

Impact
The 1980’s became known as the decade in which football claimed the title of America’s most popular spectator sport. The game evolved to become a key part of American culture and also began to bridge the gender gap among sports fans. The
Nielsen ratings for Monday Night Football increased throughout the 1980’s, as viewers continued to respond to the show’s mixture of sports and entertainment. The Super Bowl, meanwhile, became an unofficial national holiday and the top-rated television program during most of the decade.

Further Reading
Boyles, Bob, and Paul Guido. Fifty Years of College Football. Fishers, Ind.: Sideline Communications, 2005. A comprehensive history of college football.
Dunnavant, Keith. The Fifty-Year Seduction: How Television Manipulated College Football, from the Birth of the Modern NCAA to the Creation of the BCS. New York: St. Martin’s Press, 2004. Analysis of how television coverage has affected college football.
Kanner, Bernice. The Super Bowl of Advertising: How the Commercials Won the Game. New York: Bloomberg Press, 2003. Examination of how Super Bowl advertising has changed since 1967.
Leiker, Ken, and Craig Ellenport, eds. The Super Bowl: An Official Retrospective. New York: Rare Air, 2005. Explores the Super Bowl as a cultural phenomenon and as a global event.
MacCambridge, Michael, ed. ESPN College Football Encyclopedia: The Complete History of the Game. New York: Hyperion, 2005. Comprehensive reference guide to college football history, tradition, and lore.
Smith, Michael, et al., eds. The ESPN Pro Football Encyclopedia. New York: Hyperion, 2006. Provides statistics on all aspects of professional football from 1920 through the 2005 season.
Watterson, John Sayle. College Football: History, Spectacle, Controversy. Baltimore: Johns Hopkins University Press, 2002. Historical examination of college football, including scandals, controversies, and attempts at reform.
Yost, Mark. Tailgating, Sacks, and Salary Caps: How the NFL Became the Most Successful Sports League in History. Chicago: Kaplan, 2006. Analyzes the financial impact of professional football.
Cheryl Pawlowski

See also
Advertising; African Americans; Apple Computer; Arena Football League; Elway, John; Hobbies and recreation; Montana, Joe; Sports; Television.
■ Ford, Harrison
Identification American actor
Born July 13, 1942; Chicago, Illinois
After playing small parts in movies such as American Graffiti, Harrison Ford won the role of Han Solo in George Lucas’s Star Wars trilogy and the title role in Steven Spielberg’s Indiana Jones movies. As a result of playing these roles, Ford became one of the top box-office stars of the 1980’s.

Harrison Ford appeared in four of the top ten grossing films of the 1980’s: The Empire Strikes Back (1980), Raiders of the Lost Ark (1981), Return of the Jedi (1983), and Indiana Jones and the Last Crusade (1989). His work in Witness (1985) and the Star Wars and Indiana Jones trilogies in particular led to his status as a major star who could guarantee huge profits at the box office. Ford’s persona as a strong, heroic Everyman won him wide appeal with audiences and critics alike. While Ford is best known for his portrayals of Han Solo and Indiana Jones, his versatility as an actor led to his playing a wide range of characters in all genres. He starred in science-fiction movies, such as the cult hit Blade Runner (1982); detective movies, such as Witness; and romantic comedies, such as Working Girl (1988). In The Mosquito Coast (1986), the actor took on the challenge of playing the role of a less likable character to mixed reviews. Even though this movie was not as successful as Ford’s other films, his personal popularity continued to gain momentum during the rest of the decade. Other films that Ford starred in during the 1980’s include Indiana Jones and the Temple of Doom (1984) and Frantic (1988). He worked with several major directors during the decade, including Steven Spielberg, Peter Weir, Roman Polanski, Ridley Scott, and Mike Nichols. In the mid-1980’s, Ford was recognized by his peers in the film industry when he was nominated for Golden Globe awards for his roles in Witness (1985) and The Mosquito Coast (1986). As a result of his personal success and the recognition of the movie industry, Ford was able to pick the roles that he wanted to play, and he went on to take chances in some of the characters that he chose to portray. While the resulting films were not always popular, he continued to enjoy the artistic freedom to choose the kinds of characters he played and the way that he practiced his craft.
Harrison Ford as Rick Deckard in Blade Runner. Ford’s performance against type disappointed fans at the time but would later be seen as one of his most important roles. (AP/Wide World Photos)

Impact
Few actors have achieved the popularity of Harrison Ford. His portrayals of characters who meet extraordinary challenges with courage and determination have won him a large number of fans and the respect of his peers. Whether he plays swashbuckling action heroes or ordinary men who must become heroes, Ford remains one of Hollywood’s most resilient and successful actors.

Further Reading
Duke, Brad. Harrison Ford: The Films. Jefferson, N.C.: McFarland, 2005.
Jenkins, Garry. Harrison Ford: Imperfect Hero. Secaucus, N.J.: Carol, 1998.
Pfeiffer, Lee, and Michael Lewis. The Films of Harrison Ford. 3d ed. Reno, Nev.: Citadel, 2002.
Kimberley M. Holloway

See also
Action films; Blade Runner; Empire Strikes Back, The; Film in the United States; Raiders of the Lost Ark; Science-fiction films; Spielberg, Steven.

■ Foreign policy of Canada
Definition The interactions of the Canadian government and its representatives with other countries of the world

After a decade of strained relations with the United States, Canada developed closer ties to its southern neighbor following the election of Conservative prime minister Brian Mulroney in 1984. Related to this development, Canada pursued more foreign investment and fewer regulations on trade. The nation also acquired an increased role in global affairs.
Canadian foreign policy in the 1980’s changed as the decade progressed. Under Liberal prime ministers Pierre Trudeau and John Turner, the country experienced differences with the United States during the decade’s first few years. In 1984, however, a Conservative landslide victory produced a change in emphasis in Canadian foreign policy. Brian Mulroney, the new prime minister, adopted a more pro-United States stance and sought to place fewer restrictions on international trade in order to address the country’s economic problems.

Divisions in the Early 1980’s
Divisions between Washington and Ottawa began before the 1980’s. The strained relations between the United States and Canada in the first few years of the decade were due mostly to two factors: ideological differences between Prime Minister Trudeau and U.S. president Ronald Reagan and an emerging uncertainty within Canada about the nation’s proper role in the world. The economic ties that had developed in North America since World War II made some Canadians fearful of becoming too dependent on the United States. They supported governmental assertions of independence from the Americans. However, as was true in much of the Western world, significant economic problems occurred in Canada at the dawn of the 1980’s. People became concerned about high taxes, inflation, and unemployment. Similar to reactions in the United Kingdom in 1979 and in the United States in 1980, Canada experienced a shift toward conservatism in
response to global economic problems. In 1984, the Conservatives won 211 seats in the House of Commons, and the Liberals dropped to a record-low representation of just 40 seats. One major aspect of Mulroney’s platform in securing this victory for his party was the pursuit of a more capitalist economy. This domestic agenda would be linked to Mulroney’s foreign policy.

The “Special Relationship”
In describing the type of relationship he wanted Canada to have with the United States, Mulroney used the phrase “special relationship.” To address Canada’s economic problems, the Conservative prime minister pursued closer links with the United States. He wanted more trade with his country’s southern neighbor, hoping that an influx of American dollars could improve the Canadian economy. Mulroney also sought greater direct foreign investment from American businesses. Indeed, he even referred to Canada as being “open for business.” In order for Canada to implement these policies, it was necessary for the government to reduce its existing regulations on foreign trade and investment. Thus, soon after becoming prime minister, Mulroney took multiple actions in the area of foreign economic policy to increase trade and foreign investment. In 1985, his government created Investment Canada, a new bureaucracy whose goal was to attract investment from abroad. Simultaneously, Canada and the United States started talks on a free trade pact. This policy was extremely controversial in Canada. Indeed, it was the most important issue in the 1988 elections. Liberals argued that free trade with the United States would lead to a loss of Canadian jobs, as American businesses would likely close down plants in Canada because they would no longer have to worry about tariffs on goods sold there. The Conservatives countered by saying that free trade with the United States would increase the likelihood of lower prices and provide Canadian businesses with greater access to the large American market. Though they lost some seats in Parliament, the Conservatives retained their majority, and the Canada-United States Free Trade Agreement went into effect on January 1, 1989. As a result, tariffs would gradually be eliminated, and each country gained preferential access to the other’s market.

Closer Ties but Differences Remain
Under Mulroney, Canada developed the strongest relationship
that it had ever had with the United States. Despite this situation, divisions remained between the two neighbors. One point of contention was the status of the Northwest Passage—a region of the Arctic Ocean. In 1985, the United States sailed the Polar Sea, an icebreaker, through this ocean region without seeking Canada’s permission. Mulroney angrily responded to the U.S. action by stating that the Northwest Passage belonged to Canada “lock, stock, and barrel.” He then announced that Canada would construct its own modern icebreaker and acquire a fleet of nuclear-powered submarines to patrol the passage. The environment constituted another point of contention between the United States and Canada. A debate developed over territorial waters, and much of the debate had to do with the supply of fish. In particular, the stocks of cod were declining. Canada believed that some Americans had engaged in overfishing. Another environmental issue that arose was acid rain. Even though he had a close relationship with President Reagan, Mulroney demanded that the United States act to reduce the pollution emitted by its factories in its midwestern and northeastern regions. Much of this pollution from U.S. factories resulted in acid rain falling in Canada. A third issue over which Canada and the United States differed was apartheid in South Africa. The primary foreign-policy concern of the United States was the Cold War with the Soviet Union and curtailing the potential expansion of Soviet influence. Reagan therefore supported any nation that would ally with the United States against Moscow, whatever that nation’s human rights record might be. South Africa was such an ally, and the Reagan administration opposed the adoption of economic sanctions against the country’s racist regime. Mulroney, however, supported such sanctions, deeming that human rights should sometimes take priority over Cold War alliances.

Impact
Canadian foreign policy experienced a significant shift in the 1980’s. Canada forged its closest relationship ever with the United States. It also followed the trend set by its southern neighbor and the United Kingdom, pursuing deregulation in its foreign economic policy. Such policies demonstrated a major change in the Canadian political climate, as the Conservatives gained ascendancy by advocating free trade, deregulation, and increased foreign investment.
Further Reading
Doern, G. Bruce, et al. Faith and Fear: The Free Trade Story. Toronto: Stoddart, 1991. Insightful analysis of the negotiations over the Canada-United States Free Trade Agreement. Emphasizes the motivations of the Canadians in pursuing such a pact.
Gough, Barry M. Historical Dictionary of Canada. Lanham, Md.: Scarecrow Press, 1999. Includes a thorough list of political terminology relevant to Canada, with brief descriptions.
Granatstein, J. L., ed. Towards a New World. Toronto: Copp Clark, 1992. This set of essays concentrates on Canada’s involvement in major international organizations and peacekeeping missions.
Hampson, Fen Osler, et al., eds. Canada Among Nations, 1999: A Big League Player? Oxford, England: Oxford University Press, 1999. This collection of essays examines Canada’s policies in the areas of economics, cultural affairs, and international security.
Riendeau, Roger. A Brief History of Canada. New York: Facts on File, 2000. Lengthy and detailed coverage (despite its title) of major issues in Canadian history. Particularly strong discussion of the federal government’s relationship with Quebec.
Kevin L. Brennan

See also
Business and the economy in Canada; Canada Act of 1982; Canada and the British Commonwealth; Canada and the United States; Canada-United States Free Trade Agreement; Canadian Caper; Elections in Canada; Income and wages in Canada; Mulroney, Brian; Reagan, Ronald; Trudeau, Pierre; Unemployment in Canada; Vancouver Expo ’86.
■ Foreign policy of the United States
Definition The interactions of the United States government and its representatives with other countries of the world
Culminating a decades-long effort, American leaders in the 1980’s mobilized national power to confront, and roll back advances made by, the Soviet Union and its allies. By the end of the 1980’s, the United States began to emerge as the sole global superpower, with its economic system and democratic values substantially ascendant.
The darkening horizon of a more menacing world stunned Americans in the last year of the 1970’s. In July, 1979, anti-U.S. revolutionaries drove brutal Nicaraguan dictator and longtime U.S. ally Anastasio Somoza Debayle from power. In Iran in November, acting with the encouragement of a new, anti-U.S. government, radical student extremists seized the U.S. embassy in Tehran and took its staff hostage. A month later, Soviet Red Army troops occupied Afghanistan in the Soviet Union’s first direct military invasion of a country outside the Eastern bloc during the Cold War. The central feature in this pattern was widely perceived to be a decline in the influence of the United States. Ineffective responses to each crisis by U.S. president Jimmy Carter compounded a national sense of malaise.

The Reagan Revolution
During the 1980 presidential election, Ronald Reagan effectively mobilized voters unwilling to accept these reverses abroad and economic troubles at home; he defeated Carter in a landslide. Having promised a general military buildup as part of his campaign, Reagan gained an early success based solely on his reputation for toughness with U.S. adversaries. On his inauguration day in January, 1981, the Iranian revolutionaries released all American hostages after over 440 days in captivity. This gesture alone could do little to reduce the growing anger evident in many Muslims’ relations with the Western world. Early in the Reagan administration, firm steps were taken to channel this Muslim anger to U.S. advantage by giving enhanced support to the mujahideen, or Muslim guerrilla fighters, resisting the Soviet occupation of Afghanistan. Working in coordination with the intelligence services of Pakistan, a South Asian and Muslim ally of the United States, Reagan assigned Director of Central Intelligence William J. Casey to expand greatly the limited assistance efforts that had begun under President Carter. Financing, military training, and limited arms shipments began to flow to the Afghan mujahideen. Confronting Soviet imperialism in Afghanistan engendered considerable new support for a policy of aiding anticommunist guerrillas throughout the world. For a host of separate reasons, China, Saudi Arabia, Pakistan, and other countries assisted the project. Within a short time, young men from throughout the Islamic world began to arrive in Peshawar, Pakistan, to receive training, arms, and assignment to one of several armed groups of Afghan
guerrillas. In Reagan’s second term in office, sophisticated but relatively inexpensive U.S. surface-to-air missiles known as Stingers allowed the Afghan resistance to neutralize Soviet control of the air. Over the years until the final Soviet withdrawal from Afghanistan in February, 1989, these policies grew increasingly popular in the U.S. Congress, until support for secret appropriations to fund the project became nearly unanimous. Especially instrumental in building bipartisan support for the mujahideen was Democratic congressman Charles Wilson of Texas.

The Reagan Doctrine
President Reagan’s policy of confronting aggression by the Soviet Union and Soviet allies came to be known as the Reagan Doctrine. With the exception of Afghanistan, however, nearly all its other applications were controversial in Congress. Inspired by the 1979 Commentary magazine essay “Dictatorships and Double Standards,” written by his future U.N. ambassador, Jeane Kirkpatrick, Reagan embraced a vision of the United States once again leading a free world. However, Reagan firmly believed that it was specifically communist tyranny that posed a threat to the nation. Therefore, while he opposed left-wing dictatorships, he consistently supported right-wing dictatorships, especially in Central America. Because the Reagan Doctrine advocated military and economic aid being given to any allies who were under attack, it advocated preserving tyrannies when those tyrannies were anticommunist. The administration also sought to aid anticommunist guerrilla armies that sought to dislodge new revolutionary anti-American governments. Fearing repetition of the fiasco of the Vietnam War, however, Congress was reluctant to embrace this policy formula.

In 1981, after Cuban and Soviet assistance to revolutionary Nicaragua became clear, communist insurgencies against neighboring governments also increased. Some of these U.S. allies who came under communist attack, such as Guatemala and El Salvador, violated the human rights of their own citizens. The Guatemalan and the Salvadoran governments tortured and murdered civilians who opposed them. Congress insisted that for aid to these anticommunist states to be given at all, they would have to make progress toward eliminating their death squads and protecting the human rights of their citizens. Moreover, Congress wanted these Central American regimes to be replaced over time by democratic institutions. Reagan reluctantly agreed to these conditions.

Even with this concession, Congress was willing to send direct military aid only to El Salvador. In 1984, that nation held democratic elections, and reformist José Napoleón Duarte became president. As a result, congressional purse strings loosened, and El Salvador received significant U.S. aid. Within five more years, the country’s communist insurgency was defeated both militarily and at the ballot box.

Defense Secretary Caspar Weinberger, left, stands by as President Ronald Reagan addresses a crowd during a White House Rose Garden ceremony in 1983. The two men were the guiding force behind U.S. foreign policy during much of the 1980’s. (AP/Wide World Photos)

Persuasion by the Reagan team was less successful when Congress was asked to fund the anticommunist group known as the Contras, whose guerrilla activities aimed at removal of the recognized government of Nicaragua, led by leftist Daniel Ortega and the Sandinista National Liberation Front. Many members of the Contras had been connected to the
repressive Somoza regime, and many members of Congress were reluctant to support the right-wing army. Their reluctance only increased in 1984, when the nation held democratic elections, in which the Sandinistas were victorious. From 1982 to 1984, Reagan was confronted with three separate defense appropriations bills, each of which contained language barring the use of U.S. funds to finance efforts to overthrow the Ortega government in Nicaragua. These bills also included the funding for Reagan’s highest-priority defense projects, such as the Strategic Defense Initiative (SDI). Because he was barred by the U.S. Constitution from vetoing only a portion of the bills, he had no choice but to sign them into law. Reagan continued to believe, however, that it was the presidency, not the Congress, that ultimately must determine the course of foreign policy for the nation. In a series of controversial and arguably illegal actions, two national security advisers (Robert McFarlane and John Poindexter) and members of the National Security Council staff, including Oliver North, arranged covert funding mechanisms designed to bypass the will of Congress so that the administration could continue to support the Contras. In November, 1986, U.S. attorney general Edwin Meese III revealed that some of the money being funneled to the Contras had come from secret sales of arms to Iran. The revelation sparked outrage in Congress, resulting in years of congressional hearings, legal investigations, and criminal trials that came to be known as the Iran-Contra affair.

The Weinberger Doctrine
A general buildup of U.S. military forces and the modernization of U.S. capabilities defined the Reagan years, as well as the final year of the decade, during which George H. W. Bush assumed the presidency. This buildup was extremely expensive. In constant 2000 dollars, the Reagan administration spent almost $3 trillion to increase U.S. military capabilities, funding and deploying many new weapons systems. Despite the resulting enhanced capabilities, it was rare for either Reagan or Bush to employ U.S. military forces directly in support of their foreign policy. Only twice were U.S. armed forces deployed to remove from power an anti-U.S. government: in 1983, on the tiny Caribbean island of Grenada, and in 1989, when Bush ordered the arrest and overthrow of President Manuel Noriega of Panama. While each of these operations
swiftly accomplished its mission, the relative infrequency of such operations underlined the calibrated choices preferred by both administrations. This larger strategy was guided substantially by the thinking of Reagan’s first secretary of defense, Caspar Weinberger. In November, 1984, Weinberger outlined several strict conditions, each of which had to be met before the United States would deploy its armed forces in combat. This Weinberger Doctrine demanded that U.S. armed forces be used only as a last resort, that they be used only when vital interests were at risk, and that, once committed, they be used wholeheartedly and with the full support of Congress and the American people. World War II veteran Weinberger developed his doctrine through hard experience. As defense secretary, he reluctantly had presided over a deployment to Lebanon of U.S. troops. These Army and Marine soldiers first arrived in Lebanon on August 25, 1982, to secure the evacuation of combatants of the Palestine Liberation Organization, which had been defeated by Israel. In the wake of that evacuation, an ongoing civil conflict inside Lebanon sharply worsened. Within weeks, a new Christian president, Bashir Gemayel, was assassinated. In retaliation, Christian militias allied with the Israelis massacred over nine hundred unarmed Muslim civilians. The mission of the U.S. Marines was forced to evolve in response to these events, and the Marines were instructed to train an indigenous army to support a new government of Lebanon. However, they themselves had not been trained to accomplish such a task. In the late 1950’s, the United States had committed its troops to the area and had succeeded in forging a new, more stable balance of power in Lebanon. By the 1980’s, however, many of the Lebanese factions perceived even this very limited U.S. military presence as partisan, supporting one side or the other in their civil war. Americans therefore began to be targeted by Muslim extremists. In April, 1983, the U.S. embassy in Beirut was bombed. Seventeen Americans were killed, including the regional Central Intelligence Agency (CIA) chief, Robert Ames. Then, in October, 241 soldiers and Marines perished when their barracks at the Beirut airport were struck by a suicide truck bomb. The resulting explosion was described at the time as the largest non-nuclear explosion in history. After such anti-U.S. carnage, the American people
and the U.S. Congress were reluctant to persist in Lebanon. President Reagan withdrew all U.S. troops in February, 1984, and throughout the rest of the decade, the Reagan and Bush administrations were pestered by radical Lebanese Muslims and Palestinians taking Americans as hostages. Indeed, one of the secondary goals in the Iran-Contra affair was to secure Iranian help in winning release of some of the hostages held by terrorists in Lebanon. The “Vietnam syndrome” of reluctance to employ direct military force to achieve U.S. objectives continued to shape popular attitudes and foreign policy throughout the decade. Presidents Reagan and Bush labored under this constraint.
The Eighties in America
Foreign policy of the United States
One resource abundantly available to and frequently used by presidents was the “bully pulpit”: Both Reagan and Bush employed presidential speeches effectively to portray the forces of lawlessness and tyranny abroad as common enemies, not just of the United States, but of all free peoples. Within the Western Hemisphere, this rhetoric was challenged by the United States’ willingness to support repressive dictatorships as long as they were opposed to communism. In Europe, however, the Cold War was defined by the struggle between the democracies composing the North Atlantic Treaty Organization (NATO) and the communist bloc powers of the Warsaw Pact. Especially across East Central Europe, Reagan was a steadfast advocate of the view that the Cold War was not merely a contest between two strong empires but a titanic struggle between the good of the West and, as he memorably put it to the British House of Commons on June 8, 1982, the “totalitarian evil” in control of the East. Known as the Great Communicator, Reagan had an ability to turn a phrase that never was more effective than when he visited the Berlin Wall on June 12, 1987. He implored the Soviet leader, Mikhail Gorbachev, to “tear down this wall.” Such phrases may have sounded quaint to jaded Western publics, but in communities of activists denied all basic and fundamental human rights in East Central Europe, Reagan was heard. In the fall of 1989, all across the region, religious and human rights activists confronted each totalitarian system Stalin had imposed in the late 1940’s. Symbolized by the opening of the Berlin Wall on November 9, during 1989, freedom was won peaceably by the peoples of Hungary, Czechoslovakia, Poland, East Germany, and Bulgaria, and by violent
anticommunist revolution in Romania. The widespread revolution vividly underlined that the purpose of U.S. foreign policy outlined by President Harry S. Truman in 1947—containment of communism until free peoples could overcome it—under Reagan and Bush finally had been achieved. Subsequent Events
The worldwide anticommunist revolution championed by the United States during the Reagan-Bush years grew in the early 1990’s, culminating in the collapse of the communist system in the Soviet Union itself in August, 1991.
Further Reading
Cox, Michael, et al., eds. American Democracy Promotion: Impulses, Strategies, and Impacts. New York: Oxford University Press, 2000. Eight leading analysts assess the importance of policies to advance U.S. values in winning the Cold War. Gaddis, John Lewis, The United States and the End of the Cold War: Implications, Reconsiderations, Provocations. New York: Oxford University Press, 1994. Eminent historian reveals how habits learned over several decades successfully guided the management of the final stages of a dangerous superpower rivalry. Matlock, Jack F., Jr. Reagan and Gorbachev: How the Cold War Ended. New York: Random House, 2004. Veteran diplomat and Reagan-era U.S. ambassador to Moscow reveals the complicated interplay between Soviet efforts to reform, the U.S. arms buildup, and diplomatic efforts to end the Cold War. Woodward, Bob, Veil: The Secret Wars of the CIA, 19811987. New York: Simon and Schuster, 1987. Through focus on efforts by Director of Central Intelligence William J. Casey, Reagan-era policies that confronted the Soviet Union, its allies, and international terrorism are revealed. Gordon L. Bowen See also Anderson, Terry; Beirut bombings; Berlin Wall; Bush, George H. W.; Cold War; Grenada invasion; Iran-Contra affair; Iranian hostage crisis; Israel and the United States; Kirkpatrick, Jeane; Klinghoffer, Leon; Latin America; Middle East and North America; North, Oliver; Poindexter, John; Reagan, Ronald; Reagan Doctrine; Reagan’s “Evil Empire” speech; Strategic Defense Initiative (SDI); Terrorism; Weinberger, Caspar.
The Eighties in America
■ 401(k) plans Definition
Tax-deferred retirement savings programs Date Began November 10, 1981 The 401(k) plan was named for the section of the Internal Revenue Code that governs it. It allows employees to defer paying taxes on a portion of their income that they invest toward retirement. The money in the resulting investment portfolio—including accrued interest—is taxed only when it is withdrawn. The history of the 401(k) plan dates back to the Revenue Act of 1978, which added section 401(k) to the Internal Revenue Code. That law went into effect on January 1, 1980. However, the birthday of the plan is typically celebrated to coincide with the November 10, 1981, issuance by the Internal Revenue Service (IRS) of Regulations on the Plan, which explained the 401(k) system and how to take advantage of it. Under the employer-sponsored 401(k) plans, employees could elect to have a portion of their salaries deducted from their paychecks and deposited directly into retirement investment accounts. Many companies had previously sponsored such savings plans, but those plans typically involved the use of after-tax dollars. The 401(k) law, by contrast, allowed employees to save for retirement using before-tax dollars. As a result, funds would accumulate faster, because the original contribution and subsequent earnings on the plan were all tax-free until they were withdrawn after retirement. Employees were motivated to save more for retirement, and the stock market benefited from the increased funds available for investment. Operational Details of the Plans The new retirement plans were viewed as a good deal for employees, so many large companies, such as Johnson & Johnson, PepsiCo, and JC Penney, quickly implemented 401(k) plans. For many companies, the implementation date was January, 1982. Within two years, about half of all large American companies had 401(k) plans in place. At many companies, the plans replaced traditional pension plans, which provided guaranteed income upon retirement. To sweeten the shift to this new and riskier type of retirement plan, and to make the new plans more enticing to employees, companies often agreed to match employees’ contributions. Initially, the law allowed each employee to defer up to $45,475 of salary each year,
401(k) plans
■
391
but that amount was reduced to $30,000 in 1983 and remained at that level throughout the rest of the decade. By 1984, the Internal Revenue Code had been revised to require nondiscrimination tests to assure that 401(k) plans applied equally to all employees, rather than being provided only to highly paid employees and managers. By 1984, there were over 17,300 companies with 401(k) plans, covering more than 7.5 million employees. Total investments in these plans were valued at $91.75 billion. The Tax Reform Act of 1986 tightened the nondiscrimination rules. By the end of the decade, there were over 97,000 companies with plans, covering 19.5 million employees. Assets totaled over $384 billion. Congress had finally found a motivating factor to encourage employees to save money for retirement, although not all employees participated in the plans, which were voluntary. Although 401(k) plans were available only to employees of for-profit businesses, similar plans called 403(b) plans were made available to employees of nonprofit organizations and educational institutions. Although employers were made responsible for establishing and administering the plans, that task was typically outsourced to financial services companies, such as mutual funds or insurance companies. When employees moved to new employers, they were given the option of transferring, or “rolling over,” their existing retirement plans to similar plans administered by the new employers. The tax code established restriction on preretirement withdrawals from 401(k) plans. Unless an exception applies, an employee’s 401(k) funds must remain in the plan accounts until the employee reaches the age of fifty-nine and one-half years. If money is withdrawn prior to this age, the employee must pay a 10 percent penalty tax, in addition to the normal taxes. Some plans do allow employees to borrow money from their retirement plans, but such loans must be paid back with interest prior to retirement. Impact The explosive growth in the number of 401(k) plans throughout the 1980’s was at least partly attributable to their low cost to employers. Such plans were typically less costly than the older defined-benefit pension plans that many companies had used. Companies are not required to contribute matching funds, and if they elect to do so, the amount of such funds is easily predictable in ad-
Also, if the plans are outsourced to financial intermediaries, there is no administrative cost for the employer. Employees tend to like the plans, because they are able to save pre-tax dollars and often receive matching contributions from employers. In addition, funds held in the plans are protected from creditors. However, unlike pensions, 401(k) plans produce no guaranteed benefit, since they may be invested in stocks and other securities that may lose some or all of their value. Nevertheless, 401(k) plans became a powerful vehicle to provide retirement income for participants. The stock market, meanwhile, was boosted tremendously by the increased availability of investment funds.
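The arithmetic behind the tax-deferral advantage can be sketched in a few lines of code. The following Python fragment is an illustrative model only, not a statement of actual tax law: the flat 25 percent tax rate, 7 percent annual return, thirty-year horizon, and $2,000 annual salary deferral are hypothetical assumptions chosen for the example. It compares a before-tax 401(k) contribution, taxed once at withdrawal, with the same salary saved after taxes in an ordinary account whose earnings are taxed every year.

    # Simplified comparison of tax-deferred vs. after-tax saving.
    # All rates and amounts are hypothetical assumptions, not tax advice.
    TAX_RATE = 0.25      # assumed flat tax rate
    RETURN_RATE = 0.07   # assumed annual investment return
    YEARS = 30           # assumed years until withdrawal
    SALARY_SAVED = 2000  # pre-tax dollars directed to savings each year

    def pretax_plan():
        """401(k)-style: contribute pre-tax; tax the entire balance at withdrawal."""
        balance = 0.0
        for _ in range(YEARS):
            balance = (balance + SALARY_SAVED) * (1 + RETURN_RATE)
        return balance * (1 - TAX_RATE)  # ordinary income tax due on withdrawal

    def aftertax_plan():
        """Taxable account: contribute after-tax dollars; tax earnings every year."""
        balance = 0.0
        for _ in range(YEARS):
            balance += SALARY_SAVED * (1 - TAX_RATE)     # tax paid up front
            balance *= 1 + RETURN_RATE * (1 - TAX_RATE)  # earnings taxed annually
        return balance

    print(f"Pre-tax 401(k) nets    ${pretax_plan():,.0f}")
    print(f"After-tax account nets ${aftertax_plan():,.0f}")

Under these assumptions, the tax-deferred account finishes roughly 40 percent larger, which illustrates why 401(k) plans accumulated funds faster than the after-tax savings plans they replaced.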
Further Reading
Gale, William G., John B. Shoven, and Mark J. Warshawsky, eds. The Evolving Pension System: Trends, Effects, and Proposals for Reform. Washington, D.C.: Brookings Institution Press, 2005. Includes discussion of various effects of pension plans and the impact of 401(k) plans on household savings and wealth.
Munnell, Alicia H., and Annika Sundén. Coming Up Short: The Challenge of 401(k) Plans. Washington, D.C.: Brookings Institution Press, 2004. Discusses the advantages and disadvantages of 401(k) plans. Includes an extensive bibliography.
U.S. Senate. The Role of Employer-Sponsored Retirement Plans in Increasing National Savings: Hearing Before the Special Committee on Aging, United States Senate, One Hundred Ninth Congress, First Session, April 12, 2005. Washington, D.C.: U.S. Government Accountability Office, 2005. Excellent overview of the impact that 401(k) plans have had on the level of national savings.
Dale L. Flesher
See also Business and the economy in the United States; Tax Reform Act of 1986.
■ Fox, Michael J. Identification Canadian actor Born June 9, 1961; Edmonton, Alberta, Canada
Fox was already a television star when his performance as Marty McFly in Back to the Future made him a heartthrob for girls across the United States and secured his film career.
Michael J. Fox arrives at the premiere of Back to the Future Part II with his wife, Tracy Pollan, in November, 1989. (AP/Wide World Photos)
Michael J. Fox spent his childhood and youth moving across Canada, because his father was a career serviceman in Canada’s armed forces. At fifteen, Fox had his first role in a Canadian television series, Leo and Me. The next year, he moved to Los Angeles to try his luck in Hollywood. However, success eluded him, and his luck was so bad that he had to sell pieces of his sectional sofa to buy food. Fox got his first real break when he was cast in the television sitcom Family Ties as Alex P. Keaton, a role he very nearly did not get. Several key figures in the National Broadcasting Company (NBC) studios thought he was not quite right for the role, and they gave it to him only after much convincing. One of these people, NBC executive Brandon Tartikoff, would later receive some ribbing about his hesitance
when Fox became a major star. Alex P. Keaton was an archetypal 1980's character, a high-school-aged Republican who embraced greed, consumerism, and the values of corporate America in opposition to his parents, left-wing 1960's idealists who were horrified by their son's values. Fox's portrayal of Keaton walked a fine line, as he sought to make him both ridiculous and sympathetic, revealing the human being beneath the caricature.
Fox's success on television opened the door for him to take the lead role of Marty McFly in the hit science-fiction movie Back to the Future (1985). Again, Fox's wry humor was essential to the success of the film, which appealed to a broader audience than the average time-travel movie by refusing to take its own science-fiction premise too seriously. The film's success made sequels inevitable, but after repeating the role of McFly in Back to the Future Part II and Back to the Future Part III, Fox made a point of branching out into other types of performances—both comedic and dramatic—so as to avoid being typecast. The late 1980's also marked a major change in Fox's personal life, as he married Tracy Pollan—who had played his girlfriend on Family Ties—on July 16, 1988. They would subsequently have four children together. Impact
Michael J. Fox’s portrayals of Alex P. Keaton and Marty McFly created two of the most memorable and culturally resonant characters of the 1980’s. He continued a successful career as an actor. Although he refused to become typecast and sought a variety of roles, he remained best known for his comedic performances. Subsequent Events In 1998, Fox announced publicly that he had been diagnosed with Parkinson’s disease in 1991. Thereafter, his career began to be eclipsed by his struggles with the disease, a neurological disorder that slowly and progressively robs its victims of the ability to control their voluntary movements. Fox’s high profile as a successful actor allowed him to become a leading advocate of embryonic stem-cell therapy, a controversial technology that promised the ability to rebuild ravaged nervous systems, but at the price of destroying early-stage human embryos, which many people considered to be full-fledged human beings with the same moral rights and status as humans at any other stage of life.
Further Reading
Fox, Michael J. Lucky Man: A Memoir. New York: Random House, 2002. In his own words, Fox discusses both his successes and disappointments. A very readable primary source.
Wukovitz, John F. Michael J. Fox. San Diego, Calif.: Lucent Books, 2002. A basic and readable biography, hitting the high points of Fox's career and life.
Leigh Husband Kimmel
See also Back to the Future; Family Ties; Film in the United States; Science-fiction films; Television.
■ FOX network Identification
American broadcast television network
Date Debuted in 1986
FOX network was the first successful new broadcast television network in decades. Its success demonstrated that ABC, CBS, and NBC, the Big Three networks, no longer monopolized American television audiences.
Australian billionaire Rupert Murdoch owned the international media conglomerate News Corporation. In 1985, with expansion in mind, Murdoch completed the purchase of Twentieth Century-Fox Studios, acquiring its large film library and television studio. Later that year, he purchased six independent major-market television stations in New York, Los Angeles, Chicago, Washington, D.C., Dallas-Fort Worth, and Houston, pending the approval of the Federal Communications Commission (FCC).
Regulation and the State of the Industry
Federal regulations under the FCC placed a cap on the number of major-market television stations a single company could own, and a separate set of regulations restricted a broadcaster’s ownership of programming. However, the administration of President Ronald Reagan was opposed to government regulation of business in general, and under Reagan, the FCC was instructed to restrict commerce as little as possible. In 1986, the FCC approved Murdoch’s television acquisitions and loosened the regulations regarding programming ownership for News Corporation. Murdoch overcame a third restriction against foreign
ownership of U.S. media by becoming a citizen of the United States. Murdoch and News Corporation intended to start a fourth broadcast network to challenge the three that had dominated U.S. broadcasting for thirty years. By the mid-1980's, the traditional broadcast networks—the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), and American Broadcasting Company (ABC)—had become vulnerable to such a challenge. Their viewers had recently acquired several viewing alternatives, including basic cable television, premium services such as pay TV, and prerecorded material played on videocassette recorders (VCRs) and other such devices. As a result, Americans' network loyalties and viewing habits were in a state of flux.
The communications industry was also in flux. In 1986, Capital Cities Communications acquired ABC. General Electric acquired Radio Corporation of America (RCA), the parent company of NBC. By fall of 1986, the Loews Corporation assumed control of CBS. While the ownership of the older broadcast networks changed, their attitudes about a potential fourth network did not. They were dismissive of Murdoch's plans for a FOX network. The initial FOX network affiliates were weaker, ultrahigh-frequency (UHF) stations. They reached a mere 22 percent of households with televisions. Moreover, previous attempts at creating a fourth broadcast network had failed. The old networks did not accurately assess the changing television marketplace.
Founding a New Network
When Murdoch bought Twentieth Century-Fox Studios, Barry Diller was its head. Diller had revitalized Paramount Studios with hit movies such as Raiders of the Lost Ark (1981), Indiana Jones and the Temple of Doom (1984), and Beverly Hills Cop (1984). As head of ABC in the 1970’s, Diller had improved the network’s ratings, moving it up from its habitual third-place standing to first place. When he moved to Paramount, Diller oversaw production of highly rated television comedies such as Laverne and Shirley, Taxi, and Cheers. Diller shared Murdoch’s commitment to creating a viable fourth broadcast network, and, with his vast television experience, he was the ideal executive to build FOX. Diller put together a team of young executives, some poached from rival networks. They were eager to build a new network, and they set about two central tasks in that regard. The first task was to expand
the network's reach by adding affiliate stations. By the time FOX went on the air in 1986, almost ninety affiliates had been added to the network. The second task was to design a programming strategy that would attract viewers and cultivate an audience. The fledgling network's research indicated that television consumers were frustrated with stodgy, familiar programming. FOX therefore developed a strategy to produce innovative and edgy shows that would appeal to a young audience that was no longer watching broadcast network television. FOX initially chose a name—the Fox Broadcasting Company, or FBC—in keeping with the three-letter acronyms of the older networks.
In 1986, the FOX network made its first foray into original programming, a late-night talk show hosted by Joan Rivers. Rivers was well known as the permanent guest host for Johnny Carson on The Tonight Show. The Late Show with Joan Rivers premiered in September of 1986. Despite initial interest, audiences soon migrated back to The Tonight Show, and The Late Show with Joan Rivers was off the air seven months after its premiere. News Corporation had lost millions of dollars on FOX, but Murdoch remained determined to find a successful strategy for the network.
In 1987, FOX aired its first original prime-time programming, beginning with weekend shows. The network planned to add one night of new programming with each new season, keeping the total number of nights below the number beyond which FOX would officially be designated as a regulated network by the FCC. The low number of hours per week of programming would also help the network minimize costs for producers, writers, and directors. The initial Sunday night schedule included Married . . . with Children, a vulgar family comedy, and The Tracey Ullman Show, a variety show starring British comedian Tracey Ullman. Cartoonist Matt Groening produced an animated segment for The Tracey Ullman Show called The Simpsons. FOX also aired It's Garry Shandling's Show, an irreverent comedy in which a self-mocking Garry Shandling played the host of a long-running late-night talk show. Veteran producer Stephen J. Cannell provided 21 Jump Street, a police youth drama that focused on young officers who went undercover in high schools. The show starred Johnny Depp. In the next two years, FOX added America's Most Wanted and COPS, early entries in a new format that
became known as reality programming. America's Most Wanted profiled criminals then at large and encouraged the audience to identify the criminals and help bring them to justice. Successful identifications led to arrests that were filmed and broadcast on subsequent episodes. America's Most Wanted was the first FOX show to break into the Nielsen ratings' top fifty television shows. COPS sent video crews out on patrol with real police officers, filming their encounters and pursuits. FOX also produced A Current Affair, a syndicated news show that featured tabloid-style journalism. In 1989, FOX spun off the animated segment The Simpsons as a half-hour series. The show was a genuine hit and became the anchor for the network's Sunday night schedule. After just three years, the fourth network was on its way to building a brand and a loyal audience of young viewers. Impact
The FOX network demonstrated that media corporations beyond the Big Three networks could own multiple stations and maintain ownership of content. Following FOX's success, Time Warner would create the WB network, and Paramount would create UPN. Along with MTV, FOX pioneered the reality show, which would become a significant genre for most broadcast and cable channels. FOX also became an innovator in targeting young audiences, at a time when the advent of cable television and narrowcasting was making such targeted programming more important than it had been in previous decades. As a result, FOX was able to attract significant advertising dollars, despite its smaller broadcast footprint. However, the type of programming FOX used to reach younger audiences caused some critics to complain that it was coarsening American culture in the name of ratings. Nevertheless, the other networks were forced to rethink their relationship to the youth audience in order to compete with FOX.
Further Reading
Auletta, Ken. Three Blind Mice: How the TV Networks Lost Their Way. New York: Random House, 1991. Details the factors that led to the precipitous decline of broadcast network viewership. Excellent behind-the-scenes descriptions.
Baker, William F., and George Dessart. Down the Tube: An Inside Account of the Failure of American Television. New York: Basic Books, 1998. Critical examination of the corporatization of American television.
Hack, Richard. Clash of the Titans: How the Unbridled Ambition of Ted Turner and Rupert Murdoch Has Created Global Empires That Control What We Read and Watch. Beverly Hills, Calif.: New Millennium Press, 2003. Thorough examination of two extraordinary men. The book details the origins of CNN and the FOX network.
Kimmel, Daniel M. The Fourth Network: How Fox Broke the Rules and Reinvented Television. Chicago: Ivan R. Dee, 2004. Presents the history of the FOX network through 2000 with accounts by insiders.
Nancy Meyer
See also Advertising; Cable television; Demographics of the United States; Married . . . with Children; Television.
■ Full Metal Jacket Identification Vietnam War film Director Stanley Kubrick (1928-1999) Date Released June 26, 1987
Touted by some as "the best war movie ever," Stanley Kubrick's Full Metal Jacket stirred critics and war veterans alike. Its violent realism, loose narrative structure, and vision of madness in war influenced filmmakers and audiences long after the 1980's.
The United States became involved in the conflict in Vietnam in 1955. In 1963, U.S. military advisers in the country numbered 16,000; by 1966, more than 200,000 U.S. soldiers were stationed there. The Communist Tet Offensive early in 1968, portrayed in Full Metal Jacket (1987), signaled the end of U.S. strength and resolve in the region. The final troops withdrew in March, 1973. The nation continued to process its experience of the war long after it ended. Books and essays provided history, cultural analysis, and societal introspection about the conflict. A spate of films—realistic, surrealistic, fictional—appeared in the late 1970's, only five years after the end of the war.
Stanley Kubrick had already made what for many was the definitive World War I film, Paths of Glory (1957), when he chose to make another war film, this one portraying the conflict in Vietnam. He decided to base his film on a war novel, The Short-Timers (1979), by Gustav Hasford. This work had not received much public mention, yet it impressed the director. For years, he worked on a screenplay with the author,
along with the publicly recognized veteran journalist Michael Herr. The relationship between these men shaped Kubrick's vision of what a war movie should be. This vision involved a rigorous realism, as Kubrick portrayed a Marine boot camp on Parris Island, South Carolina; the Tet Offensive; and the battle for the city of Hue—the major parts of the film. The film takes a great part of its meaning from its juxtaposition of basic training with active combat, as it is unclear which is worse. The stock characters—bookish, oafish, macho—become trained killing machines, as their drill instructor (played by R. Lee Ermey) dehumanizes them and strips them of their individuality to prepare them for war. The trainees become incorporated into the squad, the company, and the machinery of war. After an incident of extreme violence at Parris Island, they ship out to Vietnam, where more violence awaits amid the destroyed palm trees, the killing fields, and the ruins of the Holy City of Hue, where a chaotic battle with a sniper marks the second denouement of the film. Kubrick juxtaposes horror with irony to capture the madness of war, and the surviving Marines march away from Hue singing The Mickey Mouse Club theme song.
R. Lee Ermey as Gunnery Sergeant Hartman, the new recruits' abusive drill instructor, in Full Metal Jacket. (AP/Wide World Photos)
Impact As in Paths of Glory, Kubrick's portrayal of war in Full Metal Jacket emphasized the extent to which soldiers find themselves simultaneously under attack—albeit in very different ways—both by enemy troops and by their own military. By capturing this predicament, he decisively altered the history of war films. Careful to give voice, through onscreen interviews, to a broad spectrum of political viewpoints on Vietnam, he nevertheless crafted those viewpoints into a very personal vision of the nature of war and of military service. As one of society's most biting critics, profound visionaries, and fierce satirists, Kubrick produced perhaps the most compelling vision of war since Robert Altman's M*A*S*H (1970).
Further Reading
Belton, John. "The War Film." In American Cinema/American Culture. New York: McGraw-Hill, 1994.
Giannetti, Louis. Understanding Movies. 10th ed. Englewood Cliffs, N.J.: Prentice-Hall, 2004.
Hasford, Gustav. The Short-Timers. New York: Harper, 1979.
Herr, Michael. Dispatches. New York: Knopf, 1977.
Mason, Bobbie Ann. In Country. New York: Harper, 1985.
Melling, Philip H. Vietnam in American Literature. Boston: Twayne, 1990.
O'Brien, Tim. The Things They Carried. New York: Houghton, 1990.
James F. O'Neil
See also Academy Awards; Action films; Film in the United States; Platoon; Stone, Oliver; Vietnam Veterans Memorial.
G
■ Gallagher Identification
American stand-up comic and entertainer
Born July 24, 1947; Fort Bragg, North Carolina
Thanks to manic performances that revitalized conventional stand-up comedy, a tireless touring schedule, and a savvy use of cable programming, Gallagher became one of the decade's most recognized comics.
Emerging during a time when stand-up comedians such as David Letterman, Jerry Seinfeld, Garry Shandling, and Jay Leno routinely offered wry commentaries on quirky everyday experiences, Gallagher energized his stage show by incorporating physical comedy into his performances. Taking a page from Lily Tomlin's book, Gallagher performed on a stage designed to resemble a children's room, with oversized furniture and toys whose scale suggested that Gallagher himself was a child. He engaged his audience while seated on a massive Big Wheel or in a huge high chair. His performance roots were in rock and roll (he was a roadie for several groups in the early 1970's), giving him a taste for the outrageous. Sporting what would become his signature outfit—loud suspenders and a bowler with tangles of red stringy hair—Gallagher used comedy to reveal the hypocrisies of American materialism; the superficiality of the television generation; the perplexing implications of gender; the incompetence of politicians; and the paradoxical nature of language itself. A 1969 graduate of the University of South Florida, he admitted the influence of the counterculture on both his look and his avant-garde show.
Although his show's prevalent themes reflected a long tradition of using comedy as a vehicle for social and cultural criticism, Gallagher's outrageous use of props, most notably food, gave his shows a hip, subversive feel. Indeed, he became best known for whacking objects with an oversized wooden mallet, dubbed the Sledge-o-Matic, a bit that parodied kitchen products sold through television infomercials.
In this routine, Gallagher positioned on a slab objects ranging from Big Macs to dead fish to computer parts and then gleefully smashed them, a not-too-subtle commentary on the materialism of America's culture. As the routine became his signature, audiences nearest the stage would be provided protective gear in anticipation of flying refuse—most notably from his inevitable closer, a watermelon.
Because of his reliance on props and the apparently juvenile nature of his act, critics often dismissed his comedy as anti-intellectual. He was seldom asked to appear on late-night network television. Rather, he earned his living touring, and at his peak in the mid-1980's, he performed more than one hundred shows per year. Moreover, his inability to find a place on broadcast network television led Gallagher to become one of the first comedians to embrace the lucrative possibilities of cable television, producing more than a dozen specials for the premium channel Showtime during the decade.
Impact Gallagher's unapologetic use of props, his interactive stage show, and his edgy development of visual comedy within a stand-up act created a defiant counterargument to the laid-back stylings of most of the decade's comedians, who, in intimate club surroundings, would share understated, acerbic insights. Gallagher's comedy, though, had a point: His playful antics—especially his gargantuan sense of overstatement drawn from rock and roll, vaudeville slapstick, and the visual silliness of classic cartoons—raised unsettling questions about the shallowness and hypocrisy of Ronald Reagan's America.
Further Reading
Double, Oliver. Getting the Joke: The Inner Workings of Stand-Up Comedy. London: Methuen, 2005.
Gallagher: Comedy Legend. http://gallaghersmash.com.
Limon, John. Stand-Up Comedy in Theory: Or, Abjection in America. Durham, N.C.: Duke University Press, 2000.
Joseph Dewey
See also Business and the economy in the United States; Cable television; Comedians; Consumerism; Herman, Pee-Wee; Reagan, Ronald; Television.
■ Gallaudet University protests The Event
Weeklong uprising by students demanding the appointment of a deaf university president
Date March 6-13, 1988
Place Gallaudet University, Washington, D.C.
Students from Gallaudet University, a school for the deaf, staged protests to demand that the next president of their institution be deaf as well. The protests brought new visibility to deaf activists and brought deaf leadership to an institution founded for deaf individuals.
Understanding the 1988 Gallaudet University protests, also known as the Deaf President Now uprising, requires knowledge of the institution's role in deaf culture and history. In 1864, Edward Gallaudet founded what was then called the Columbia Institution for the Education of the Deaf and Dumb and Blind, in Washington, D.C. Gallaudet's four-year liberal-arts curriculum was designed for deaf and hard-of-hearing students and used sign language for communication and instruction. Persons who identified as culturally Deaf (the upper-case spelling is intentional within this group) came to know Gallaudet for affirming the values of pride, solidarity, and survival in the "hearing world." For most of the university's history, however, deaf people were excluded from consideration for the position of president of Gallaudet.
This situation changed dramatically in March, 1988, when students, alumni, and faculty drew international attention with eight days of nonviolent direct action. A search for a new university president had been in its late stages at that point, and expectations that Gallaudet's board of trustees would choose a deaf candidate were high. The selection of hearing candidate Elisabeth Zinser therefore ignited uncompromising resistance among the student body. In addition to demanding Zinser's resignation, the Deaf President Now agenda called for the board of trustees to seat a majority of deaf persons among its members
and demanded that no reprisals be made against the protesters.
Gallaudet's activists used a variety of methods and found a variety of allies. Protesters blocked university entrances with vehicles and used their bodies to disrupt and ultimately take over the campus. The key tactic was reliance on sign language to organize demonstrators, make and implement plans, and frustrate hearing authorities (including police officers). In this environment, deafness and sign language were advantages, while speaking and hearing became disabilities. Support from political leaders as different as George H. W. Bush and Jesse Jackson was another unique aspect of the Gallaudet protests. Labor-union members and business owners provided monetary and material resources, but the discipline and determination of Gallaudet's students were the decisive factors. On March 10, Zinser resigned, but demonstrators continued pressing, with a rally at the U.S. Capitol, for the rest of their demands to be met. Gallaudet alumnus I. King Jordan became the university's first deaf president on March 13, and the entire Deaf President Now agenda was implemented.
Impact The Deaf President Now uprising challenged public perceptions of deaf people and sign language. Gallaudet's protesters demonstrated that an inability or refusal to communicate on the hearing world's terms need not be disabling or futile. Most observers consider the Gallaudet protests part of the disability rights movement, but there is resistance among some deaf activists and advocates to equating deafness and disability.
Further Reading
Christiansen, John B., and Sharon N. Barnartt. Deaf President Now! The 1988 Revolution at Gallaudet University. Washington, D.C.: Gallaudet University Press, 1995.
Sacks, Oliver. Seeing Voices: A Journey into the World of the Deaf. Berkeley: University of California Press, 1989.
Ray Pence
See also Bush, George H. W.; Disability rights movement; Jackson, Jesse.
■ Gangs Definition
Subcultural groups of persons—often young persons—claiming allegiance to specific territories and often engaged in illegal activities
By the end of the 1980's, the United States had experienced a sharp rise in crime. Simultaneously, gang activity increased and became more lethal, particularly as a result of increases in gang-related homicides and drug trafficking.
Historically, gangs in the United States have been confined to urban areas, especially New York, Philadelphia, Los Angeles, and Chicago. However, throughout the 1970's and 1980's, gangs spread to other urban areas and even to smaller cities in nearly all fifty states. Gangs of the 1980's were associated with violence and crime. The crack epidemic of the 1980's allowed gangs to support themselves economically, while gang members infiltrated the drug markets selling other hard drugs such as powder cocaine, PCP, and heroin. African American gangs, especially the Crips and the Bloods, began concentrating on selling crack cocaine, and drug trafficking provided young gang members with the opportunity to make significant amounts of money.
Bloods and Crips of Los Angeles Two of the largest gangs in America, both of which had gained "super gang" status by the 1980's, were the African American gangs the Bloods and the Crips. Both were based in Los Angeles, California. The Crips were the first to form, rising to power in the late 1960's. The Crips' members were younger than most other gang members and were notoriously violent. They held little regard for life or property and, consequently, engaged in a wide spectrum of violent crimes. They terrorized entire neighborhoods, leaving residents afraid to leave their homes after dark, and they were eventually blamed for the record-breaking crime rate in South Central Los Angeles. They adopted the color blue and began wearing blue and white bandanas to signify their allegiance. During the 1980's, the Crips earned celebrity status, when rap star Snoop Dogg (Calvin Broadus) began glorifying gang life as a Crip in his music. This type of music, which became known as "gangsta rap," helped spread the popularity of gangs, because it portrayed gangs as offering impoverished adolescents a chance to thrive in their inner-city environments.
In opposition to the Crips and for self-protection, a band of juveniles living on Piru Street in Compton, a city in the greater Los Angeles area, formed a gang called the Compton Piru Street Boys, which later became known as the Bloods. They chose red bandanas and quickly gained recognition as an opposing force to the Crips. Other local gangs joined the fight, and the Bloods grew rapidly, becoming a more unified opponent for the Crips. The Bloods remained a relatively small gang compared to the Crips, but they became violent and powerful enough to survive and even thrive as a legitimate threat to the latter gang. By the 1980's, nearly every predominantly African American neighborhood in Los Angeles was dominated by either the Bloods or the Crips, and both gangs were perceived by local law enforcement as serious threats.
People and Folks of Chicago A war similar to the war between the Crips and Bloods in Los Angeles erupted in Chicago. The Chicago war was between the People Nation and the Folk Nation. By the end of the 1980's, nearly every other gang in Chicago claimed affiliation with one or the other, and both gangs gained strength, recognition, and unity as legitimate gangs. The People Nation traced its heritage back to a gang known as the Blackstone Rangers, which first formed in the 1960's. The Folk Nation first began as the Black Gangster Disciples and also formed in the 1960's. Unlike the Crips and Bloods, which were predominantly African American and very resistant to admitting members of other racial and ethnic groups, the People and Folks were much more racially diverse. Also, many of the smaller regional gangs that paid national allegiance to the People or the Folks actually went by another name altogether, unlike the regional gangs of the Crips and Bloods, which almost always retained their affiliated gang name. Thus, the People and Folks acted as umbrellas that encompassed many gangs underneath them, and it was not uncommon for regional gangs to switch allegiance from the Folks to the People or vice versa. Also unlike the Crips and Bloods, which reached nationwide, the People and Folks were located mainly in the Midwest and the East. The Folks allied themselves with the Crips to offset an alliance between the People and the Bloods. While the Crips and Bloods wore blue or red to identify themselves, members of the People and the Folks instead
developed a tradition of signaling their allegiance by decorating one side of their bodies more than the other. The Folks identified themselves by emphasizing the right side of their bodies with their clothing, jewelry, and so on. Their members might wear baseball caps facing toward their right sides or roll up their right pant legs, for example. Similarly, the People identified themselves by emphasizing the left side of their bodies. Impact
By the 1980's, the Crips had become one of the most powerful gangs in Los Angeles, with a thriving cocaine business. Once they realized that they could maximize their profits by exploiting the crack epidemic, they sought to control the cocaine market, a drive that propelled the gang's spread from one coast to the other. They established smaller chapters in other cities and in rural areas, reaching from the Midwest to the East Coast. In response to the spread of the Crips to all corners of the country, the Bloods also successfully spread to nearly all fifty states. Throughout the 1980's, the Bloods began to dominate the East Coast, becoming the largest gang in New York City. The Bloods also exploited the crack epidemic of the 1980's, establishing a lucrative crack trade rivaling that of the Crips.
Further Reading
Delany, Tim. American Street Gangs. Upper Saddle River, N.J.: Pearson Prentice-Hall, 2006. Comprehensive description of gangs, discussing the different types that exist, why they form, and how law enforcement copes with the increasing threat.
Hagedorn, John. People and Folks: Gangs, Crime, and the Underclass in a Rustbelt City. Chicago: Lake View Press, 1988. Two gangs from Los Angeles and Milwaukee are portrayed as institutionalized entities in impoverished areas, where gang members move underground to perpetuate the underclass.
Huff, C. Ronald. Gangs in America. Thousand Oaks, Calif.: Sage, 1996. Collection of academic writings that address methodological issues in analyzing gangs, criminological and ecological factors used to explain gangs, the socioeconomics of gang operations, and behavioral aspects of ethnicity and gender.
Kilby Raptopoulos
See also African Americans; Crack epidemic; Crime; Hip-hop and rap; Organized crime; Slang and slogans.

■ Garneau, Marc Identification Canada's first astronaut
Born February 23, 1949; Quebec City, Canada
Garneau, the first Canadian to fly in space, made Canadians more aware of the accomplishments of their space program, paving the way for Canadian participation in the International Space Station.
Canada and the United States have cooperated in space exploration since the beginning of the Space Age. Canada's first satellite, Alouette 1, was launched on a U.S. rocket in 1962. The development of the space shuttle, which could carry seven astronauts into orbit, provided an opportunity to extend this cooperation to human space travel. In December, 1983, Marc Garneau was one of six Canadians selected from about four thousand applicants for astronaut training.
Astronaut Marc Garneau prepares for a space shuttle mission in 1996. (NASA-KSC)
Garneau had earned a bachelor of science degree in engineering physics from the Royal Military College in Kingston, Ontario, in 1970, and a doctorate in electrical engineering from the Imperial College of Science and Technology in London, England, in 1973. He then joined the Royal Canadian Navy, serving in various engineering roles until his selection as an astronaut. Canada's astronauts began to be trained by the National Aeronautics and Space Administration (NASA) at the agency's Johnson Space Center in Houston, Texas, in February, 1984.
Garneau became the first Canadian in space, flying as a payload specialist on the space shuttle Challenger from October 5 to 13, 1984. His major responsibility as a payload specialist was to conduct experiments, and on this mission he operated CANEX-1, a package of Canadian experiments focused on the human body's responses to being in space, including the sensitivity of nerve endings and motion sickness. Because he lacked training as a mission specialist (an astronaut whose primary responsibility is the operation of spacecraft systems), Garneau could not operate the Shuttle Remote Manipulator System (SRMS), a robotic arm developed and built in Canada that flew on all shuttle missions. After successfully completing his shuttle mission, Garneau left the Navy in 1989 to become deputy director of the Canadian Astronaut Program, part of the newly formed Canadian Space Agency.
Impact Garneau's role as the first Canadian in space was overshadowed in the United States, because his mission was also the first on which two women astronauts, Sally Ride and Kathryn Sullivan, flew into orbit together. Garneau's flight received widespread attention in Canada, however. It made Canadian citizens more aware of their country's efforts in space exploration and paved the way for Canada's role in the International Space Station (ISS). Subsequent Events
Garneau returned to the Johnson Space Center to train as a mission specialist in August, 1992. He worked as a capsule communicator for several shuttle flights before being selected to fly on Endeavour from May 19 to 29, 1996. On that mission, Garneau used the SRMS to retrieve a satellite called SPARTAN and return it to the shuttle’s payload bay. Garneau flew his third shuttle mission on Endeavour from November 30 to December 11,
2000. He used the SRMS to install solar panels on the ISS and became the first Canadian to enter the ISS crew area. Garneau was appointed executive vice president of the Canadian Space Agency in February, 2001. He became its president in November of the same year. Further Reading
Dotto, Lydia. The Astronauts: Canada's Voyageurs in Space. Toronto: Stoddart, 1993.
_______. Canada in Space. Toronto: Irwin, 1987.
George J. Flynn
See also Canada and the United States; Science and technology; Space exploration; Space shuttle program.
■ Gehry, Frank Identification Canadian American architect Born February 28, 1929; Toronto, Ontario
Modernist architect Frank Gehry emphasized the artistic aspects of architecture and influenced architects to approach buildings as sculptural objects.
Frank Gehry, born Ephraim Owen Goldberg, grew up in Toronto and Timmins in Canada. In 1947, his family moved to Los Angeles, and he subsequently became a naturalized American citizen. In 1954, he graduated from the University of Southern California (USC). He had worked as an intern with Victor Gruen Associates while still in school and joined the firm as a full-time architect after earning his degree. The draft interrupted his career, however, and Gehry spent a year in the U.S. Army. He then studied urban planning at Harvard University Graduate School of Design. Returning to Los Angeles with his degree, Gehry worked briefly for Pereira and Luckman before rejoining Gruen. He stayed at Gruen until moving in 1961 to Paris to join André Remondet for a year. In 1962, Gehry opened his own firm, Frank Gehry LLP, in Los Angeles.
Gehry had artist friends, such as Claes Oldenburg and Jasper Johns, who used inexpensive materials like broken wood and paper to create beautiful works of art. Dissatisfied with the buildings that he was creating, Gehry decided to apply the artistic techniques of his friends to the construction of buildings. When asked for his architectural
influences, he named sculptor Constantin Brancusi before listing architects Alvar Aalto and Philip Johnson. Gehry's work became known for its unfinished quality. His buildings were characterized by a reliance on harsh, unfinished materials such as chain link, exposed pipe, corrugated aluminum, and utility-grade construction board. He typically juxtaposed such materials with simple geometric forms. By the end of the 1980's, Gehry's designs included the Merriweather Post Pavilion of Music in Columbia, Maryland, and the Aerospace Museum in Exposition Park in Los Angeles. Santa Monica Place, a large shopping mall, included an outside wall that was three hundred feet long and six stories tall, hung with a curtain of chain link. Shaped like a fish, the Fish-Dance Restaurant in Kobe, Japan, became one of the architect's most discussed buildings. Gehry received one of his most important commissions, the $100 million Walt Disney Concert Hall for the Los Angeles Music Center, in early 1989. Four months later, on May 18, 1989, Gehry won the Pritzker Architecture Prize for his lifetime body of work. An international award, it is the most prestigious honor in architecture. Gehry, the twelfth Pritzker laureate, accepted the honor at the Todai-ji Temple in Nara, Japan.
The Fish-Dance Restaurant in Kobe, Japan, became one of architect Frank Gehry’s most talked about buildings. (663highland/cc-by-a-2.5)
Impact Gehry made his mark in architecture by using simple materials to solve the complex problems of designing structures that were safe, useful, respectful of context, and pleasing to a client. His structures bridged the line between art and architecture, and they came to define for many observers the architectural style of modernism.
Further Reading
Friedman, Mildred, ed. Gehry Talks: Architecture + Process. New York: Universe, 2002.
Reid, Dennis, ed. Frank Gehry. Toronto: Art Gallery of Ontario, 2006.
Stungo, Naomi. Frank Gehry. London: Carlton, 2000.
Caryn E. Neumann
See also Architecture.
■ Gender gap in voting Definition
The difference in the percentage of men and the percentage of women voting for a given candidate or issue in an election
A trend differentiating the voting patterns of men and women emerged and came to be acknowledged in the 1980's. As a result of this so-called gender gap, political campaigns adopted new strategies designed specifically to attract women's votes.
When women received suffrage in 1920, many scholars believed that female voters would immediately unite and vote as a bloc, wielding significant political influence. However, until 1980, women voters—though voting in smaller numbers than men—largely cast votes that followed the same patterns as men's votes in any given election. In the 1976 presidential election, for example, women and men voted for Democrat Jimmy Carter or Republican Gerald Ford in almost identical proportions. The 1980 presidential election, however, produced a dramatic split between male and female voters, as women supported liberal views and voted for Democratic candidates more than men did. This split was labeled "the gender gap." Following 1980, this gender gap continued, although it would vary in its strength and effects in subsequent elections. Voting Patterns
Throughout the 1980’s, the gender gap in voting patterns existed across all other major demographic categories, including marital status, race, age, education, and income. The gap was consistent and often had a decisive impact on election outcomes during the decade. As women’s political participation increased and their voting rates became comparable to men’s, a gender gap in party identification emerged. Women identified with the Democratic Party, while men shifted their allegiance to the Republican Party. Women, more than men, held and supported more liberal and activist views consistent with the Democratic positions on issues such as the use of force, violence, feminism, race, and the proper role of government.
Statistics in Presidential Elections In 1980, women preferred Ronald Reagan to Jimmy Carter, but by a smaller margin than did men. Among women voters, 45 percent voted for Carter, 46 percent voted for Reagan, and 7 percent voted for John Anderson. Among men, 37 percent voted for Carter, 54 percent for Reagan, and 7 percent for Anderson. Thus, while a plurality of women and a majority of men voted for Reagan, a gender gap of 8 percentage points was evident between the major parties. In 1984, even with a female candidate for vice president running on the Democratic ticket, a majority of both sexes again voted for Reagan, but the gender gap persisted. Men favored Reagan over Walter Mondale 63 percent to 36 percent, while women only supported Reagan by 56 percent to 44 percent. Women were thus 8 percentage points more likely than men to vote for Mondale. In the 1988 presidential race, George H. W. Bush received 57 percent of the men's vote and only 50 percent of the women's vote, while Michael Dukakis captured 49 percent of women's ballots and only 41 percent of men's. Almost half of women preferred Dukakis, the Democratic candidate, while a majority of men voted Republican.
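These gap figures follow from a simple subtraction: the difference, in percentage points, between women's and men's support for the same candidate. The short Python sketch below applies that calculation to the Democratic vote shares quoted in this section; it is an illustrative computation only, not part of the original exit-poll analyses.

    # Gender gap measured as the percentage-point difference between
    # women's and men's support for the Democratic candidate.
    # The vote shares are the figures cited in the text above.
    elections = {
        1980: ("Carter", 45, 37),   # (candidate, % of women, % of men)
        1984: ("Mondale", 44, 36),
        1988: ("Dukakis", 49, 41),
    }

    for year, (candidate, women, men) in elections.items():
        gap = women - men
        print(f"{year}: {candidate} drew {women}% of women and {men}% of men "
              f"-- a gender gap of {gap} percentage points")

In each of the three elections, the subtraction yields the eight-point gap described above.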
The Gender Gap in the 1980's
A gender gap in voters' political party identification became evident in the early 1980's. Larger proportions of women considered themselves to be Democrats, while more men identified themselves as Republicans, as reflected in these poll statistics:

Date           Democrats        Republicans
               Women   Men      Women   Men
June, 1983      43%    32%       21%    25%
April, 1984     40%    37%       28%    31%
May, 1985       38%    30%       31%    28%
June, 1986      40%    35%       29%    28%
May, 1987       44%    35%       30%    31%
May, 1988       41%    32%       29%    31%
June, 1989      36%    32%       31%    31%
May, 1990       38%    28%       30%    32%

Source: CBS/The New York Times and the Center for American Women and Politics.
Explanations for the Gender Gap
An interplay of social, economic, and psychological factors produced the gender gap. Most explanations for the gender gap in the 1980 election centered on the Republican Party's abandonment of support for the Equal Rights Amendment and Reagan's conservative policy agenda. Many claimed that the feminist movement contributed to the gender gap by publicizing differences in male and female perspectives about social issues and by encouraging women to vote.
Others proposed economic explanations for the gender gap, suggesting it related to differences in socioeconomic status between men and women. In the 1980 election, however, women from higher income categories voted similarly to those with lower incomes. Overall, however, women may have seen themselves as more economically vulnerable than men, even when they did not occupy a lower income category. Women tended to vote on issues pertaining to the economy, reflecting a national concern and a goal of establishing equality, while men tended to engage in pocketbook voting based upon their personal interests rather than those of the country.
The nurturance perspective argues that many women are taught to be more compassionate than are many men. Women, thus, may view politics through the lens of a caregiver and vote accordingly. As compared to men, women tend more consistently to support reducing military involvement, enforcing fewer criminal penalties, taking more actions to protect the environment, and spending more for social programs.
Women's rising labor-force participation has also been suggested as an explanation for the gender gap. As more women entered the labor force, the gap widened. Employed women often engaged in more political discussions and policy debates, and they often received more information about candidates, than did those whose lives focused more consistently on the domestic sphere. Labor-force participation may also have provided women with opportunities to challenge traditional gender expectations and exposed them more directly to gender inequalities in the form of lower wages. These experiences would both have shaped women's political opinions and increased their tendency to see themselves as wielders of power in the public sphere.
Impact The gender gap forced politicians to recognize women’s power as a political interest group and raised awareness of women’s issues such as abortion
rights, the use of force, and environmental protection. Although it did not affect the outcome of presidential races in the 1980’s, the gender gap may have aided the Democratic Party in retaining control of the House of Representatives in the face of the Reagan Revolution, and it forced both parties to modify their platforms in order to court women voters. Further Reading
Abzug, Bella. Gender Gap. Boston: Houghton Mifflin, 1984. Gives an in-depth view of the 1980 election and the gender gap.
Center for the American Woman and Politics. Eagleton Institute of Politics, Rutgers University. http://www.cawp.rutgers.edu. Provides a comprehensive summary of statistics regarding women and politics.
Chaney, Carole Kennedy, et al. "Explaining the Gender Gap in U.S. Presidential Elections, 1980-1992." Political Research Quarterly 51, no. 2 (1998): 311-339. Examines the gender gap using theories regarding the different issue emphases of the two genders.
Howell, Susan E., et al. "Complexities of the Gender Gap." The Journal of Politics 62, no. 3 (August, 2000): 858-874. Relates the gender gap to women's cultural roles and increased autonomy from men.
Inglehart, Ronald, et al. "The Gender Gap in Voting and Public Opinion." In Rising Tide: Gender Equality and Cultural Change Around the World. New York: Cambridge University Press, 2003. Thorough overview of the development of the gender gap from a cross-cultural perspective.
Manza, Jeff, et al. "The Gender Gap in U.S. Presidential Elections: When? Why? Implications?" The American Journal of Sociology 103, no. 5 (March, 1998): 1235-1266. Provides a good historical overview of the gender gap and compares multiple theories of why it exists.
Norrander, Barbara. "The Evolution of the Gender Gap." Public Opinion Quarterly 63 (1999): 566-576. Discusses the implications of independent voters for the gender gap.
Smeal, Eleanor. Why and How Women Will Elect the Next President. New York: Harper & Row, 1984. A feminist view of the gender gap by a former president of the National Organization for Women.
Wirls, Daniel. "Reinterpreting the Gender Gap." Public Opinion Quarterly 50, no. 3 (Autumn, 1986):
316-330. Assesses how changing party preferences contributed to the gender gap.
Barbara E. Johnson
See also Bush, George H. W.; Conservatism in U.S. politics; Dukakis, Michael; Elections in the United States, midterm; Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988; Ferraro, Geraldine; Liberalism in U.S. politics; Mondale, Walter; Reagan, Ronald; Reagan Democrats; Reagan Revolution; Reaganomics.
■ General Hospital Identification American soap opera
Date Premiered April 1, 1963
Once struggling in the ratings, General Hospital introduced innovative characters and plotlines during the late 1970's and early 1980's that made it the most popular American soap opera. Its newfound success made the show influential as well, as competitors began to copy its winning formula.
Created by Frank Hursley and Doris Hursley, General Hospital debuted on the American Broadcasting Company (ABC) television network on April 1, 1963. Set in the fictional town of Port Charles, New York, the show was scheduled as a half-hour daytime drama. Initially, most of the story lines took place at the Port Charles Hospital and focused on the lives of Dr. Steve Hardy (John Beradino) and Nurse Jessie Brewer (Emily McLaughlin). Only marginally successful, the show was threatened with cancellation by network executives in 1978.
In an attempt to save the struggling soap opera, ABC hired Gloria Monty as its new executive producer that same year, and the show was expanded to an hour-long format. Under Monty's guidance and with the writing leadership of Douglas Marland, the characters of Laura Webber (Genie Francis) and Luke Spencer (Anthony Geary) were introduced to General Hospital fans. Luke and Laura's subsequent romance became enormously popular with viewers. The couple's wedding on November 16, 1981, was watched by approximately 30 million people, and the episode continues to hold the distinction of being the highest-rated episode in American soap opera history. The phenomenal success of Luke and
Laura's romance caused a neologism to be coined to describe them: "supercouple." Other supercouples on General Hospital that became extremely popular with fans during the 1980's were Robert Scorpio (Tristan Rogers) and Holly Sutton (Emma Samms), Duke Lavery (Ian Buchanan) and Anna Devane (Finola Hughes), and Frisco Jones (Jack Wagner) and Felicia Cummings (Kristina Wagner). Along with these power couples, the writers also introduced adventure plots into the show, which had previously been a medical drama. Many of the characters' story lines began to revolve around spy mysteries. In fact, the characters of Scorpio and Devane were introduced to viewers as international spies for the fictional organization the World Security Bureau (WSB). This combination of the supercouple and the action/adventure story line made General Hospital the number-one-rated soap opera between 1979 and 1988.
Anthony Geary and Genie Francis as Luke and Laura in General Hospital. (AP/Wide World Photos)
The series also won seven Daytime
Emmy Awards during the 1980’s, including two Emmys for Outstanding Drama Series, in 1981 and 1984. In 1982, Anthony Geary won the Daytime Emmy Award for Outstanding Lead Actor in a Drama Series for his portrayal of Luke Spencer. Impact
General Hospital introduced the action/adventure plot and the supercouple to daytime drama. The enormous popularity of both devices catapulted the once-struggling soap opera to the top of the Nielsen ratings. Equally important, the series became a leader in the industry, as competing shows attempted to model their story lines after those of General Hospital.
Further Reading
Simon, Ron, et al. Worlds Without End: The Art and History of the Soap Opera. London: Harry N. Abrams, 1997.
Spence, Louise. Watching Daytime Soap Operas: The Power of Pleasure. Middletown, Conn.: Wesleyan University Press, 2005.
Warner, Gary. General Hospital: The Complete Scrapbook. Toronto, Ont.: Stoddart, 1995.
Bernadette Zbicki Heiney
See also Dallas; Dynasty; Soap operas; Television.
■ Generation X Definition
The generation immediately following the baby-boom generation
Generation X came of age during the 1980's, the decade during which it first acquired a unique (though contested) cultural identity and the first decade to be shaped in part by the generation's goals and values.
Born between the early 1960's and the mid-1970's (the precise years are a matter of controversy), most members of Generation X experienced childhood or young adulthood during the 1980's. The popular culture and political events of the 1980's had profound effects on this particular segment of the population, and the decade was also defined in part by a cultural conflict between Generation X and the baby-boom generation that preceded it.
Generation X was affected by many social developments. Increased divorce rates led to single parents raising children alone, so many Generation X children split their time between the separate households of
father and mother. Even in traditional households, both parents were often employed. Divorce or parental employment frequently resulted in "latch-key kids," children who spent afternoons at home after school with no adult supervision.
The popularity of musical artists and groups including Madonna, Michael Jackson, Bon Jovi, Duran Duran, New Kids on the Block, and Run-D.M.C. reflected the diverse musical tastes of Generation X. One issue that affected Generation X's access to music was the requirement of parental advisory stickers on albums containing offensive material. Likewise, the creation of the PG-13 rating for films that were considered inappropriate for pre-teens affected Generation X more than any other segment of the population. Nevertheless, movies including E.T.: The Extra-Terrestrial and The Breakfast Club entertained and influenced Generation X. Fashions of the 1980's also defined Generation X; there were several fashion trends (neon and spandex), clothing items (parachute pants and leg warmers), and accessories (banana clips) that Generation X embraced. In terms of societal contributions, Generation X invented a large number of the decade's slang and catchphrases.
Technological innovations profoundly shaped Generation X in the 1980's. The television remote control; Atari, Nintendo, and Sega video-game entertainment systems; Apple and IBM-compatible personal computers; Sony's Walkman and Discman; and compact discs, to name only a few, were developed or made commercially available during the decade. Many toys created or popularized during the 1980's enjoyed commercial success that was due in large part to Generation X. Several of these products (including Rubik's Cube and Cabbage Patch Kids) quickly became cultural mainstays.
Television played a significant role in the development of Generation X. The advent of cable television during the 1980's dramatically increased the number of programs on the air. Nickelodeon and MTV, among other cable channels, provided hours of entertainment and became staples of the Generation X culture. Programs including 3-2-1 Contact, Schoolhouse Rock, and the Cable in the Classroom initiative utilized television to educate Generation X. However, television also had a dramatic impact on the lives of Generation X. For example, millions of stunned students watched in classrooms and packed auditoriums as the space shuttle Challenger exploded shortly after liftoff in 1986, a defining event for Generation X.
Impact During the 1980's, the members of Generation X began to develop their own generational identity, largely in reaction to the baby boomers who preceded them. That identity was unique, but it was not coherent: Generation X was often represented as merely reacting against the values of the preceding generation without developing values of its own. Children of the 1980's rebelling against the values of the 1960's became a frequent subject in popular culture, in such television programs as Family Ties (1982-1989) and such feature films as River's Edge (1986). Subsequent Events
The 1980’s ended without naming the generation it had seen come of age. This was short-lived, however. Canadian novelist Douglas Coupland’s Generation X: Tales for an Accelerated Culture (1991) produced a name for which the media had been searching. By the time the novel came out, the generation had been featured on the cover of the July 16, 1990, issue of Time magazine, under the label “twentysomething.” That cover showed a group of black-clad individuals, all looking past one another in various directions, emphasizing the lack of coherent self-identity of the new generation.
Further Reading
Craig, Stephen C., and Stephen Earl Bennett, eds. After the Boom: The Politics of Generation X. Lanham, Md.: Rowman & Littlefield, 1997. Owen, Rob. Gen X TV: The Brady Bunch to Melrose Place. Syracuse, N.Y.: Syracuse University Press, 1997. Rettenmund, Matthew. Totally Awesome 80’s: A Lexicon of the Music, Videos, Movies, TV Shows, Stars, and Trends of that Decadent Decade. New York: St. Martin’s Griffin, 1996. Thau, Richard D., and Jay S. Heflin, eds. Generations Apart: Xers vs. Boomers vs. the Elderly. Amherst, N.Y.: Prometheus Books, 1997. Matthew Schmitz See also
Breakfast Club, The; Business and the economy in the United States; Cabbage Patch Kids; Cable television; Challenger disaster; Children’s television; Education in the United States; Empire Strikes Back, The; E.T.: The Extra-Terrestrial; Family Ties; Fashions and clothing; Film in the United States; Marriage and divorce; MTV; Multiculturalism in education; Music; Parental advisory stickers; PG-13 rating; Pop music; Slang and slogans; Teen films; Television; Toys and games; Video games and arcades.
■ Genetics research Identification
Scientific investigations of the role, function, and manipulation of the biochemical mechanisms governing heredity and variation in organisms
The 1980’s represented a transition period in the field of genetics, a time during which techniques and technologies developed in the previous decade were applied to human research. Subsequent applications were at the level of the genome itself. Scientists’ understanding of the nature of deoxyribonucleic acid (DNA) in general and the genome in particular dated from the publication of a proposed structure by James Watson and Francis Crick in 1953. The following decades saw development of molecular research techniques, first in understanding how DNA encodes genetic information for an organism, then in beginning to understand how a species’s genome is expressed and regulated. The study of mutations in bacteria and bacterial viruses provided an overview of the manner in which changes in an organism’s DNA result in alterations within the organism itself. Applications of this principle to more complex biological entities, including humans, were not immediately possible, given the size and complexity of these organisms. Genetic Mapping During the late 1960’s, it was discovered that enzymes obtained from bacteria could be used to cut strands of DNA at specific locations. These so-called restriction enzymes were found to generate specifically sized fragments of DNA based upon the specific sequence of nucleotides of which the strand of DNA being cut was composed. Mutations in DNA would result in variations in the pattern of fragmentation generated by these enzymes. Such “markers” could serve as surrogates in determining the region in which DNA defects might have originated, as well as indicating whether or not a specific mutation was present. Further, linkage maps were generated, indicating the positions of numerous genes relative to one another. In 1983, applying these surrogates, James Gusella and his collaborators demonstrated that the mutation that results in Huntington’s disease was located near the tip of chromosome 4. As a result, it became possible to determine whether or not a tested individual carried that mutation. In later years, the same
procedure was used to identify the locus of numerous other genetic mutations.
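The logic behind such restriction-fragment markers can be sketched in a few lines of code. The following Python fragment is purely illustrative (the sequences are invented for demonstration, and it reproduces no actual genetics software), but it uses the real recognition site of the EcoRI restriction enzyme, GAATTC, to show how a single mutated base changes the pattern of fragment lengths an enzyme digest produces:

# Illustrative sketch only: a toy model of restriction-fragment analysis.
# EcoRI is a real restriction enzyme that cuts DNA at the site GAATTC;
# the two sequences below are invented for demonstration.

def fragment_lengths(dna, site="GAATTC"):
    """Return the lengths of the fragments left after cutting the
    sequence at every occurrence of the recognition site."""
    # str.split removes the site itself, a simplification of real digestion.
    return [len(fragment) for fragment in dna.split(site)]

normal = "AATTGGAATTCCCGGGTTTGAATTCAAACCC"   # two EcoRI sites
mutant = "AATTGGAATTCCCGGGTTTGAAGTCAAACCC"   # one base changed; a site is lost

print(fragment_lengths(normal))   # [5, 8, 6] - three fragments
print(fragment_lengths(mutant))   # [5, 20]   - the mutation removes one cut

A heritable difference in fragment lengths of this kind is precisely the sort of marker that researchers such as Gusella could track through families and correlate with a disease gene.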
Human Genome Project Mapping the entire sequence of nucleotides in the human genome long represented the holy grail of human genetics. Prior to the 1980's, the sheer number of base pairs constituting the genome precluded analysis at that level. During the 1970's, however, techniques were developed that allowed for the sequencing of large segments of DNA, including the human genome. In 1975, Frederick Sanger first published a method for sequencing DNA; the procedure was soon superseded by one developed by Allan Maxam and Walter Gilbert. Improvements in computer technology at the same time allowed for the process of sequencing to be carried out in a rapid and highly efficient manner. In 1981, molecular techniques accomplished the sequencing of the first genome found in human cells: mitochondrial DNA. Only the year before had it been found that one's mitochondria and mitochondrial DNA are inherited solely from one's mother. Though the mitochondrial DNA consisted of only 16,500 base pairs, the demonstration that sequencing was practical on that scale set in motion the idea of applying the same technique to the genome as a whole. Consequently, in 1984 and 1985, Robert Sinsheimer, Nobel laureate Renato Dulbecco, and others began lobbying for funds to begin the project. Subsequently, the National Research Council, a component of the prestigious National Academy of Sciences, established a committee to set the program in motion.
Forensic Analysis Because each person's DNA is unique, incubating a person's DNA with restriction enzymes generates a set of fragments unique to that individual. This technique is known as "restriction fragment length polymorphism" (RFLP). In 1984, RFLP was successfully used to identify individuals by their so-called DNA fingerprint. The same technique was also used to determine whether different persons were genetically related. RFLP was quickly adapted to use by police conducting forensic investigations and by people attempting to establish a child's paternity. DNA fingerprinting was initially used in 1984 to identify the families of kidnapped children in Argentina. In 1989, DNA fingerprinting was used for the first time to identify the suspect in a murder case in the United Kingdom.
Impact
During the 1980's, theoretical genetic knowledge and recently developed practical techniques were combined and applied to the study of human genes and DNA. Tests for some genetic mutations were developed, the Human Genome Project was begun, and DNA fingerprinting became a practical reality. Thus, geneticists transformed the fields of medicine and law enforcement, as well as contributing to the general understanding of human heredity and the role and functioning of genes on the most basic, molecular level.
Further Reading
Hartl, Daniel, and Elizabeth Jones. Genetics: Analysis of Genes and Genomes. 6th ed. Sudbury, Mass.: Jones and Bartlett, 2004. Reviews both classical and modern genetics, including the development of research techniques during the period between the 1970's and 2000. Khoury, Muin, et al. Fundamentals of Genetic Epidemiology. New York: Oxford University Press, 1993. Examines the role of genetic factors in the development of disease in human populations. Includes a summary of genetic techniques. Ott, Jurg. Analysis of Human Genetic Linkage. Baltimore: Johns Hopkins University Press, 1999. Summary of techniques used to determine genetic loci as applied to human genetic diseases. Richard Adler See also
Bioengineering; DNA fingerprinting; Fetal medicine; Health care in the United States; Medicine; Science and technology.
■ Gentrification Definition
Changes in the population of urban districts resulting in the raising of rents and property values
Gentrification refers to changes in urban centers in terms of populations, demographics, character, and culture. It entails wealthier people moving into urban centers and changing the housing stock. The process is most often characterized by social, socioeconomic, and often racial tensions between original residents and those who have moved to these neighborhoods. Through the latter half of the twentieth century, the majority of American cities came to be characterized
by such social characteristics as a polarization of a city's various communities, racial tensions, urban decay, and a growing crime rate. In tandem with these social patterns, American cities saw trends that included depopulation, deterioration of central housing stock, and a migration of people and jobs out of central business districts. Background Postwar America saw much social change. As soldiers returned home from the battlefront, America had a newfound wealth. Two major social shifts in postwar America directly affected future urban change: The first was America's increasing dependence on the automobile, and the second was suburbanization. Postwar America demanded more cars, oftentimes more than one vehicle per family—something theretofore unheard of. Further, white America was increasingly moving out of center cities and into new communities outside the boundaries of cities, creating suburbs. While there had indeed been suburbs before this time, they were not the norm. Thus America became at once more separated in living patterns along racial and ethnic lines and more separated along lines of accessibility to resources, with transportation being one point of access to these resources. Suburban lifestyle and community structures were different in several ways from lifestyles and community structures that America had known previously. There was an increasing reliance not only on the automobile but also on a lifestyle that revolved around the convenience of the car. In part, this was reflected in architecture and in how new communities were planned. Suburbs saw the genesis of services designed around the convenience of the car, including eating, banking, watching movies, and dry cleaning. This suburban lifestyle stood in stark contrast to the urban lifestyle, which in many core ways remained unchanged. Those living in city centers tended to still rely on public transportation or to walk. They also tended to live in more densely packed habitations, such as in tenement buildings or apartment buildings, which differed from the increasing suburban norm of two people or a family living in one house with a yard and a garage for the car. City governments and the federal government of the 1940's to the early 1970's responded to these tremendous changes by implementing various urban
renewal programs. The federal government enacted the Housing Act of 1949, which began a process of wholesale demolition of urban neighborhoods that were deemed "slums." Slum removal was also known under the popular moniker "urban renewal." Slums were increasing in number and size: They were characterized by an increasing concentration of poor nonwhites and deterioration of this central-city housing stock. The Housing Act of 1949 and urban renewal were accompanied by the growth of the interstate highway system. With Americans' increasing reliance on the car, the federal government spent millions of dollars on the development of highway systems that cut through cities and traversed states, connecting America from coast to coast, from north to south. Most of the country's interstate highway system was completed between 1960 and 1990. The Situation in the 1980's By the 1980's, American city centers had changed drastically from their prewar population density, demographics, and community structures. Many city centers suffered from decaying neighborhoods, eroded tax bases, increasing crime rates, and decreasing city services. The polarization of city and suburb along lines of race and class was a near constant across urban centers. City governments, community groups, and private companies struggled to grapple with these realities of urban America. One of the most common strategies for dealing with the situation became the gentrification of central-city neighborhoods. Numerous gentrification projects were undertaken during the decade, and much of urban America was reimagined. Gentrification relies on changing the character of city centers and changing the use of land and properties. It also, by nature, includes shifts in urban populations within neighborhoods. These same neighborhoods often are adjacent to the financial heart of many cities, thus making them attractive for redevelopment. One of the goals of gentrification projects is the revitalization of commercial and housing stock, thus increasing its value and resale price. In the process of gentrification, central business districts replace their declining manufacturing and retail sectors with all-new service sectors that usually cater to white-collar workers. Gentrification projects tend to involve major commercial restoration of historic downtown neighborhoods and sites. These projects are
often undertaken by a hybrid of municipal entities and private companies, sometimes accompanied by grassroots community groups, such as block clubs. Controversies It is important to note that at the same time that gentrification projects were undertaken across the United States, President Ronald Reagan decreased federal housing assistance by more than 75 percent from 1982 to 1988. This further exacerbated social shifts and the need to address these shifts. Reagan encouraged a system of reliance on private interests for everything from housing and community development to education. Gentrification projects have come under much criticism and created controversy. Conflicts that surfaced beginning in the 1980's tend to center on class and often on race and ethnicity. A large concern is gentrification's potential displacement of vulnerable populations—much like the urban renewal efforts of the preceding decades. Often, nonwhite neighborhoods changed demographically as a result of gentrification. Some studies argued that gentrification encouraged crime, as increasingly wealthy neighborhoods were ringed by poverty-stricken neighborhoods. Rental properties in gentrifying neighborhoods tend to rise in value, and thus in price, often dramatically. Traditional renters are therefore put under pressure. Low-income individuals and families are pushed out of their homes as they increasingly find that they cannot afford them. Impact Gentrification efforts in the 1980's were a response to various social concerns and economic realities. Gentrified neighborhoods often reflected changing lifestyles and larger social changes, such as a shift away from the nuclear family, higher numbers of single adults living together, single adults living on their own, same-sex couples, and women entering the workforce. Further Reading
Brenner, Neil, and Nikolas Theodore. Spaces of Neoliberalism: Urban Restructuring in North America and Western Europe. Malden, Mass.: Blackwell, 2002. Analyzes the role of “neoliberalism” (a term also used interchangeably with “globalization”) in contemporary processes of urban restructuring plans. Keating, W. Dennis, and Norman Krumholz, eds. Rebuilding Urban Neighborhoods: Achievements, Opportunities, and Limits. Thousand Oaks, Calif.: Sage Publications, 1999. Examines the efforts and
achievements of community organizations and individuals in rebuilding many of America’s poorest and most crime-ridden urban neighborhoods. Kolson, Kenneth L. Big Plans: The Allure and Folly of Urban Design. Baltimore: Johns Hopkins University Press, 2001. A critique of many urban restructuring plans, citing examples of communities that lose sight of their inhabitants in such plans. Miller, Zane L., and Edward Bruce Tucker. Changing Plans for America’s Inner Cities: Cincinnati’s Over-theRhine and Twentieth-Century Urbanism. Columbus: Ohio State University Press, 1998. A historiography of Cincinnati’s inner-city neighborhood. Discusses the various schemes that it has undergone: comprehensive planning, zoning, slum clearance, redevelopment, and neighborhood conservation and rehabilitation. Taylor, Monique M. “Gentrification in Harlem: Community, Culture, and the Urban Redevelopment of the Black Ghetto.” In Race and Ethnic Politics. Greenwich, Conn.: JAI Press, 1994. Examines the politics of gentrification efforts in Harlem and the perceptions of the redevelopers as outsiders. Von Hoffman, Alexander. House by House, Block by Block: The Rebirth of America’s Urban Neighborhoods. New York: Oxford University Press, 2003. Through interviews, an examination of how neighborhood groups and local organizations revitalize neighborhoods in five cities: New York, Boston, Chicago, Atlanta, and Los Angeles. Alison Stankrauff See also
African Americans; Architecture; Business and the economy in Canada; Business and the economy in the United States; Crime; Demographics of Canada; Demographics of the United States; Racial discrimination; Reaganomics; Yuppies.
■ Gere, Richard Identification American actor Born August 31, 1949; Philadelphia, Pennsylvania
In the 1980’s, Gere became a rare combination of respected actor, humanitarian activist, and sex symbol. Richard Gere was the second of five children born to Homer Gere, an insurance salesman, and Doris Tiffany Gere, a homemaker. He graduated in 1967 from North Syracuse Central High School, where he
The Eighties in America
Gere, Richard
■
411
Richard Gere, right, shakes hands with the Dalai Lama at a reception in New York City in 1987. (AP/Wide World Photos)
excelled in music and gymnastics. He won a gymnastics scholarship to the University of Massachusetts at Amherst, where he majored in philosophy and drama. After two years in college, Gere dropped out to pursue his growing interest in acting. Gere spent several years as a struggling actor, and in 1973 he landed a starring role in the London production of Grease (pr. 1972). He began appearing in Hollywood films in the mid-1970’s and first came to the filmgoing public’s notice in Looking for Mr. Goodbar (1977), Days of Heaven (1978), and American Gigolo (1980). It was An Officer and a Gentleman (1982), however, that established Gere as a major star. He followed this hit with a series of unmemorable movies, including Breathless (1983), The Cotton Club (1984), and King David (1985). Although Gere’s reputation as a star was established in the early 1980’s, he did not again achieve
box office success until the 1990's. In fact, he spent much of the 1980's pursuing his interest in human rights causes, turning down the lead roles in Die Hard (1988) and Wall Street (1987), which went to Bruce Willis and Michael Douglas, respectively. Instead of starring in those films, Gere visited refugee camps in Nicaragua, El Salvador, and Honduras and pursued other humanitarian interests. Gere was a Buddhist, an outspoken advocate for human rights in Tibet, and a supporter of the Dalai Lama. He founded the Tibet House and the Gere Foundation and became active in ecological causes and AIDS awareness. Impact Gere became a well-known star, then put the resulting cultural capital to work, using his reputation to raise money for and awareness about issues that were important to him.
Subsequent Events Gere reestablished himself as a major star in 1990, when he teamed up with Julia Roberts in the blockbuster hit Pretty Woman. He also had big hits with Primal Fear (1996) and Runaway Bride (1999), the latter again co-starring Roberts. Gere became known as much for his interests and activities outside show business as for his acting career, however. He was banned as an Oscar presenter after he denounced the Chinese government from the podium in 1993. Further Reading
Davis, Judith. Richard Gere: An Unauthorized Biography. New York: New American Library, 1983. Gere, Richard. Pilgrim. Boston: Little, Brown, 1997. Parker, John. Richard Gere: The Flesh and the Spirit. London: Headline, 1997. Mary Virginia Davis See also
Academy Awards; Film in the United States.
■ Ghostbusters Identification Supernatural comedy film Director Ivan Reitman (b. 1946) Date Released June 8, 1984
Capitalizing on the popularity of science-fiction movies of the 1970's and 1980's, such as the Star Wars and Alien series, Ghostbusters created almost a separate genre by mixing humor and scary situations with traditional elements of supernatural films, including monsters, gremlins, and spirits. A science-fiction ghost tale starring performers best known for their comedic skills, Ghostbusters mixed genres to create a film that was equal parts supernatural action film and spoof. The flitting ghosts and slimy creatures of Ghostbusters provide more comedy than horror, and the characters handle bizarre situations with comedic wit and mock-seriousness, thereby concocting one of the most financially successful comedies of the 1980's, one that spawned a sequel, Ghostbusters II (1989), and created a large audience of admirers.
The film's special effects and studio work were seamlessly integrated with location shooting in New York City to provide an apocalyptic finale that threatened a believable urban setting with demoniac invasion. In perhaps the film's most famous sequence, the city of Manhattan is threatened with destruction by a gigantic Stay-Puft Marshmallow Man. (Ghostbusters was nominated for an Academy Award for Best Visual Effects.) This combination of realism and supernatural effects, fright and fun, appealed to the wide audience that drove the film's profits.
The knowing delivery of many of the film's lines by Saturday Night Live alums Bill Murray and Dan Aykroyd caused them immediately to enter the popular vernacular, as they were quoted repeatedly throughout the remainder of the decade. Murray and Aykroyd, along with Harold Ramis, Ernie Hudson, Sigourney Weaver, Annie Potts, and Rick Moranis, created characters that were by turns likable, sympathetic, and ridiculous—characters whose reactions to the strange things happening to them form the core of the movie.
In addition to its 1989 sequel, the film's popularity engendered several television series—animated and live-action. One television spin-off in particular, The Real Ghostbusters, borrowed from fairy tales and folklore to give the material greater breadth and appeal. Ghostbusters's success also generated a merchandising machine that included toys, video games, and the inevitable fast-food tie-ins.
From left: Bill Murray, Dan Aykroyd, and Harold Ramis in Ghostbusters. (AP/Wide World Photos)
Impact Ghostbusters blended wacky humor, deadpan delivery, scampering ghosts, oddball characters, and escalating mayhem mixed with the trappings of parapsychology. It helped establish that big-budget, effects-driven films could broaden their appeal by incorporating humor and refusing to take themselves too seriously, luring audiences for whom effects alone were not a draw. When the similarly tongue-in-cheek time-travel action film Back to the Future (1985) also became a blockbuster the following year, Hollywood took notice. Further Reading
Coleman, John. “Ghostbusters.” New Statesman 108 (December 7, 1984): 35. Kael, Pauline. “Ghostbusters.” The New Yorker 60 (June 25, 1984): 104. Schickel, Richard. “Ghostbusters.” Time 123 (June 11, 1984): 83. Bernard E. Morris See also Action films; Film in the United States; Murray, Bill; Science-fiction films; Special effects; Weaver, Sigourney.
■ Gibson, Kirk Identification American baseball player Born May 28, 1957; Pontiac, Michigan
During the 1980’s, Gibson played on two World Serieswinning baseball teams and won the National League Most Valuable Player Award, but he is best remembered for an improbable and memorable home run hit in the opening game of the 1988 World Series. Kirk Gibson, whose Major League Baseball career began in 1979, played his first nine seasons with the Detroit Tigers. He was a member of the great 1984 Tigers team that opened the season by winning thirty-five of its first forty games and went on to win the World Series. Gibson batted .333 in that World Series and hit two home runs in the fifth and final game. Gibson, an outfielder, is best remembered for a home run that he hit as a member of the Los Angeles Dodgers in 1988. Gibson left the Tigers as a free agent after the 1987 season and had an excellent first season in Los Angeles. He batted .290, hit twenty-five home runs, and won the National League Most Valuable Player Award. He also built a reputation as a dangerous clutch hitter when a game was on the line. The Dodgers finished in first place in the National League’s Western Division, won the National League Championship Series, and faced the heavily favored Oakland As in an allCalifornia World Series. A leg injury kept Gibson on the bench in the first
Los Angeles Dodger Kirk Gibson raises his arm in celebration as he hobbles around the bases after hitting a game-winning home run in the first game of the 1988 World Series. (AP/Wide World Photos)
game of the World Series, but in the bottom of the ninth inning of that game—with two outs, one runner on base, and the Dodgers trailing 4 to 3— manager Tommy Lasorda summoned Gibson to pinch-hit. Gibson could not run, but he told Lasorda that he could swing the bat. On the mound was Oakland’s ace relief pitcher, Dennis Eckersley, at the time the best reliever in baseball. Gibson fouled off several of Eckersley’s pitches, grimacing in pain with each cut. Drama built with each swing of the bat. Then, Gibson connected, sending a high line drive over the right field wall to give the Dodgers an improbable 5-4 victory. Gibson hobbled around the
bases pumping his fist back and forth and was mobbed by his delighted teammates at home plate. Inspired by Gibson's blast, the Dodgers went on to win the World Series, four games to one. The ninth-inning home run in the first game was Gibson's only appearance in the 1988 World Series. Impact Gibson's ninth-inning home run in the first game of the 1988 World Series has become one of the most memorable homers in World Series history, and the film of that home run has become one of baseball's iconic game films. Played over and over at World Series time, the film reminds fans that anything can happen in a game of baseball and that a baseball game is never truly over until the last out of the last inning has been recorded. Further Reading
Shatzkin, Mike, ed. The Ballplayers. New York: Arbor House, 1990. Ward, Geoffrey C., and Ken Burns. Baseball: An Illustrated History. New York: Alfred A. Knopf, 1994. James Tackach See also
Baseball; Sports.
■ Gibson, Mel Identification American Australian actor Born January 3, 1956; Peekskill, New York
Gibson began the 1980’s starring in several major Australian films, and over the course of the decade he transitioned to being a major star of American action movies as well. Mel Gibson achieved celebrity status with the success of the Australian film Mad Max in 1979. As the film’s title character, a revenge-bent police officer in a postapocalyptic near future, Gibson brutally punished the men who killed his family. Gibson would revisit the role of Max Rockatansky in 1981’s Mad Max 2: The Road Warrior, capitalizing on the early 1980’s popularity of science fiction and anxiety over the Cold War. Gibson became one of America’s favorite action stars in this role. The role opened the door for his entry into the action circuit, but he repeatedly turned down offers of other action-hero parts. Gibson, trained as a professional stage actor before his rise to fame, instead played roles in more se-
rious movies like Gallipoli (1981), The Year of Living Dangerously (1982), The Bounty (1984), and The River (1984). Unfortunately for Gibson’s intents, however, action was what the 1980’s public craved from him. In 1985, he returned to the role of Mad Max in Mad Max Beyond Thunderdome alongside Tina Turner. This time, Gibson helped lost children find a home in the postapocalyptic world while escaping a fight in the Thunderdome, a pop culture reference that would live on for the rest of the 1980’s. Notably, it was his appearance in this film that sent him into the spotlight again and allowed him to capture People magazine’s first “Sexiest Man Alive” title in 1985. Gibson returned to the action genre two years later in Lethal Weapon (1987). Starring in this buddy movie opposite Danny Glover, Gibson played the suicidal Sergeant Martin Riggs of the Los Angeles Police Department. Gibson gave a stunning performance as Riggs, one that transcended the conventions of the genre, raising it to the level of drama. Gibson’s acting in Lethal Weapon proved to America that he was not just a man of action but also one of the superior actors of the decade. Though Gibson picked his parts carefully for the rest of the decade, he did return to the character of Riggs in Lethal Weapon 2 in 1989. Gibson married in 1980 but kept his personal life guarded from the public as best he could. He could not help getting noticed for his drinking, however; in 1984, he was charged with drunk driving in Canada and fined. Impact Mel Gibson’s greatest impact on the 1980’s was his work as a film actor, especially in the action genre. Gibson portrayed many gritty characters during the 1980’s and is most remembered for playing characters who endured extreme pain in both the Mad Max series and the Lethal Weapon series. Though Gibson would try to shy away from the action movies, his largest successes of the 1980’s were in that genre. Further Reading
Clarkson, Wensley. Mel Gibson: Man on a Mission. London: John Blake, 2006. Perry, Roland. Lethal Hero: The Mel Gibson Biography. London: Oliver Books, 1993. Daniel R. Vogel See also
Action films; Film in the United States.
■ Gibson, William Identification American cyberpunk writer Born March 17, 1948; Conway, South Carolina
Gibson’s novels and stories written in the 1980’s examine the effects, mostly negative, of new computer and telecommunication technologies on the world’s population. They are credited with inspiring the cyberpunk subgenre of science fiction. As a boy and young teenager, Gibson was an avid reader of science fiction. He discovered William S. Burroughs when he was fifteen, which led him to Jack Kerouac, Allen Ginsberg, and the other Beat authors. In 1968, he fled from the United States to Canada to avoid military service. He began to write seriously in 1977 and made his first sale in 1979. Gibson’s first novel, Neuromancer (1984), was the first one ever to sweep the Nebula, Hugo, and Philip K. Dick awards. Ironically, given the novel’s subject
matter, Gibson wrote it and his other early stories on a manual typewriter, because he could not afford a dedicated word processor or even one of the early personal computers.
Neuromancer formed the first book of the Cyberspace Trilogy, which Gibson completed during the 1980's, publishing Count Zero in 1986 and Mona Lisa Overdrive in 1988. Some of the stories in his collection Burning Chrome (1986) are set in the same dystopian future that forms the backdrop to the trilogy. The titular "cyberspace" represents the marriage of virtual-reality technology with the Internet; it is treated in Gibson's work as a place that someone can visit or even live in. In Gibson's twenty-first century, multinational corporations are more powerful than most governments, there are no democracies left, and there is a direct relationship between power and technology.
Gibson had already written about one-third of Neuromancer when the movie Blade Runner premiered in 1982. Some critics have unfairly criticized Gibson's novel as derivative of the film, but it would be more accurate to say that they share many of the same influences, especially the writings of Philip K. Dick, upon whose 1968 novel Do Androids Dream of Electric Sheep? the film was based.
William Gibson. (Karen Moskowitz)
Impact Gibson's novels became the most famous exemplars of cyberpunk, a science-fiction subgenre that portrayed hypertechnological societies in which the human mind and computers had become linked. Like Gibson, other cyberpunk authors such as Bruce Sterling, Rudy Rucker, and Michael Swanwick used their fictional representations to comment on the distribution of power and the use of technology in the real world of the 1980's. Gibson not only changed the direction of science fiction during the 1980's but also invented words that became part of the vernacular, such as "netsurfing," "cyberspace," and "jacking in." "Cyberspace" became almost synonymous with the Internet and the World Wide Web, although the actual technologies differ significantly from those represented in Gibson's fiction. Further Reading
Hafner, Katie, and John Markoff. Cyberpunk: Outlaws and Hackers on the Computer Frontier. New York: Simon & Schuster, 1991. Rheingold, Howard. Virtual Reality. New York: Summit Books, 1991. Thomas R. Feller
See also Blade Runner; Computers; Cyberpunk literature; Information age; Max Headroom; Science and technology; Science-fiction films; Tron; Virtual reality.
■ Gimli Glider Identification
Air Canada 767 involved in an emergency landing Date July 23, 1983 Place Gimli, Manitoba The safe landing of this Boeing 767 after it had run out of fuel was seen as both an improbable and a miraculous occurrence. On July 23, 1983, Air Canada Flight 143 took off from Montreal, Quebec, to begin a scheduled four-hour flight across Canada to Edmonton, Alberta, via Ottawa, Ontario. The airplane, a Boeing 767-200,
carried sixty-one passengers. Before leaving, the airplane was bedeviled by a fuel gauge that did not work properly. This problem meant that the ground crew and pilots needed to take special precautions to ensure that the proper amount of fuel was available for the flight. A calculation of the quantity of fuel believed to be in the tanks was made, and the airplane's computer indicated that it would be sufficient for the journey. There was a problem, however. The 767 was one of the first Air Canada planes to have instrument readings displayed in metric units, while all calculations had traditionally been done using the imperial system. The Canadian government of Prime Minister Pierre Trudeau in the 1970's had adopted the metric system to replace the imperial system. While younger people became increasingly comfortable with the new system of measurement through schooling, metric measurements remained unfamiliar to many Canadians. As a result of this unfamiliarity, a crucial mistake was made, and Air Canada's staff overestimated the amount of fuel contained in the airplane's tanks.
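The arithmetic behind the mistake is straightforward, and a minimal sketch makes it concrete. The figures in the following Python sketch are those widely reported in later accounts of the incident and should be treated as approximate illustrations rather than official numbers:

# Illustrative reconstruction of the unit-conversion error; the numbers
# are approximations drawn from published accounts, not official figures.

LITERS_MEASURED = 7_682      # fuel found in the tanks by a manual "drip" check
FUEL_REQUIRED_KG = 22_300    # mass of fuel calculated for the trip to Edmonton

LB_PER_LITER = 1.77          # conversion factor the ground crew actually used
KG_PER_LITER = 0.80          # conversion factor they should have used

# The mistake: the multiplication yields POUNDS, but the result was
# treated as KILOGRAMS when deciding how much more fuel to add.
assumed_kg = LITERS_MEASURED * LB_PER_LITER              # about 13,597
liters_added = (FUEL_REQUIRED_KG - assumed_kg) / LB_PER_LITER

# The mass actually on board after refueling:
actual_kg = (LITERS_MEASURED + liters_added) * KG_PER_LITER

print(f"{actual_kg:,.0f} kg loaded of {FUEL_REQUIRED_KG:,} kg required")
# -> roughly 10,100 kg, less than half of what the flight needed

Reading a pounds figure as kilograms thus overstated the fuel on board by more than a factor of two, which is why the tanks ran dry partway through the flight.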
For the pilots, the first sign of trouble occurred when the plane was flying over the northern part of Ontario: A warning sounded in the cockpit indicating that fuel pressure was dropping. Then, suddenly, the fuel ran out, and the engines stopped, leaving the pilots in command of a large airplane without any power at forty-one thousand feet. To make the situation even more serious, the pilots calculated that the nearest major airport, at Winnipeg, Manitoba, was too far away for the plane to reach. The 767 was descending at a rate of several thousand feet per minute, meaning it would be able to glide only a short distance before it crashed into the ground.
Luckily, one of the plane's pilots had once flown out of a small airport in Gimli, Manitoba, so he was able to direct the plane to that airfield. Equally helpful, the other pilot had experience flying gliders, which was in effect what the powerless Boeing 767 had become. Despite the lack of power, which caused some of the cockpit instruments not to function, the airplane managed to reach the airport. It touched down roughly and ended up nose down on a runway, narrowly avoiding automobiles and people at one end of the runway, which, unbeknownst to the pilots, was being used as a racetrack. Miraculously, no one was injured.
The Gimli Glider lands in Toronto in July, 2005. The famous plane remained in service for decades after surviving the incident that gave it its nickname. (© Contrails Aviation Photography)
Impact The story of the Gimli Glider, as the plane was called, briefly but forcefully captured the imagination of a public fearful of plane crashes. It both frightened and inspired people, and the heroism of the cockpit crew partly made up for the shocking error that had been made in calculating the plane's fuel load. The Gimli Glider survived its harrowing experience intact and continued in service as part of Air Canada's fleet for years. Further Reading
Hoffer, William, and Marilyn Hoffer. Freefall: From 41,000 Feet to Zero—A True Story. New York: Simon & Schuster, 1989. Montesi, Jorge, dir. Falling from the Sky: Flight 174. Canada, Television Movie, 1995. Steve Hewitt See also Air India Flight 182 bombing; Pan Am Flight 103 bombing; Sioux City plane crash; Trudeau, Pierre.
Glass, Philip
■
417
■ Glass, Philip Identification American composer and musician Born January 31, 1937; Baltimore, Maryland
Regarded as a prominent minimalist composer, Philip Glass often collaborated with visual artists, actors, and musicians, and these collaborations influenced his compositional technique. Already established as a composer of theater music when the 1980's began, Philip Glass completed his second opera, Satyagraha, in the first year of the decade. The opera, based on the early life of Mahatma Gandhi in South Africa, formed part of the operatic "portrait" trilogy that began with the monumental Einstein on the Beach (1976). That first work includes poetry and text relating to and commenting on general relativity, nuclear weapons, science, and AM radio. It consists of nine connected twenty-minute scenes taking place over a five-hour period with no intermission. Audiences are instructed to come and go as they please. The third opera, Akhnaten (1983), portrays an Egyptian pharaoh and includes libretto spoken in ancient Egyptian. The completion of the operatic trilogy was a musical departure for Glass, since Einstein on the Beach was his only opera composed before 1980.
Permeating his music were the stylistic characteristics of minimalism, including the continual repetition of short sounds and rhythmic patterns that gradually evolved over the course of a work. In this style, also termed "systematic music," short musical themes undergo alterations through changes in length, choice of notes, or rhythmic variation. Some critics refer to this type of music as simplistic and lacking variety. However, Glass was considered an advanced, eclectic composer who used minimalist techniques, rather than being labeled a strict minimalist.
Glass first collaborated with Robert Wilson, an American avant-garde stage director and playwright, on the Einstein project. They teamed again to create the CIVIL warS for the 1984 Olympic Games in Los Angeles. The full production was canceled as a result of insufficient funding, but portions received full productions in Rome; Minneapolis; Rotterdam, the Netherlands; and Cologne, Germany. The opera's other two sections were workshopped in Tokyo and Marseilles, France. The CIVIL warS staged events from ancient Athens to futuristic spaceships and was another landmark composition. Glass's final operatic scores of the 1980's included The Making of the
Representative for Planet 8 (1986) and an adaptation of Edgar Allan Poe's The Fall of the House of Usher (1988). The 1980's were prolific for Glass, whose approximately forty compositions of the decade included ballets, chamber music, operas, orchestral music, choral music, film scores, songs, world music, and theatrical scores. Many of these works were collaborations with artists in various disciplines. Songwriter Paul Simon and others contributed lyrics to Songs from Liquid Days (1985), and David Byrne collaborated on the CIVIL warS. In the Upper Room (1986), a ballet choreographed by Twyla Tharp, integrated dance and music. Glass's seven film scores included those for Godfrey Reggio's Koyaanisqatsi (1982) and Powaqqatsi (1987), each of which won multiple awards. These first two productions of Reggio's stunning "Qatsi trilogy" (completed in 2002) incorporated music, images, and ideas from American landscapes. Glass also contributed film scores to less experimental films, such as Hamburger Hill (1987) and The Thin Blue Line (1988). Impact A prodigious composer in many genres, Philip Glass blurred the lines of musical categorization through unconventional compositions using non-Western techniques and partnerships with collaborating artists. Further Reading
Glass, Philip, and Robert T. Jones. Music by Philip Glass. Updated ed. Cambridge, Mass.: Da Capo Press, 1995. Kostelanetz, Richard, and Robert Flemming. Writings on Glass: Essays, Interviews, Criticism. Berkeley: University of California Press, 1999. Maycock, Robert. Glass: A Portrait. London: Sanctuary, 2002. Douglas D. Skinner See also
Classical music; Minimalist literature; Music; Olympic Games of 1984; Tron; World music.
■ Glass ceiling Definition
An unofficial or unacknowledged barrier within an organization’s hierarchy that prevents personal advancement, especially of women or minorities in employment
The term “glass ceiling” was popularized in the 1980’s and became an important concept in the American workplace.
Two articles written during the 1980’s are credited with coining the phrase “glass ceiling.” The first instance of the term appeared in a 1984 Adweek article about magazine editor Gay Bryant. The second article, from the March 24, 1986, issue of The Wall Street Journal, was written by Carol Hymowitz and Timothy Schellhardt. The term “ceiling” describes a barrier that women or minorities experience as they try to advance within a company or organization. The “glass” metaphor describes the transparent quality of the ceiling, because it is not immediately recognized or acknowledged. Workplace Issues for Women
Issues contributing to women’s experience of a workplace glass ceiling during the 1980’s included work-life balance, a lack of access to informal networks, a lack of effective mentors and role models, and gender-based stereotypes. A 1984 U.S. Census Bureau study showed that, on average, women had spent less time working at their current job than had men. The Department of Labor concluded that women were unable to attain seniority equal to that of men because of the time they spent away from work. Women were the primary caregivers for children or the elderly, and working women had to balance family life with their career. Women also lacked access to the informal networks that men used to develop relationships within organizations. These networks were perceived to include “male activities,” usually sporting events such as golf. The golf course was a place where male bonding and informal mentoring occurred, and women were usually not included. Despite the fact that mentoring is considered an important factor in leadership development, women had fewer opportunities to cultivate mentor relationships. Mentors usually chose protégés that they viewed to be most similar to themselves, and typically men were more comfortable mentoring other men. There were fewer women mentors because of the limited number of female managers and executives, creating a vicious circle. Biases and stereotypes also contributed to the glass ceiling. Women were viewed as operating more on emotion than on intellect. Other sexist attitudes included the beliefs that women could not be effective leaders, that they did not know how to get along in the business world, and that they had ineffective management styles. If a female executive behaved in
a warm and caring manner, she was viewed as weak. On the other hand, if a woman appeared to be tough, logical, and unemotional—characteristics typically expected of a male leader—she could be viewed in a negative manner, because she was not "feminine enough." Other factors preventing women's advancement in the workplace included their initial placement in dead-end jobs, lack of training opportunities, and sexual harassment. Research conducted on the glass ceiling in the 1980's indicated that the effects of these factors were subtle but systematic in many types of organizations. It was recommended that women adopt various methods and strategies if their goal was to move into higher-level positions. Suggested strategies included pursuing more difficult and visible assignments, gaining support from an influential mentor, developing a style with which male managers would feel more comfortable, and accepting the need to outperform male colleagues. Organizational Strategies Strategies were recommended to organizational leaders to create more opportunities for minorities and women to reach higher-level positions. Strategies included developing policies and practices designed to increase opportunities for upward mobility; establishing pay equity for work of comparable value; eliminating gender, race, and ethnic-based stereotyping; creating "family-friendly" workplace policies; creating "parent-track" policies; and collecting data to track the advancement progress of women and minorities. Another View of the Glass Ceiling Some believed that the glass ceiling did not exist and that women could achieve a higher-level position through hard work and ambition if they chose to do so. Supporters of this view held that women might not pursue more ambitious goals if they decided that family was more important to them than career advancement. Others argued that women were very successful as leaders in smaller companies or as entrepreneurs and that these accomplishments should carry equal value when compared to higher-level roles in larger corporate settings. Impact The Civil Rights Act of 1991 established the Glass Ceiling Commission to study artificial barriers to the advancement of women and minorities in the American workplace and to make recommendations
for overcoming such barriers. The commission was composed of twenty-one members, with the secretary of labor serving as chair. The intent of this legislation was to ensure that women and minorities would receive equal treatment in employment. Further Reading
Frenkiel, Nora. “The Up-and-Comers: Bryant Takes Aim at the Settlers-In.” Adweek, March, 1984, 8. The term “glass ceiling” appeared in this article. Hymowitz, Carol, and Timothy Schellhardt. “The Corporate Woman: The Glass Ceiling—Why Women Can’t Seem to Break the Invisible Barrier That Blocks Them from Top Jobs.” The Wall Street Journal, March 24, 1986, 1, 4-5. The article that many credit with coining the phrase “glass ceiling.” Kanter, Rosabeth. Men and Women in the Corporation. New York: Basic Books, 1993. Provides information about corporate careers and the factors that promote individual and organizational success. Stith, Anthony. Breaking the Glass Ceiling: Racism and Sexism in Corporate America—The Myths, the Realities, and the Solutions. Orange, N.J.: Bryant & Dillon, 1996. Describes racism and discrimination in corporate America and the impact that these practices have on individuals and businesses. Weiss, Ann E. The Glass Ceiling: A Look at Women in the Workforce. Brookfield, Conn.: Twenty-First Century Books, 1999. Provides a history of women in the workplace as well as a discussion about whether or not a glass ceiling really exists. Sharon M. LeMaster See also
Affirmative action; Feminism; Income and wages in the United States; Mommy track; Racial discrimination; Sexual harassment; Women in the workforce; Women’s rights.
■ Globalization Definition
Process of integration of the world’s economies into a single global market
Globalization increased significantly during the 1980's, but its spread was slowed by the persistence of communist nations and Cold War politics. Even so, trade crossed many barriers and was largely responsible for the export of culture, information, and ideas from the United States to other nations, as teenagers in so-called Second and Third World
countries watched Hollywood movies and wore T-shirts advertising American corporations. By the end of the decade, the impending collapse of the Soviet Union seemed to presage a truly global economy and the worldwide spread of capitalism. Although worldwide capitalist trade was under way in the early twentieth century, globalization was halted by the rise of a separate world socialist economic system after the Russian Revolution of 1917 and, soon after World War II, the rise of socialist economies from East Germany to North Korea and Vietnam. Nevertheless, in 1947, capitalist countries launched a process for developing a tariff-free world economy by signing the General Agreement on Tariffs and Trade (GATT) to encourage free trade through several subsequent rounds of negotiation. By the 1970's and 1980's, the wealthiest corporations had accumulated such vast amounts of capital that they established conglomerates with subsidiaries in several poorer countries around the world, where their financial power also served to control those nations' political elites. In 1973, the Summit Conference of Non-Aligned Nations called for a new international economic order (NIEO) to end the exploitation and impoverishment under which poor countries shipped raw materials to rich countries at low prices and then had to buy goods made from those same materials at high prices. Based on the concept of a right to development, special sessions of the United Nations General Assembly in 1974 and 1975 endorsed NIEO principles, which included a relocation of industries from the wealthy industrial countries to the poorer countries, so poorer countries could gain control of the new industries inside their borders. The richest countries made nominal concessions to NIEO advocates regarding foreign aid, but they were slow to implement them. When U.S. president Ronald Reagan took office in 1981, his advisers declared NIEO dead. They insisted that the way to address poorer countries' concerns was to require those countries to privatize their industries. If the developing nations agreed to dismantle government-run corporations and expand their private sectors, the World Bank would then issue loans directly to specific private corporations in the developing nations. Just as it did in the United States, then, the Reagan administration insisted on shrinking governments, deregulating industry, and trusting free markets throughout the world.
Reagan and other economic conservatives embraced a model of global free trade, believing that the industrialized nations of the world would benefit from such trade. Those nations would gain access to new markets for their industrial goods and technologically advanced services, especially in the developing nations that had previously erected protectionist tariffs against those goods and services in order to nurture their own fledgling industries. Beginning in 1986, the developed world vigorously pursued an end to such tariffs and to all barriers to global free trade. In that year, GATT's Uruguay Round of talks began to negotiate comprehensive tariff reduction agreements. In 1988, the Canada-United States Free Trade Agreement was signed, as momentum built behind all such free trade agreements and the development of a global economy. Meanwhile, market economies had been developing within such communist nations as China, Eastern Europe, the Soviet Union, and Vietnam. In 1989, the Berlin Wall came down, the world socialist economic system disintegrated, and Eastern Europe joined the world economy. The 1980's also saw the rise of the Internet, and when commercial interests were allowed to join the network in 1985, the Internet began to evolve into another tool or medium of global capitalism. Impact As more countries became industrial producers, low-cost consumer goods were increasingly sold worldwide. Transnational corporations acquired smaller businesses in a wave of consolidations and mergers, becoming truly global in scope. The diversity of these global corporations' holdings meant that many different kinds of businesses, from manufacturing to services, were integrated inside the same corporate entity. Demand steadily increased for unskilled labor to produce enough commodities to take advantage of newly opened international markets. As a result, large rural populations in developing nations migrated to cities and to decrepit shantytowns to work in the factories that sprang up there. If there were negative consequences within the United States to the globalization of the 1980's, they were largely hidden by the celebration of Reaganomics and the "victory" of capitalism over communism in the Cold War. The U.S. economy grew tremendously during the decade, driven in part by the expansion of multinational corporations into new markets. It would not be until the following decade
that opponents of free trade would assert that it was causing the United States to lose jobs overseas, for example. By the end of the 1980’s, however, advocates of social justice had begun to voice their opposition to the establishment of an international division of labor. In addition, the U.S. multicultural movement expressed concern over what it called “cultural imperialism,” that is, the economic imposition of American culture and values on other nations through the export of American cultural commodities, especially music, television, and cinema. Further Reading
Friedman, Thomas L. The World Is Flat: A Brief History of the Twenty-First Century. New York: Farrar, Straus & Giroux, 2006. Optimistic account of globalization as leveling competition between industrial and emerging market countries; focuses particularly on how corporations in India and China have become part of global supply chains as a result of technological innovations, especially the Internet. MacGillivray, Alex. A Brief History of Globalization: The Untold Story of Our Incredible Shrinking Planet. London: Constable & Robinson, 2006. Traces the preconditions of today’s globalized world to such inventions as the magnetic compass, geometry and mathematics, Aristotelian logic, and world maps. Muller, Ronald E. “Globalization and the Failure of Economic Policy.” Challenge 18, no. 2: 57. A prophetic warning that the rise of multinational corporations invalidated orthodox economic theory, since they can use their economic power to move production to countries with minimal economic restrictions, thereby displacing workers in the industrial democracies. One of the first uses of the term “globalization.” Stiglitz, Joseph E. Globalization and Its Discontents. New York: W. W. Norton, 2003. A former chief economist at the World Bank indicts the global economic policies of the International Monetary Fund for outmoded economic theories, lack of transparency to the public, and favoring corporate interests over those of the people. _______. Making Globalization Work. New York: W. W. Norton, 2006. Solutions to such problems as the instability of the global financial system caused by America’s debt and the destruction of the environment by developing countries. Michael Haas
Go-Go’s, The
■
421
See also Canada-United States Free Trade Agreement; Cold War; Computers; Environmental movement; Foreign policy of the United States; Reagan, Ronald; Reaganomics; United Nations.
■ Go-Go’s, The Identification All-woman rock band Date Active from 1978 to 1985
The Go-Go’s were the first all-woman rock band of musical significance to be widely successful. They helped define the look and sound of the 1980’s. The Go-Go’s formed in Los Angeles in 1978. Belinda Carlisle, their lead singer, had sung briefly with a hardcore punk band called the Germs. Jane Wiedlin, their most talented songwriter, was also in the Los Angeles punk underground scene. They were joined by Charlotte Caffey, the group’s lead guitarist and keyboard player, who was some years older than its other members. After being briefly preceded by Elissa Bello, the charismatic Gina Schock (alluded to in the 1987 John Hughes-written film Some Kind of Wonderful ) became the group’s drummer in 1979. Bass guitarist Margot Olavarria left the band early, though she received some financial consideration from their later success. She was replaced by Kathy Valentine, perhaps the most underrated member of the group, who quickly became a skilled bass player. Unlike previous all-female bands that were often largely marketing vehicles, the Go-Go’s wrote their own songs and were accomplished musicians. Their debut album, Beauty and the Beat (1981), combined New Wave and surf-music influences to produce an inimitable and infectious sound. Their first hit, “Our Lips Are Sealed,” was a catchy song about a couple plagued by rumors concerning a possible affair between them. Their second album, Vacation, featured a song of the same name that epitomized a spirit of wistful yet carefree yearning. The five attractive yet quirky young women quickly became pop icons. Carlisle, in particular, became a celebrity and began to date Los Angeles Dodgers baseball prospect Mike Marshall. Though not reaching the sales heights of the previous two albums, the group’s Talk Show (1984) represented an artistic advance, illustrated not only by the upbeat single “Head Over Heels” (featured prominently as a video on MTV) but also by songs with unusually thoughtful lyrics, such as
422
■
The Eighties in America
Go-Go’s, The
The Go-Go’s attend the Grammy Awards ceremony in 1982. Visible from left: Jane Wiedlin, Charlotte Caffey, and Belinda Carlisle. (AP/ Wide World Photos)
“Beneath the Blue Sky” and “Yes or No.” Both of the latter songs were written or co-written by Wiedlin, whose evolution as a songwriter and performer led her to want to sing lead on some of the group’s efforts. As Carlisle did not play an instrument, however, the question of what role she could play in songs she did not sing was a vexatious one. Tension between Wiedlin and Carlisle, combined with serious drug use on the part of some members of the group, led Wiedlin to leave the group. She was replaced briefly by Paula Jean Brown, but the group was unable to continue, and it broke up altogether in 1985. Impact The Go-Go’s, with their fun-loving yet hardedged California sound, are such a key part of the musical legacy of the 1980’s that it is hard to remember they recorded only three albums in that decade. Despite not being overtly political, they represented
a major breakthrough for women in rock and for feminism in general during a decade often inimical to both. Further Reading
Gaar, Gillian. She’s a Rebel: The History of Women in Rock & Roll. Seattle: Seal Press, 1992.
Gehman, Pleasant. Escape from Houdini Mountain. San Francisco: Manic D Press, 2001.
Rettenmund, Matthew. Totally Awesome 80’s: A Lexicon of the Music, Videos, Movies, TV Shows, Stars, and Trends of That Decadent Decade. New York: St. Martin’s Griffin, 1996.
Nicholas Birns

See also Feminism; Music; Music videos; New Wave music; Pop music; Women in rock music; Women in the workforce.
■ Goetz, Bernhard
Identification American electrical engineer involved in a subway shooting
Born November 7, 1947

Goetz shot four young African American men whom he believed intended to rob him on a New York City subway in December, 1984. The shooting set off a national debate over crime, racial tensions, gun control, the right to self-defense, and the use of deadly force by private citizens.

On December 22, 1984, Bernhard Goetz, a self-employed electrical engineer from Queens, New York, boarded a subway on New York City’s IRT line. He sat down near four young African American men. One of these men, Troy Canty, told Goetz to give him five dollars. Canty and the others with him later claimed he was merely panhandling. Goetz, however, claimed that he believed he was being robbed. Fearing for his safety, Goetz drew an unregistered revolver from beneath his windbreaker and fired five shots, wounding all four men. All of the young men survived, but one of them, Darrell Cabey, was permanently paralyzed because a bullet severed his spine, and he also suffered brain damage. Goetz fled the subway at a subsequent stop, rented a car, and drove to Bennington, Vermont. He disposed of the gun and the windbreaker he had been wearing. On December 31, 1984, he turned himself in to police in Concord, New Hampshire, and was returned to New York City.

Bernhard Goetz is escorted to Manhattan’s Central Booking facility on March 28, 1985. (AP/Wide World Photos)

Goetz was soon labeled the “Subway Vigilante” by the media. His case attracted national attention and sparked debate over crime and the right to use deadly force when threatened, as well as the extent to which racial stereotypes did or did not affect Goetz’s perception of danger. Goetz confessed to the shooting but claimed he had acted in self-defense. He was eventually charged with seventeen counts of attempted murder and assault, but following a seven-week trial in mid-1987, he was acquitted of these charges, although he was convicted of illegal possession of a firearm. He received a one-year sentence on this charge and served eight months in jail before being released.

In 1985, a lawyer for Darrell Cabey filed a civil suit against Goetz, charging that Goetz had acted recklessly and deliberately in attacking Cabey. In 1996, the jury in this case awarded Cabey a $43 million judgment. Goetz eventually filed for bankruptcy, and since he had few assets, it was unlikely that Cabey would ever receive a substantial amount from this judgment.

Impact Goetz’s reaction to the perceived threat against him figured prominently in the national news from the time of the shootings until the end of his trial. Many saw Goetz as a heroic figure who stood up against the threat of crime. Others charged that he had vastly overreacted to a situation in which he was not seriously at risk, possibly because his perceived assailants were African American. The incident was one of several with racial overtones that took place in New York City in the 1980’s, and it added to the increasing tensions between races there and in other U.S. cities during the decade.
Further Reading
Fletcher, George P. A Crime of Self-Defense: Bernhard Goetz and the Law on Trial. Chicago: University of Chicago Press, 1990.
Lesly, Mark. Subway Vigilante: A Juror’s Account of the Bernhard Goetz Trial. Latham, N.Y.: British American Publishing, 1988.
Mark S. Joy

See also African Americans; Bonfire of the Vanities, The; Brawley, Tawana; Central Park jogger case; Crime; Do the Right Thing; Howard Beach incident; Racial discrimination.
■ Golden Girls, The
Identification Television comedy series
Date Aired from September 14, 1985, to May 9, 1992

The Golden Girls was often extremely controversial, tackling topics that, during the 1980’s, were taboo for network television—and often in society at large—including homosexuality, menopause, gun control, domestic violence, suicide, cross-dressing, HIV/AIDS, lesbianism, euthanasia, chronic fatigue syndrome, artificial insemination, and senility.

The Golden Girls was an American situation comedy, created by Susan Harris, that originally aired Saturday nights on the National Broadcasting Company (NBC) from September 14, 1985, to May 9, 1992. The show followed four older women who shared a fashionable house in Miami. Blanche owned the house; Dorothy and Rose responded to an ad seeking roommates on the bulletin board of a local grocery store. The three women were later joined by Dorothy’s mother, Sophia, when Sophia’s retirement home, Shady Pines, burned down. The show starred Bea Arthur as practical Dorothy Zbornak, Betty White as naïve Minnesotan Rose Nylund, Rue McClanahan as sexy Southern belle Blanche Devereaux, and Estelle Getty as the wisecracking Sicilian Sophia Petrillo. In many episodes, the ladies ate cheesecake at the kitchen table—a familiar set in almost every episode—as they talked about their problems or reminisced about the past.

During the seventh season, Arthur decided that she wanted to leave the series, so in the last episode of that season, her character married Blanche’s Uncle
Lucas (Leslie Nielsen). The other three protagonists, Blanche, Rose, and Sophia, continued in a spin-off series, The Golden Palace, but it lasted only one season. After the end of The Golden Palace, Getty joined the cast of another sitcom, Empty Nest, making far more frequent appearances as Sophia in the show’s final years than she had earlier as a recurring guest. During its original run, The Golden Girls received sixty-five Emmy nominations and won eleven Emmy Awards (including Outstanding Comedy Series), four Golden Globe awards, and two Viewers for Quality Television awards. Most unusually, all four lead actresses won Emmy Awards for their performances on the show.

Impact The Golden Girls combined humor, warmth, and relevance to become one of the most popular and successful television programs of the 1980’s and one of the few shows to develop new fans years after its original run. The show, featuring four women of middle age or older, was highly unusual in a time when broadcast television relied on young, attractive stars to attract viewers. In that context, The Golden Girls was as significant as The Cosby Show in diversifying the casts and audiences of network sitcoms.

Further Reading
Colucci, Jim. The Q Guide to “The Golden Girls.” New York: Alyson Books, 2006.
Mitz, Rick. The Great TV Sitcom Book. New York: Putnam, 1988.
Martin J. Manning

See also Age discrimination; Cosby Show, The; Sitcoms; Television; Women in the workforce.
■ Goldmark murders
The Event A Seattle family is murdered on Christmas Eve
Date December 24, 1985

David Rice killed a Seattle family on Christmas Eve because he mistakenly believed the family to be Jewish and part of a communist conspiracy. The murders drew attention to the growth of right-wing extremism and anti-Semitic terrorism in the United States.

On December 24, 1985, David Rice forced his way into the Seattle home of Charles Goldmark using
a toy gun. He handcuffed the forty-one-year-old Goldmark; his forty-three-year-old wife, Annie; and their two sons, Derek (twelve) and Colin (ten). He then repeatedly stabbed all four of them. Annie and Colin died immediately. Charles died shortly after arriving at the hospital, and Derek died approximately thirty-seven days after the attack.

Charles Goldmark was a graduate of Yale Law School and a prominent Democratic civil rights attorney in Seattle. He had served as a delegate for Senator Gary Hart of Colorado at the 1984 Democratic National Convention. Charles was the son of John and Sally Goldmark, who were accused of being communists in the 1960’s. The accusations ended John Goldmark’s career as a member of the Washington legislature, and in 1963 he sued his accusers for libel; they had based their statements on the fact that Sally Goldmark had been a member of the Communist Party in the 1930’s. John Goldmark won his libel suit and received a forty-thousand-dollar judgment.

Rice was apprehended two days after the attack, when authorities traced the use of Charles Goldmark’s credit cards. During the interrogation process, Rice confessed that he had been planning the murder for six months. Investigators determined that Rice was a right-wing extremist and a member of the local Seattle chapter of the anticommunist Duck Club. He learned of the Goldmark family through the Duck Club organization. Leaders of the Seattle Duck Club chapter, Homer Brand and Gene Gooseman, identified the Goldmark family as members of a communist conspiracy and provided shelter for Rice after he committed the murders.

At his arraignment, Rice pleaded not guilty by reason of insanity to four counts of aggravated first-degree murder. Rice occasionally displayed symptoms of psychosis. However, his attorney, Bill Lanning, failed to introduce any evidence at trial of his client’s psychotic symptoms. Rice was found guilty of the murders and sentenced to death.

Impact The Goldmark murders helped legislators enact more stringent punishments for hate crimes. Despite the victims in fact being neither Jewish nor communist, the crime also shed light upon the rise in anti-Semitism, as well as of hate groups and racist ideologies in general, that characterized the 1980’s.

Subsequent Events Rice appealed his conviction on the grounds of having been represented by an
ineffective counsel. In addition to Lanning’s failure to introduce evidence supporting his client’s defense, the attorney had allowed the police unlimited access to Rice. In 1997, the appellate court determined that Lanning’s defense had indeed been ineffective, and it overturned Rice’s conviction, granting the appellant a new trial. In May of 1998, Rice pled guilty in exchange for a lesser sentence of life imprisonment without the possibility of parole.

Further Reading
Amond, P. “Racist Origins of Border Militias: The History of White Supremacist Vigilantism and Tom Posey’s Civilian Military Assistance.” N.p.: Public Good Project, 2005. Available at http://www.publicgood.org/reports/vigilante_history.pdf
Robbins, J. A. T. “This Thing of Darkness: A Sociology of the Enemy.” Journal for the Scientific Study of Religion 36, no. 2 (1997): 340.
Turner, W. “Sanity of Confessed Slayer at Issue in Seattle Trial.” The New York Times, May 28, 1986.
Nicholas D. ten Bensel

See also Atlanta child murders; Crime; Jewish Americans; Terrorism; Tylenol murders.
■ Goldwater-Nichols Act of 1986
Identification Law to reorganize the United States’ military command structure
Date Signed into law on October 1, 1986

The act was the first major overhaul of the top U.S. military command in about forty years. It was an attempt to centralize authority in the chairman of the Joint Chiefs of Staff rather than continue the separation of the uniformed services’ command.

Senator Barry Goldwater and Representative William Flynt Nichols cosponsored the piece of legislation officially known as the Goldwater-Nichols Department of Defense Reorganization Act of 1986, which was passed as Public Law 433 of the Ninety-ninth U.S. Congress. The intent of the law was to force more coordination between the various branches of the U.S. armed forces in order to encourage a more efficient military. To accomplish this coordination, the law strengthened the position of the chairman of the Joint Chiefs of Staff.
Previously, this position was basically a figurehead responsible for formal communications between the Joint Chiefs and the secretary of defense or sometimes the president. Under the Goldwater-Nichols Act, the chairman was designated as the principal military adviser to the president. While the leaders of the various branches of the military might still give advice, their role was clearly subordinated to that of the chairman. The same type of unified command structure was also established for each of the various areas of military operation. Thus, U.S. troops throughout the world were assigned regional commanders. Within each region, one person became responsible for coordinating all U.S. military actions, regardless of the branches of service involved. As at the top level, the new structure replaced one in which each region had had separate commanders for each branch of the armed forces, with no clear chain of command among the branches. The new structure unified not only command and control but also planning, as the unified regional commanders assumed responsibility for making contingency plans to respond to any possible situation within their regions. As an outgrowth of the mandate to unify planning, the procurement and distribution of supplies were also unified. It was hoped that this development would reduce competition among the branches, as resources would be allocated to whichever units needed them most, regardless of branch, and new technologies would be made available to all military branches, regardless of which branch had funded their development.

Impact While the 1983 U.S. invasion of Grenada had been successful, it also produced evidence of a significant lack of coordination among the armed forces that concerned leaders in Washington. The Goldwater-Nichols Act was one of the responses to this situation. It sought to eliminate potential problems in the command structure before the United States became embroiled in a more serious conflict. By the beginning of the 1990’s, the new structure was in place, and it directly affected the military operations of that decade, most notably the Persian Gulf War.

Further Reading
Lederman, Gordon Nathaniel. Reorganizing the Joint Chiefs of Staff: The Goldwater-Nichols Act of 1986. Westport, Conn.: Greenwood Press, 1999.
Locher, James R. Victory on the Potomac: The Goldwater-Nichols Act Unifies the Pentagon. College Station: Texas A&M University Press, 2004.
Donald A. Watt

See also Cold War; Grenada invasion; Iranian hostage crisis; Military spending; Weinberger, Caspar.
■ Golf
Definition
Professional and amateur sport
The 1980’s was the most important growth decade for the sport of golf.

Researchers who analyze golf demographics and the golf industry have recognized the decade of the 1980’s as a time of important growth and development for the sport. The post-World War II baby-boom generation made golf a very popular recreational sport, spent large sums of money on newly developed equipment, and drove an expansion in the number of U.S. golf-course facilities. Americans temporarily had a greater amount of leisure time, and the improving American economy resulted in more disposable income being available for popular leisure-time activities such as golf. According to the National Golf Foundation, the number of American golfers increased from fourteen million in the 1970’s to seventeen million in the mid-1980’s, and it continued to grow steadily thereafter.

Golf Clubs During the 1980’s, the design and construction of golf clubs were significantly improved. Using principles of engineering and physics, golf-club manufacturers designed some clubs for “game improvement” (the ability of a club to compensate for a golfer’s swing error by increasing the accuracy and distance of a ball’s flight despite a poor swing). The primary characteristics of these new clubs were perimeter weighting and a larger “sweet spot” on the club face, a lower center of gravity of the club head, and improvements to the materials and flex characteristics of the club shaft. In 1988, a costly lawsuit was filed against the official rules organization, the United States Golf Association (USGA), by Karsten Manufacturing, which made Ping clubs. The square grooves on the Ping club face were so effective at increasing ball underspin and ball flight accuracy that the USGA believed
the club compromised the integrity of the game. If players were allowed to use Ping clubs, the USGA feared that skill would cease to be the primary factor determining who won a game of golf. The major courses would no longer challenge golfers, and historical comparisons between contemporary and past generations of golfers would become meaningless. The USGA therefore attempted to ban the square grooves in 1988, and the Professional Golfers’ Association (PGA) followed suit in 1989. However, both organizations eventually settled the lawsuit brought by Karsten.
Golfer Kathy Whitworth lines up a putt at the Sleepy Hole Golf Course in Suffolk, Virginia, in 1982. Whitworth was the first woman to win more than one million dollars in prize money on the LPGA tour. (AP/Wide World Photos)
Professional Golf The economy, media publicity, and substantial television coverage of the various professional tournaments and tour organizations thrived during the 1980’s. Many exciting finishes to major professional tournaments were witnessed by millions via live television coverage. The men’s PGA Tour maintained most of the public’s attention, but as a sign of popularity and growth, the Senior PGA Tour began in 1980, featuring veterans of the PGA Tour over age fifty. The new tour gave the viewing public a chance to continue watching aging superstars such as Arnold Palmer and Gary Player. The Ladies Professional Golf Association (LPGA) tour, like the men’s tour, saw continued growth in available prize money, annual number of tournaments, and television coverage. Nancy Lopez generated much public attention as a young superstar champion, and Kathy Whitworth became the first woman pro to win over one million dollars in career prize money.

The men’s professional tour experienced significant growth in both profits and popularity. Tom Watson was the most notable player in the early 1980’s, winning the British Open in 1980, 1982, and 1983; the 1981 Masters; and the 1982 U.S. Open. Watson’s victory in the U.S. Open was especially notable, as he battled and defeated the legendary Jack Nicklaus at Pebble Beach, California, holing a difficult chip shot at the par-three seventeenth hole. Nicklaus, for his part, won the last two of his eighteen major championships in the 1980’s. In 1980, he set a U.S. Open record with a score of 272 at Baltusrol, in New Jersey. In 1986, at age forty-six, on the last day of the Masters Tournament, he shot a dramatic 65 to come from behind and win. Another of the decade’s notable major championship victories occurred in 1984, when Hall of Fame golfer Lee Trevino won his sixth and final major, the PGA Championship, at Shoal Creek Country Club, in Alabama. Trevino was forty-four years old, and the runner-up was forty-eight-year-old Hall of Fame golfer Gary Player. Also in 1984, Ben Crenshaw won the Masters Tournament, which was particularly notable because he had finished second in major championships five times. Curtis Strange was the top-ranked player at the end of the decade. In 1988, he was the first player on the PGA Tour to exceed one million dollars in prize money in one season. He also won the U.S. Open consecutively in 1988 and 1989. The year 1987 was a milestone of a
different kind for the PGA Tour: By the end of that year, the organization had donated more than 100 million dollars to charitable organizations.

Impact The growth of golf in the 1980’s was fueled by the aging of the baby-boom generation, increased leisure time available to Americans, an increased number of golf course facilities, and technological improvements to golf equipment. These developments drove golf’s popularity not only as an amateur pastime but also as a professional sport, as golf aficionados also became spectators on the PGA Tour.

Further Reading
Astor, Gerald. The PGA World Golf Hall of Fame Book. New York: Prentice Hall Press, 1991. Focuses primarily on the famous players and accounts of their most famous victories in the major tournaments.
Campbell, Malcolm. The Encyclopedia of Golf. 3d ed. New York: DK, 2001. Good for research and specific information about the history of golf, the accomplishments of specific champion golfers, and the champions by year of the world’s major tournaments. Extensive facts and information about the 1980’s.
McCormick, David, and Charles McGrath, eds. The Ultimate Golf Book: A History and a Celebration of the World’s Greatest Game. New York: Hilltown Press/Houghton Mifflin, 2002. Historical information about the great players and the major tournaments. Contains extensive historical photographs from golf’s earliest years to the end of the twentieth century.
Alan Prescott Peterson

See also Sports; Watson, Tom.
■ Goodwill Games of 1986
The Event International sports competition founded by Ted Turner
Date July 4-20, 1986
Place Moscow, Soviet Union

The Goodwill Games of 1986, the inaugural Goodwill Games, were the brainchild of Atlanta mogul Ted Turner, who sought a venue for increasing goodwill between the world’s superpowers. The competitions were to be aired on TBS, Turner’s Atlanta-based television superstation.
The Goodwill Games were organized as a response to the United States’ boycott of the 1980 Olympic Games in Moscow and to the Soviet Union’s refusal to participate in the 1984 Olympics in Los Angeles. Despite the fact that Ted Turner bypassed the U.S. Olympic Committee (USOC) in his plans, he and the USOC later reached an agreement that future Goodwill Games should not become alternatives to the Olympic Games but should focus on U.S.-Soviet competition. Turner believed that the world’s top-notch athletes should be able to come together in an environment free of the political pressures that had marred the 1980 and 1984 Olympic Games. Eleven feverish months went into planning the event, in which athletes from the United States and the Soviet Union competed together on the same playing field in a major international multi-sport summer event. In response to Turner’s initial proposal, the Soviets had suggested the event be limited only to American and Soviet athletes. Turner insisted, however, that other countries be allowed to participate. Following the Soviets’ agreement to this condition, Turner helped recruit the Western team, giving $6 million to the Athletics Congress, which, in turn, paid top American athletes $3,000 each to compete. Turner envisioned star athletes participating in track-and-field events, swimming, boxing, volleyball, and figure skating. The Soviets accommodated the games by building a huge studio for TBS visitors, doubling their police force, and posting banners that promoted sports, friendship, and peace. Soviet leader Mikhail Gorbachev insisted that the message of the games was to be friendship, and Moscow received a huge influx of American visitors flocking to witness the event.

The Games Following spectacular opening ceremonies, the Goodwill Games began. During these games, six world, eight continental, and ninety-one national records were broken. On the opening day, Soviet swimmer Vladimir Salnikov set a new record of 7 minutes 50.64 seconds in the 800-meter freestyle. Soviet pole vaulter Sergei Bubka broke his own world record with a vault of 19 feet 8¾ inches. In the women’s basketball finals, the United States, led by Cheryl Miller, defeated the Soviet team, ending a 152-game, twenty-eight-year Soviet winning streak. Brazil won the bronze medal. In other notable events, American Jackie Joyner-Kersee compiled 7,148 points in the heptathlon to
become the first American woman since Babe Didrikson Zaharias to hold the world record in a multiple-discipline event. Edwin Moses, two-time Olympic gold medalist, was awarded the Goodwill Games gold medal in the 400-meter hurdles. Sprinter Evelyn Ashford, the holder of the U.S. world record, won the 100-meter dash, and U.S. high jumper Doug Nordquist achieved a personal best of 7 feet 8 inches to secure his first win over Soviet world-record holder Igor Paklin. In the cycling competition, new world records were set by Michael Hubner of the German Democratic Republic at 10.2444 and Erika Salumae of the Soviet Union at 11.489. Meanwhile, led by Yuri Korolev and Yelena Shushunova, the Soviet Union swept the gold medals in both the individual and team gymnastics competitions. The rhythmic gymnastics event was won by Bianka Dittrich of the German Democratic Republic. In the boxing competition, the Soviet Union won eleven of the twelve gold medals, with the United States’ Arthur Johnson being the only non-Soviet to win a gold medal.

The 1986 Goodwill Games featured thirty-five hundred athletes from seventy-nine countries who, in an invitational format, participated in eighteen sports. Turner’s superstation in Atlanta, along with other outlets, beamed 129 hours of coverage to American households. The games continued in later years, beginning with the 1990 Goodwill Games in Seattle, Washington. Staged more for diplomatic than for financial reasons, however, the games never succeeded financially. The 1986 games lost $26 million, setting a precedent of losses that continued until the event was finally terminated in 2001.

Impact Despite Turner’s loss of money, he continued to stage Goodwill Games for the remainder of the century, maintaining that his project was not about money. A longtime student of history, Turner envisioned himself working outside official networks and accomplishing something of importance. He maintained that his Goodwill Games provided a fresh, exciting meeting ground for athletes, free of political pressure, at which they could measure themselves against one another in a major international sports competition. The games also evolved into charitable ventures, supporting an organization called “Uniting the World’s Best” that provided assistance to children and mothers in developing countries through contributions to the United Nations
Children’s Fund (UNICEF) and to the Boys and Girls Clubs of America.

Further Reading
Goldberg, Robert, and Gerald Jay Goldberg. Citizen Turner: The Wild Rise of an American Tycoon. New York: Harcourt Brace, 1995. Interesting look at the more personal aspects of Ted Turner’s life.
Harrington, Geri. Jackie Joyner-Kersee. New York: Chelsea House, 1995. Fascinating story of the gifted athlete who suffered from asthma, was prevented by the Olympics Committee from taking her prescribed medication, and yet became a four-time Olympic champion.
Senn, Alfred E. Power, Politics, and the Olympic Games. Champaign, Ill.: Human Kinetics, 1999. Account of the Olympic Games since their beginning in 1896 and the political and social issues surrounding them.
Mary Hurd

See also Cold War; Lewis, Carl; Olympic boycotts; Olympic Games of 1980; Olympic Games of 1984; Olympic Games of 1988; Soviet Union and North America; Sports; Turner, Ted.
■ Grant, Amy
Identification Contemporary Christian and pop singer-songwriter
Born November 25, 1960; Augusta, Georgia

Besides being the best-selling Contemporary Christian recording artist of all time, Amy Grant became the first such performer to make the transition to the mainstream without sacrificing her audience.

By the time her 1984 album Straight Ahead peaked at number 133 on Billboard magazine’s album charts, Amy Grant had already become a legend in Contemporary Christian music. Grant had released her first album when she was just seventeen, and each of the next five had demonstrated an increasing musical sophistication and had sold better than the one before. Her Age to Age (1982) was the first Contemporary Christian album to be certified platinum. Grant thus found herself faced with a choice. She could either continue her career as it was, marketing herself to a niche audience, albeit a sizable one, or she could attempt to cross over to the mainstream while still
preserving her artistic and religious integrity. The latter task would be difficult: No Contemporary Christian act had yet proved palatable to the Top 40 masses. It was into this atmosphere of anticipation that Grant released her 1985 album Unguarded, the first to be released simultaneously on both a gospel and a mainstream label (Myrrh and A&M Records, respectively). The album’s release with four different covers emphasized the seriousness behind the hopes for its success. Enhanced by what was then the brightest production of Grant’s career, Unguarded highlighted her upbeat disposition while downplaying the overt Christianity for which she had become famous. Songs such as “Love of Another Kind,” “Everywhere I Go,” and the album’s hit single “Find a Way” (accompanied by an MTV video) were crafted to evoke thoughts of romantic love on pop radio and thoughts of divine love on Christian airwaves. The obviousness of the strategy was not lost on Grant’s core audience—a portion of which accused her of “selling out”—or on secular critics, many of whom dismissed Unguarded as evangelism in pop clothing. Nevertheless, the album sold well enough and generated enough positive attention to establish Grant’s credibility as a mainstream performer. She solidified her new standing in December, 1986, when her duet with former Chicago lead singer Peter Cetera, “The Next Time I Fall,” became the number-one song in the United States. A track from Cetera’s album Solitude/Solitaire, the song was produced by Michael Omartian, who, as the producer of hit gospel albums by the Imperials and hit pop albums by Donna Summer and Rod Stewart, was in many ways a perfect match for Grant’s own sacred-secular sensibilities.

Impact In abolishing the wall separating Contemporary Christian music and secular pop, Grant made possible the 1990’s crossover success of Christian acts such as Michael W. Smith, Kathy Troccoli, Bob Carlisle, and Sixpence None the Richer, as well as the quadruple-platinum sales of her own 1991 album Heart in Motion.

Further Reading
Millard, Bob. Amy Grant: The Life of a Pop Star. New York: St. Martin’s Griffin, 1996.
Powell, Mark Allan. Encyclopedia of Contemporary Christian Music. Peabody, Mass.: Hendrickson, 2002.
Arsenio Orteza

See also MTV; Music; Pop music; Religion and spirituality in the United States.
■ Grenada invasion
The Event U.S. forces invade Grenada to rescue threatened Americans and topple a pro-Soviet regime
Date October 25-December 15, 1983
Place Grenada island, Grenadines chain, Caribbean Sea

With the invasion of Grenada, the United States demonstrated its willingness to remove by force procommunist regimes, especially those near its borders. The invasion sent a clear message to the Soviet Union that President Ronald Reagan intended to confront the Soviets in a wider Cold War arena.

Originally a long-term British possession in the Caribbean, Grenada attempted to position itself outside of the U.S. sphere of influence. In 1979, a pro-Soviet coup led by Maurice Bishop toppled the British-established government, and Bishop’s regime began to receive support from the Soviet Union and Cuba. This support included large caches of Soviet weapons and assistance by Cuban engineers to construct a large airport at Point Salines, on the southwest corner of the island. President Ronald Reagan, sensitive to Soviet threats to the United States, believed that the airport was intended to serve a military function, specifically to support Cuban forces aiding pro-Soviet causes in Africa. The Grenadian government claimed the airport was intended to facilitate the tourist trade. Tensions on the island increased after October 13, 1983, when Bernard Coard, an ardent communist who believed that Bishop was not actively pursuing an authentically Marxist agenda, seized power. In subsequent days, Coard’s forces executed Bishop and suppressed pro-Bishop protests with deadly force.

The main official U.S. interest in Grenada was the safety of the approximately six hundred American citizens on the island, most of them medical students at St. George’s University, which was located near the airport under construction at Point Salines. Concerned about regional stability, on October 22 the Organization of Eastern Caribbean States (OECS) asked the United States to intervene in Grenada. The Department of Defense hastily made plans to
invade the island, using airborne troops, special forces, and Marines diverted from an expeditionary force that had been headed for Lebanon. The United States Invades
On October 25, 1983, U.S. forces invaded Grenada in the largest U.S. military operation since the end of the Vietnam War. Code-named Operation Urgent Fury, the invasion was under the overall command of Admiral Wesley L. McDonald, commander of the U.S. Atlantic Fleet. Approximately seven thousand U.S. troops, along with three hundred OECS soldiers, began to land on the island under the cover of darkness. Army Rangers and Special Forces arrived first to seize key positions and protect U.S. citizens. Paratroopers from the Eighty-second Airborne Division landed at Point Salinas to seize the airport, while elements of the Second Battalion, Eighth Marines landed by helicopter on the east side of the island from amphibious warships offshore. These initial troops were rein-
Grenada invasion
■
431
forced by two battalions of soldiers airlifted from the United States and by additional Marines from the Twenty-second Marine Amphibious Unit (MAU). U.S. forces could receive combat support from Marine helicopters, naval gunfire from several destroyers and frigates, and air support from the aircraft carrier U.S.S. Independence. Facing the U.S. forces were approximately fifteen hundred Grenadian troops and six hundred Cubans. The Cubans were mostly engineers, but they did arrive on the island armed with standard infantry equipment and some heavy weapons. The Grenadian forces amounted to little more than a disorganized, but heavily armed, militia. Grenadian and Cuban resistance was sporadic in some places and stiff in others. Marines landing by helicopter faced opposition, losing two Cobra attack helicopters, while the Army Rangers tasked with taking the Point Salinas airport found the runway blocked and Cuban forces prepared for a landing. In
Members of the Eighty-second Airborne fire artillery during the U.S. invasion of Grenada. (U.S. Department of Defense)
432
■
The Eighties in America
Gretzky, Wayne
other areas, however, U.S. forces could exploit the weak resistance to overrun opposition and accomplish key objectives, including the rapid evacuation of U.S. citizens from the island. Not all things went smoothly, as the different service branches found it difficult to coordinate action through their separate communications systems, and directing air power proved difficult. Locating isolated pockets of Grenadian militias took time, but the process was aided by local civilians, who generally welcomed the American intervention. Major resistance to the invasion ended after three days, but it took until November 2 for U.S. forces to eliminate all opposition and capture all objectives. By mid-December, the Grenadians had established an interim government led by Governor-General Paul Scoon, the local representative of the British Crown, and an advisory council chaired by Nicholas Brathwaite, with support from the OECS. The last American forces left the island on December 15, 1983. Following elections in December, 1984, a permanent government was established, led by Prime Minister Herbert Blaize.

Impact The invasion sent a clear signal to the Soviet Union and its allies that the United States intended to wage a much more aggressive Cold War against its communist enemies. Some commentators believed the Grenada invasion, along with other small military operations during Reagan’s presidency, had ended the “Vietnam syndrome” that had hindered America’s willingness to use military power as a foreign policy option since the end of the Vietnam War. The lack of organization and communication among the branches of the U.S. Armed Forces alarmed Congress, however, and it resulted in the passage in 1986 of the Goldwater-Nichols Act, which established a unified U.S. military command structure.

Further Reading
Burrowes, Reynold A. Revolution and Rescue in Grenada: An Account of the U.S.-Caribbean Invasion. New York: Greenwood Press, 1988. The first major account of Operation Urgent Fury to include interviews with the commanders and planners of the operation.
Cole, Ronald H. Operation Urgent Fury: The Planning and Execution of Joint Operations in Grenada, 12 October-2 November 1983. Washington, D.C.: Joint History Office, Office of the Chairman of the Joint Chiefs of Staff, 1997. The only major work on the invasion that includes detailed accounts of the pre-invasion planning using official declassified documents.
O’Shaughnessy, Hugh. Grenada: An Eyewitness Account of the U.S. Invasion and the Caribbean History That Provoked It. New York: Dodd, Mead, 1985. One of the first major works on the invasion, the book features the experiences of many participants.
Steven J. Ramold

See also Cold War; Foreign policy of the United States; Goldwater-Nichols Act of 1986; Reagan, Ronald; Reagan Doctrine.
■ Gretzky, Wayne
Identification Canadian hockey player
Born January 26, 1961; Brantford, Ontario
Gretzky, known as the “Great One,” dominated his sport for more than twenty years; his greatest success came in the 1980’s.

By 1988, Wayne Gretzky was easily the most famous hockey player in the world, and he was reputed by some to be the greatest ever to play the game. Thus, when Gretzky was traded in that year from the Edmonton Oilers to the Los Angeles Kings, the way was paved for an unprecedented spike in the popularity of professional hockey in Southern California. Within a short period of time, the National Hockey League (NHL) expanded into multiple warm-weather American cities, including Anaheim, California. Without Gretzky’s successful stay in Los Angeles, that likely would not have happened.

Gretzky’s reputation as the Great One was cemented well before that 1988 trade. He began his career with the Edmonton Oilers and led that team to Stanley Cup titles in 1984, 1985, 1987, and 1988. Beginning in 1980, he won the league’s Most Valuable Player (MVP) award eight straight times. One year later, he began a streak of seven straight years in which he led the league in scoring. During the 1981-1982 season, he scored an amazing 92 goals and 212 points, becoming the first player to break the 200-point barrier. He finished the 1985-1986 season with 215 total points, including 163 assists.

It was just two months before the beginning of the 1988-1989 season when Gretzky’s trade to Los Angeles was announced. Edmonton fans were upset, as
were Canadians generally, who disliked the idea of their greatest player transferring to a country where hockey was much less of a national passion than it was for them. The Oilers’ owner, Peter Pocklington, defended the controversial deal, saying that he could no longer afford to keep the championship-winning Oiler team together. Cynics said that Gretzky’s wife, actress Janet Jones, also must have had some influence in the decision: Her acting career stood a much better chance of success in Hollywood than it did in Alberta. Gretzky broke down and cried as he said his good-byes in Edmonton. By the time he reached Los Angeles later that day for another press conference, he was all smiles. He wrapped up the 1980’s and his first season in Los Angeles by winning another MVP award, even though Pittsburgh’s Mario Lemieux finished the season with more points. In the postseason, Gretzky led the Kings to a stunning playoff win over his former team.

Impact Wayne Gretzky is considered by many to be the best ever to have played in the National Hockey League. When he retired in 1999, he held more than sixty regular-season, playoff, and All-Star Game records. During the 1980’s, Gretzky dominated professional hockey, Canada’s national pastime. His trade to Los Angeles in 1988 angered Canadians, but it also significantly expanded the visibility and appeal of hockey in the United States. More Americans followed the sport, more money came into the NHL, and several more teams joined the league, arguably as a result of Gretzky’s skill on the ice.

Further Reading
Messier, Mark. Wayne Gretzky: The Making of the Great One. New York: Beckett, 1998. Gretzky’s family, friends, and opponents discuss his legacy and his greatness.
Podnieks, Andrew. The Great One: The Life and Times of Wayne Gretzky. New York: Triumph Books, 1999. A comprehensive look at Gretzky and the impact he had on his sport.
Anthony Moretti

See also Canada and the United States; Hockey; Lemieux, Mario; Sports.
■ Griffith-Joyner, Florence
Identification American track-and-field athlete
Born December 21, 1959; Mojave Desert, California
Died September 21, 1998; Mission Viejo, California
Florence Griffith-Joyner’s performances on the track set a new standard of excellence for women in the 100-meter and the 200-meter dash and earned her the title of fastest woman in the world. Although she was invited to the Olympic trials in 1980, Delorez Florence Griffith just missed earning a spot on the 200-meter dash team. However, in 1982 she won the National Collegiate Athletic Association (NCAA) 200-meter championship in 22.39 seconds, running for the University of California, Los Angeles (UCLA). A year later, she won the 400-meter race at the same event. At the 1984 Olympic trials, she earned membership on the U.S. Olympic team and was dubbed “Fluorescent Flo” because of the stylish, brightly colored bodysuits she wore. It was at the 1984 Olympics in Los Angeles that the world first came to know Griffith as a runner. She won a silver medal in the 200-meter dash with a time of 22.04 seconds, but she was not allowed to compete on the U.S. relay team because of the ultra-long fingernails she sported. Coaches thought that her nails might prevent her from passing the baton smoothly. Although she was deprived of the opportunity to win her first gold medal as part of the relay team, her distinctive style made an impression. In 1985 and 1987, Griffith ran the fastest times in the United States in the 100-meter dash. After marrying 1984 Olympic gold medalist Al Joyner in 1987, she changed her name to Florence Griffith-Joyner and received the nickname “Flo-Jo.” That year, she came in second in the 200-meter dash at the World Championships in Rome, winning a silver medal, but she also won a gold medal there as part of the 4 × 100-meter relay team. It was 1988 that was to be Griffith-Joyner’s stellar year. At the Olympic trials in Indianapolis, Indiana, on July 16, 1988, she set a world record of 10.49 seconds in the 100-meter dash. At the same event, she set a U.S. record of 21.77 seconds in a 200-meter heat. Later that year, at the Summer Olympics in Seoul, South Korea, she set a new world record of 21.34 seconds in the final race of the 200-meter dash
competition. She won gold medals in the 100-meter dash, the 200-meter dash, and the 4 × 100-meter relay race, as well as a silver medal as part of the 4 × 400-meter relay team, bringing her total medal count to three gold and two silver medals. These performances won her the Associated Press Female Athlete of the Year award, the Sullivan Award for the top amateur athlete, and the Jesse Owens Award as outstanding track-and-field athlete of the year. However, in February, 1989, she announced her retirement.

Impact Griffith-Joyner’s world records in the 100-meter dash and the 200-meter dash lasted into the
early twenty-first century. Her performance at the 1988 Olympics showcased her incredible talent, while the fact that she had improved markedly over her 1980 and 1984 performances demonstrated the importance of hard work and perseverance in athletics. She thus became a role model for young athletes everywhere.

Further Reading
Aaseng, Nathan. Florence Griffith Joyner. Minneapolis: Lerner, 1989.
Condon, Robert J. “Delorez Florence Griffith-Joyner.” In Great Women Athletes of the Twentieth Century. Jefferson, N.C.: McFarland, 1991.
Griffith-Joyner, Florence, John Hanc, and Jackie Joyner-Kersee. Running for Dummies. Hoboken, N.J.: Wiley, 1999.
Susan Love Brown

See also African Americans; Lewis, Carl; Olympic Games of 1984; Olympic Games of 1988; Sports.
Florence Griffith-Joyner crosses the finish line to win the women’s 100-meter event at the 1988 Summer Olympics. (Hulton Archive/Getty Images)

■ Guns n’ Roses
Identification Los Angeles-based hard rock band
Date Formed in 1985

Guns n’ Roses entered a 1980’s musical landscape that was typically characterized by the regular use of synthesizers, heavy studio production, and pop sensibilities, even in such nominally rough-edged genres as heavy metal. The band’s raw, live musicianship coupled with its hard-living, rock-and-roll attitude made it stand out, and it became a center of musical controversy, both critically and culturally.
In the middle of the 1980’s, L.A. Guns member Tracii Guns and Hollywood Rose singer Axl Rose founded a new band in Los Angeles and named it Guns n’ Roses, after themselves. The name also stood in their minds for the symbolic meeting of the human-made and the natural, the destructive and the beautiful. Guns himself did not remain in the band for very long, but the name stuck, and in 1987 Guns n’ Roses released their debut album, Appetite for Destruction. Though the band’s lineup would change over the years, Guns n’ Roses’ 1987 roster comprised Rose on vocals, lead guitarist Slash, rhythm guitarist Izzy Stradlin, bassist Duff McKagan, and drummer Steven Adler. Appetite for Destruction was one of the most popular albums of the decade, and many observers attributed its popularity
to the public’s hunger for “real” rock music, as opposed to Top 40 pop. Guns n’ Roses’ songs were rawer in sound and more serious, honest, and emotional lyrically than those being played on most radio stations in the mid-1980’s. The group’s album featured twelve songs, three of which became national hits: “Welcome to the Jungle,” “Paradise City,” and the love ballad “Sweet Child o’ Mine.” Rose’s nearly operatic vocal range and Slash’s soaring guitar solos decorated songs about the hard life the band’s members knew as struggling musicians in Los Angeles. Their songs were often laced with obscenities and referred unabashedly to sex, drugs, and alcohol. Their follow-up album, Lies (1988), was the only other one they released during the 1980’s. It treated similar themes and included the popular and lighter “Patience.”

Though Guns n’ Roses’ music was the group’s main claim to fame, the band’s notorious debauchery, the sexual imagery of its album art, and Rose’s public antics often placed band members in the public eye as well. Known for being a difficult performer, Rose often showed up hours late, walked off stages in the middle of performances, and made comments considered rude, misogynistic, and even racist. Drug problems led to Adler’s dismissal, while debates within the band would eventually cause its dissolution.

Guns n’ Roses star Axl Rose. (Hulton Archive/Getty Images)
Impact Guns n’ Roses continued to be the center of controversy for many years and recorded three records in the 1990’s, but it was with 1987’s Appetite for Destruction that they made their most significant mark on music and on culture. Considered vulgar and loud by detractors and the saviors of authentic, powerful rock and roll by fans, Guns n’ Roses was one of the most influential and important musical forces of the 1980’s.
Further Reading
Stenning, Paul. Guns n’ Roses: The Band That Time Forgot. London: Chrome Dreams, 2005.
Wall, Mick. Guns n’ Roses: The Most Dangerous Band in the World. New York: Hyperion, 2004.
Lily Neilan Corwin

See also Bon Jovi; Heavy metal; Mötley Crüe; MTV; Music; Music videos; Pop music.
H

■ Haig, Alexander
Identification Secretary of state under Ronald Reagan from January 22, 1981, to July 5, 1982
Born December 2, 1924; Philadelphia, Pennsylvania

Haig was a vocal and controversial figure who brought significant political baggage to the Department of State when he took its helm in 1981. He set the foreign-relations tone for the first eighteen months of Ronald Reagan’s presidency and ultimately resigned when the administration sought a new voice to represent it to the world.

Alexander Haig’s nomination as Ronald Reagan’s secretary of state raised eyebrows in Washington and around the world as soon as it was announced. The former chief of staff to Richard M. Nixon was considered a controversial figure whose military background and forceful personality made him a curious choice, despite his recent experience as commander of North Atlantic Treaty Organization (NATO) forces. Since he had retired from the military only two years earlier, in 1979, his confirmation required the U.S. Senate to grant an exception to a law requiring officers to be retired for at least five years before serving as head of the Department of State.

Pundits who predicted a rocky tenure for Haig soon proved correct. On more than one occasion, he managed to upstage other cabinet officers and even the president with his bold pronouncements and demand for the spotlight. Haig’s most memorable gaffe occurred shortly after President Reagan was shot by John Hinckley, Jr., on March 30, 1981. Vice President George H. W. Bush was not in Washington at the time. To calm concerns that Americans and the country’s allies might have about the functioning of government while the president was undergoing surgery, Haig agreed to address the media. His pronouncement that “I am in charge here” was intended to indicate that until Bush could reach the White House, Haig would handle affairs in the executive branch. Unfortunately, he expounded on his
Haig, Alexander. Inner Circles: How America Changed the World—A Memoir. New York: Time Warner, 1992. Morris, Roger. Haig: The General’s Progress. New York: Playboy Press, 1982. Strober, Deborah, and Gerald Strober. The Reagan Presidency: An Oral History of the Era. Rev. ed. Washington, D.C.: Brassey’s, 2003. Laurence W. Mazzeno See also
Bush, George H. W.; Cold War; Elections in the United States, 1980; Elections in the United
The Eighties in America
States, 1988; Foreign policy of the United States; Kirkpatrick, Jeane; Reagan, Ronald; Reagan assassination attempt; Shultz, George P.; Soviet Union and North America.
■ Hairstyles Definition
Styles or manners of arranging the
hair Several attention-getting hairstyles became popular during the 1980’s. Some reflected the luxury and lavishness of the decade, while others reflected the desire of alternative cultures to resist and stand out from the mainstream. Diana, princess of Wales, was one of the people who influenced popular styles of the 1980’s. She wore her hair in a slick bob with a defined back neckline contour, a style that could be varied (combed forward or backward, for example) and one that was emulated throughout the decade. Television shows featuring the affluent and elegant had an influence on women’s hairstyles for the first time, although televisions had been in American homes since the 1950’s.
Hairstyles
■
437
hair mousse and hair gels, along with the old standby hair spray, helped shape wayward tresses. However, the styled look of these products often lasted only one day, causing women to start washing their hair daily. In addition to shows about the wealthy, detective shows were also successful, particularly Miami Vice. Actor Don Johnson, one of the leads in the show, inspired men to adapt the fashion of beard stubble, or “five o’clock shadow,” at all hours. The mullet was another popular hairstyle of the time, and although the cut could be varied, it consisted mainly of short hair on the sides and long hair in back. Mullets were popular in suburban and rural areas among working-class men. This trend contrasted with the conservative look favored by male business professionals, whose groomed, short hair remained part of the business uniform. Another hairstyle fashionable in the 1980’s was the Afro, first introduced by African Americans and then taken up by both men and women of European descent. The
Adult Hairstyles
The television characters that most influenced American hairstyles were the fictitious Carringtons of Dynasty and the Ewings of Dallas, who stood for many Americans as symbols of financial achievement. Once they appeared on the screen, the overblown long hairstyles worn by the women in these evening soap operas became the rage. Krystle, a character on Dynasty played by Linda Gray, had a signature style consisting of platinum blond hair cut in a long, straight, bob, with bangs feathered back from her face. Usually, Krystle wore her hair down, but sometimes it was swept up and puffed out from the sides of her face. For Dynasty watchers, Krystle’s hairstyle was the perfect example of how to look in the 1980’s. So-called big hair was the trend of the decade, even for women with short hair. Working women, who had to look tidy at the office, gave up hair rollers and opted for blow dryers and finger-shaping with wax as the favorite way to finish hair, or they wore long hair tied back. Long, sleek, and perfectly straight hair was another office look. Women’s hairstyles became increasingly long in the last half of the 1980’s, with the domination of blunt cuts that were worn straight across the back. Many new products, such as
The style worked for both short and long hair. Naturally curly long or medium hair or curled “permanent” styles were also popular. A corollary to the Afro’s popularity emerged in hairstyling techniques that had started in the previous decade and lasted throughout the 1980’s: a set of embellishments to the Afro that included the art of hair weaving, dreadlocks, cornrows, and hair plaiting. Hairstyling among African diaspora stylists often exhibits the transmission and readaptation of Old World styles to the New World, styles that then permeate New World culture.

Popular singer and actress Cher wears her hair in a distinctively 1980’s “big” style in 1985. (Hulton Archive/Getty Images)

Youth Styles
Pop star Madonna was known as the Material Girl, and many teenage girls and slightly older women copied her fashion look. They had to vary their style, as Madonna changed hairstyles frequently, although she usually wore her hair long. She wore it with a slightly raised crown and curls along her shoulders; perfectly straight with a variegated razor cut; in a long, very sleek bob; and in a straight, funky style below shoulder level, with razored layers of short hair near her face and longer hair at the sides and back falling to her breasts. Her hairstyles were the trendiest of the decade. Varied hair colors, made popular by pop singer Cyndi Lauper, induced many young women to experiment with dyeing their hair, and some hair products allowed these women to change their hair color as often as every day.
Another look adopted by the young was “street style,” also known as punk. It grew from the Goth look, which represented a romantic vision of life shadowed by death. Goths wore black clothes, white pancake makeup, and dyed black hair. The black hair was teased upward as far as it would go or gelled flat with a shaved or painted widow’s peak. The slang word “punk” can mean inferior or worthless, and the style arose as a revolt against the unstyled, free-flowing hair of the hippie generation. Hair was the most important feature of the punk look. The scalp was often shaved except for a “Mohawk” strip of hair running from the forehead to the nape. This strip of hair could be bleached, dyed a bright color, and then gelled into a tall fan that was startling in appearance.
Impact The excesses of the 1980’s were soon tempered, as the stock market plummeted temporarily late in the decade and new environmental concerns dominated the media and seeped into people’s consciousness. The “If you’ve got it, flaunt it” attitude of the 1980’s was replaced by the “Less is more” slogan of the 1990’s. The decade of flamboyance could not last and evolved into a simpler, less showy one.

Further Reading
Cunningham, Patricia A., Heather Mangine, and Andrew Reilly. “Television and Fashion in the 1980’s.” In Twentieth-Century American Fashion, edited by Linda Welters and Patricia A. Cunningham. New York: Berg, 2005. Detailed discussion of the powerful impact television shows had on the styles, including hairstyles, of the 1980’s.
Lewis, Lisa. “Consumer Girl Culture: How Music Video Appeals to Girls.” In Television and Women’s Culture: The Politics of the Popular, edited by Mary Ellen Brown. London: Sage, 1990. Discusses the influence of MTV on girls’ fashion and hairstyles.
Panati, Charles. Panati’s Parade of Fads, Follies, and Manias. New York: HarperCollins, 1991. Arranged by decades; includes trendy hairstyles of the 1980’s.
Sheila Golburgh Johnson
See also Consumerism; Fads; Fashions and clothing; Flashdance; Lauper, Cyndi; Madonna; MTV; Teen films.
■ Halley’s comet
The Event Astronomical body passes near Earth
Date 1985-1986
The return of Halley’s comet to the inner solar system during the mid-1980’s resulted in the first satellite fly-by explorations of a comet.

Halley’s comet (Comet 1P/Halley) is the brightest and best known of all periodic comets, making a complete orbit of the Sun and returning to Earth’s vicinity every seventy-six years. Detailed observations of Halley’s comet have been recorded since 240 b.c.e. The comet is named for Edmond Halley (1656-1742), who was the first to recognize that the various objects witnessed in the sky at seventy-six-year intervals were actually the same object. The most recent return of Halley’s comet to the inner solar system was between autumn, 1985, and spring, 1986. During March and April, 1986, Halley’s comet could be viewed with the naked eye, if the observer was well south and far from urban lights.
For most earthbound observers, however, viewing conditions were unfavorable. Despite the difficulty of seeing the comet from Earth, the comet’s return was extremely rewarding to scientists. In an excellent display of international cooperation, a massive effort was undertaken to study the comet. Detailed earthbound observations of Halley’s comet on its return began in 1982. By 1985, an international fleet of six spacecraft had been launched toward the comet. The spacecraft were to fly by the comet and send back immense quantities of data about its composition and structure, as well as its interaction with the solar wind. The spacecraft included the Soviet Vega 1 and 2 probes, the Japanese Suisei and Sakigake probes, the European Space Agency’s Giotto probe, and the United States’ International Cometary Explorer (ICE). Additional observations were made by spacecraft orbiting Earth and Venus. Data on the returning comet from all astronomical disciplines were coordinated by the International Halley Watch and archived at the Jet Propulsion Laboratory in Pasadena, California.
The six spacecraft encountered Halley’s comet between March 6 and March 25, 1986. The Vega and Giotto probes made the closest approaches to Halley’s comet. Acting as pathfinders, the Vega probes surveyed the comet first from distances of over 5,000 miles. Vega data were used to correct Giotto’s course as it closed to within 375 miles of the comet on March 14, 1986. Using remote sensing techniques, Giotto imaged the nucleus of Halley’s comet. The nucleus was found to be an irregular potato-shaped body 9 miles long and 5 miles wide with a density of 0.3 g/cm³. The nucleus’s surface was pockmarked by impact craters and covered with an almost-black crust. The comet was found to be ejecting three tons of material per second from its surface. The comet ejecta was 80 percent water, 10 percent carbon monoxide, and 2.5 percent carbon dioxide, with the remainder made up of ammonia and methane.
Impact Science’s understanding of comets was greatly advanced by the data obtained from the spacecraft flybys. These data changed astronomers’ conceptual models of cometary structure and evolution. Halley’s comet will next return in 2061, when it will offer even less favorable viewing conditions than those of 1985-1986.
Further Reading
Brandt, John, Malcolm Niedner, and Jurgen Rahe. The International Halley Watch Atlas of Large-Scale Phenomena. Boulder: University of Colorado Press, 1992.
Ottewell, Guy, and Fred Schaaf. Mankind’s Comet. Greenville, S.C.: Astronomical Workshop, 1985.
Schaaf, Fred. Comet of the Century. New York: Copernicus, 1997.
Randall L. Milstein
See also Astronomy; Science and technology; Space exploration.
■ Handmaid’s Tale, The
Identification Dystopian novel
Author Margaret Atwood (1939-    )
Date Published in 1985
By projecting into the future the conflicting political and cultural concerns of the troubled 1980’s, Canadian author Margaret Atwood envisioned a bleak and chilling result: the America of The Handmaid’s Tale.

Set in the late twentieth century in Cambridge, Massachusetts, The Handmaid’s Tale (1985) portrays a world in which an extreme fundamentalist Christian group has overthrown the U.S. government and assassinated its leaders. The new theocratic government has rescinded the Constitution and established a new nation called the Republic of Gilead. A theocracy founded upon a literal interpretation of the Bible, the government of Gilead endorses racial and religious intolerance: Jews are forced to convert or emigrate to Israel, and the so-called Children of Ham (people of African descent) are relocated to the wilds of North Dakota. At the same time, women are relegated to rigid social classes—identified by color-coded habits reminiscent of religious orders—and their rights are completely eliminated. Abortionists, homosexuals, Quakers, and other religious dissidents are publicly executed in the name of traditional values, and their bodies are displayed on the Wall at Harvard University, which has become the headquarters of the secret police.
In the novel, the American birthrate has dropped precipitously because of pollution, pesticides, or radiation poisoning. As a result, procreation is severely restricted and ritualized.
Only men in authority are allowed to marry, but they are usually older and frequently sterile; their wives have difficulty conceiving, so a class of women called Handmaids is established. These healthy young women, who have all successfully borne children in the past, are responsible for perpetuating the human race. Following the biblical example of Rachel and her handmaid Bilhah in a bizarre extension of surrogate motherhood, a Handmaid must serve her Commander and his barren Wife as the vessel that they hope will produce a child, which then belongs to the Wife.
Government censorship forbids women from writing or reading anything (including the Bible), holding property, or retaining their own identity. There is no free press; radio and television stations broadcast only religious programs. Money has disappeared. Credit cards still exist, but women’s cards have been summarily canceled. Unwomen (feminists like the protagonist’s mother) are exiled to the colonies, where they will clean up toxic waste until it kills them. The novel concludes with an appendix that reveals that the Republic of Gilead no longer exists: The entire narrative forming the bulk of the text turns out to be a historical document being studied by members of a later, presumably more enlightened society.

Margaret Atwood. (The Washington Post; reprinted by permission of the D.C. Public Library)

Impact The Handmaid’s Tale offered a satire on bigotry, nightmarish political repression, and antifeminist ideals. Atwood denied that she was attacking Christianity itself, claiming instead to trace a historical pattern of religious persecution within the United States, as well as the extremism that caused well-intentioned people to do terrible things. Although the somewhat sanitized film version, released in 1990, was not particularly well received, this controversial novel was widely praised as a feminist version of George Orwell’s Nineteen Eighty-Four (1949).

Further Reading
Bloom, Harold, ed. Margaret Atwood’s “The Handmaid’s Tale.” Philadelphia: Chelsea House, 2004.
Ehrenreich, Barbara. “Feminism’s Phantoms.” The New Republic 194, no. 11 (March 17, 1986): 33-35.
Howells, Coral Ann, ed. The Cambridge Companion to Margaret Atwood. New York: Cambridge University Press, 2006.
Mohr, Dunja M. Worlds Apart: Dualism and Transgression in Contemporary Female Dystopias. Jefferson, N.C.: McFarland, 2005.
Troy, Gil. Morning in America: How Ronald Reagan Invented the 1980’s. Princeton, N.J.: Princeton University Press, 2005.
Joanne McCarthy
See also Abortion; Air pollution; Biopesticides; Conservatism in U.S. politics; Falwell, Jerry; Feminism; Moral Majority; Religion and spirituality in the United States; Robertson, Pat; Women’s rights.
■ Hands Across America
The Event A benefit to fight hunger and homelessness
Date May 25, 1986
Place From Long Beach, California, to New York City

During the 1980’s, many major charity events were held to address social problems around the world. Hands Across America was one such event, and it raised awareness among the American public about two important social issues.

Hands Across America was a charity event staged on May 25, 1986, in which millions of people held hands to form a human chain across the United States, from Long Beach, California, to Battery Park in Manhattan. The project was organized by Ken Kragen, a manager of musicians. Kragen had also been instrumental in organizing the USA for Africa project, which featured many famous musicians singing the song “We Are the World,” recorded to raise money for famine relief in Africa.
Unlike Kragen’s previous venture, Hands Across America was a domestic project, focused on fighting hunger and homelessness in the United States. Participants in the event were asked to donate ten dollars to the cause, and prior to the event, organizers estimated that it would take six million people to complete the chain, bringing the total expected contributions to $60 million. In addition, Citibank and the Coca-Cola Company donated a combined $8 million to the effort.
On the afternoon of the event, at exactly 3:00 p.m. eastern standard time, all participants sang the songs “America the Beautiful” and “Hands Across America,” the latter written specifically for the occasion. Participating at various points along the chain were numerous high-profile celebrities, including Liza Minnelli and Gregory Hines in New York City, President Ronald Reagan at the White House in Washington, D.C., Kathleen Turner in St. Louis, Kenny Rogers in Amarillo, Texas, and Richard Dreyfuss in Santa Monica, California.
Members of the Hands Across America chain link hands in Santa Monica, California, on May 25, 1986. (AP/Wide World Photos)
Following the event, despite the fact that millions of Americans participated and a large amount of publicity surrounded the charitable function (including promotional ads on McDonald’s placemats and commercials during the Super Bowl), Hands Across America was deemed by many to be a failure. Large gaps occurred in several places along the route, sometimes spanning hundreds of miles, so the chain had to be completed with ribbons and ropes. In addition, while the event raised nearly $50 million, it was so large and required so much organizational effort that it necessitated a staff of four hundred and cost nearly $17 million to put together. In the end, event organizers were able to donate only $20 million to soup kitchens, food pantries, and other organizations to help the hungry and homeless across the country.

Impact Even though Hands Across America raised far less money than organizers had hoped, the event still served to raise public awareness of two important social issues during the 1980’s: hunger and homelessness. These issues were sometimes dismissed during the decade, as general U.S. prosperity led to denigration rather than sympathy for impoverished Americans. The rising public awareness that extreme poverty was a serious social problem in the United States and that it was not caused by laziness coincided with an increase in charitable donations by the American public.

Further Reading
Beck, Melinda, et al. “A New Spirit of Giving.” Newsweek, June 2, 1986, 18.
Hands Across America, May 25, 1986. New York: Pocket Books, 1986.
Lindsay Schmitz
See also Business and the economy in the United States; Farm Aid; Homelessness; Live Aid; USA for Africa.
■ Hannah, Daryl
Identification American actor
Born December 3, 1960; Chicago, Illinois
Hannah overcame personal inhibitions and won recognition in her film roles to become a Hollywood icon of the 1980’s.
Daryl Hannah was a shy and sickly child, so her mother, Susan, signed her up for dance lessons in hopes of strengthening her legs. The tall, blond, and willowy beauty was a talented ballet student, but she chose to practice other forms of dance, as well as soccer and track, at the Francis W. Parker School in Chicago. In 1972, at the age of twelve, she debuted on stage in a production of Who’s Afraid of Virginia Woolf? (pr. 1962). As a child with insomnia, Hannah began watching movies late at night, developing an interest that led her to enroll in acting classes at the Goodman Theatre. In 1978, while still in high school, Hannah played football and made her first movie appearance in a small role in the thriller The Fury.
After relocating to Los Angeles, Hannah met Chuck Binder, who became her manager, and singer-songwriter Jackson Browne, who became her boyfriend. From 1981 to 1984, she landed seven roles, including the part of the android Pris in Blade Runner (1982), and participated in a Browne MTV promotional video. In March, 1984, she appeared as Madison, a beautiful mermaid, in the Disney movie Splash. Cast with Tom Hanks and John Candy in this fairy tale, Hannah exuded the ideal mixture of splendor, sexuality, and innocence. Although not comfortable with nudity, Hannah made the perfect mermaid; since she knew how to scuba dive, she performed the underwater scenes herself. With this lead role, Hannah became a star, and Disney earned needed funds for its coffers.
Throughout the 1980’s, Hannah acted in a variety of roles and worked opposite many leading actors. Among her characters were Diane, an actress and aerobics instructor, in The Pope of Greenwich Village (1984); Ayla, a Cro-Magnon woman, in The Clan of the Cave Bear (1986); Mary, a two-hundred-year-old ghost, in High Spirits (1988); and Annelle, a beauty-salon worker, in Steel Magnolias (1989). Her costars included Robert Redford in Legal Eagles (1986), Steve Martin in Roxanne (1987), Michael Douglas in Wall Street (1987), and Julia Roberts, Sally Field, Olympia Dukakis, Dolly Parton, and Shirley MacLaine in Steel Magnolias. Although her films varied widely in popularity, Daryl Hannah remained a strong attraction in movie genres that included drama, romantic comedy, science fiction, and thrillers.

Impact Daryl Hannah’s role as Madison, a sexy yet sweetly innocent girl with a fish tail, launched her career and widened Disney’s audience.
Splash represented an endeavor by Disney’s Touchstone Studios to attract teens and young adults to its films. People flocked to the production, which grossed over $69 million and made Hannah an American idol.

Further Reading
“Hannah, Daryl.” In Current Biography Yearbook 1990, edited by Charles Moritz. New York: H. W. Wilson, 1991.
Martindale, David. “Daryl Hannah: Still Making a Splash.” Biography 6, no. 1 (January, 2002): 66-72.
Prince, Stephen. A New Pot of Gold: Hollywood Under the Electronic Rainbow, 1980-1989. Berkeley: University of California Press, 2002.
Cynthia J. W. Svoboda
See also Auel, Jean M.; Blade Runner; Film in the United States.
■ Harp seal hunting
Definition Traditional Canadian hunting activity

Animal-rights activists increased their opposition to the Canadian practice of hunting harp seals in the early 1980’s. Their protests and acts of civil disobedience brought international attention to the issue.

In the northwest Atlantic, the Canadian harp seal hunt is an annual spring ritual that has long been a factor in the maritime economy, which benefits from trade in the animals’ valuable oils and pelts. The young pups are usually killed by clubbing. Led by longtime activist Paul Watson, the environmental activist group Greenpeace started to send protesters against this commercial hunt in 1976, and in 1977 Watson garnered media attention by bringing along film star Brigitte Bardot, as well as by getting physically beaten by a group of the sealers as he peacefully tried to stop one of their ships. In 1979, Watson and others sprayed harmless red dye on over one thousand harp seal pups to make their pelts unmarketable. For this offense, the Canadian government imprisoned Watson briefly in 1980 and forbade him from entering eastern Canada, but he defied this parole order and returned to the ice floes off Canada in 1981, this time using blue dye to disrupt the hunt.
The conflict escalated in 1983, as Watson’s boat, Sea Shepherd II, effectively blocked the harbor of St. John’s, Newfoundland, delaying the hunt and sharply lowering that year’s take. This action was followed by pitched battles with the Canadian Coast Guard and sealers off Nova Scotia. Watson and his crew were caught and sent to prison. Media footage of sealers clubbing pup after pup to death garnered widespread sympathy for the protesters around the world, and in 1984 the European Parliament banned the import of Canadian baby harp seal pelts. This ban collapsed the market for the pelts and led the Canadian government to ban vessel-based seal hunting, although it still allowed more limited, land-based hunting. In 1984, the Quebec Court of Appeals overturned the conviction of Watson, and the Canadian Supreme Court upheld that ruling in 1985.

Impact The harp seal hunt abated in the later 1980’s, as the market for seal pelts shrank dramatically, but the practice was never completely abandoned. In later years, the hunt was resumed, as new markets for seal products opened in the 1990’s. As the hunt increased, so too did protests and actions designed to prevent it.

Further Reading
Nadeau, Chantal. Fur Nation. New York: Routledge, 2001.
Watson, Paul. Seal Wars: Twenty-Five Years on the Front Lines with the Harp Seals. Richmond Hill, Ont.: Firefly Books, 2003.
Scot M. Guenter
See also Aboriginal rights in Canada; Business and the economy in Canada; Environmental movement; Europe and North America; Fashions and clothing.
■ Hart, Gary Identification
U.S. senator from 1975 to 1987 and a presidential candidate in 1984 and 1988 Born November 28, 1936; Ottawa, Kansas Senator Gary Hart of Colorado pursued the Democratic Party’s presidential nomination twice in the 1980’s. His second effort ended in scandal, when, in 1987, he was photographed with a young woman sitting on his lap on a ship named Monkey Business. Gary Hart graduated in 1958 from Bethany Nazarene College. He earned a divinity degree from Yale
He earned a divinity degree from Yale three years later. He received a Yale University law degree in 1964 and was admitted to the Colorado and District of Columbia bars the following year. In 1972, Hart managed Senator George McGovern’s anti-Vietnam War presidential campaign. While that effort was unsuccessful, Hart won a Senate seat from Colorado two years later and became a vocal spokesperson for the reform wing of his party.
From 1975 to 1987, Hart built a record in the Senate as a thoughtful advocate of change and substantive reform of the military, the economy, and national security. In 1984, Hart ran as a Democratic Party alternative to former vice president Walter Mondale. Hart stressed the need for his party’s presidential nominee to focus on the future, not the past. In response to Hart’s insistence that the nation needed new ideas, Mondale implied that this was an empty phrase: Quoting a popular fast-food television commercial, Mondale asked “Where’s the beef?”—a phrase of the 1980’s with which Hart found himself associated, to his detriment. While Hart was unsuccessful in wresting the nomination from Mondale, however, his 1984 run laid the foundation for a stronger campaign four years later in 1988. That effort, however, was wrecked by a 1987 extramarital encounter with a young woman named Donna Rice on a ship aptly named Monkey Business.
The affair came to light after Hart issued a challenge to journalists. The New York Times had confronted Hart about rumors of extramarital affairs, and the senator in response had invited journalists to “Follow me around,” presumably to establish his innocence. When the press accepted the invitation, however, Hart unwisely continued his activities. As a result, the Miami Herald was able to secure a photograph of Rice perched on the smiling Hart’s lap on the Monkey Business. Hart initially withdrew from the race, but he then reentered it, saying “Let the people decide.” His momentum could not be regained, however, and he withdrew a second time, ceding the party’s presidential nomination to Massachusetts governor Michael Dukakis, who lost the general election to Republican vice president George H. W. Bush.

A scandal-ridden Gary Hart announces his withdrawal from the presidential primary elections on May 8, 1987. (AP/Wide World Photos)

Impact United States senator Gary Hart’s efforts to focus the national debate on the need for change and reassessment in the post-Vietnam War world were destroyed by the scandal that wrecked his presidential aspirations.

Further Reading
Dionne, E. J., Jr. “Paper and Hart in Dispute Over Articles.” The New York Times, May 4, 1987, p. A16.
Drew, Elizabeth. Election Journal: Political Events of 1987-1988. New York: William Morrow, 1989.
Hart, Gary. The Courage of Our Convictions: A Manifesto for Democrats. New York: Times Books, 2006.
Toner, Robin. “Hart Stresses Ideals, Formally Enters the 1988 Race.” The New York Times, April 14, 1987, p. A16.
Joseph Edward Lee
See also Advertising; Bush, George H. W.; Congress, U.S.; Dukakis, Michael; Elections in the United States, 1984; Elections in the United States, 1988; Mondale, Walter; Scandals.
■ Hawkins, Yusef
Identification African American teenager and murder victim
Born 1973; New York, New York
Died August 23, 1989; New York, New York

Hawkins was shot to death by a member of a gang of whites in the Bensonhurst section of Brooklyn, New York, in a racial incident that shocked the nation.

On Wednesday, August 23, 1989, Yusef Hawkins (also known as Yusuf) and three African American friends traveled by subway to Bensonhurst to look at a used Pontiac automobile they had seen advertised for sale. The Bensonhurst section of Brooklyn, inhabited mostly by blue-collar people of Sicilian heritage, had few African American residents. It also had a history of racial violence, including attacks by white mobs upon African American men in 1983 and 1987. None of the injuries in the earlier attacks had been fatal.
Unknown to Hawkins and his friends, Bensonhurst youths had heard a rumor that an African American teenager had been invited to a local girl’s birthday party that night. Gina Feliciano had been going out with one of the Bensonhurst youths, Keith Mondello, but had since started dating an African American man. Neighborhood men were planning to harm this man if he arrived at Feliciano’s party. These men witnessed Hawkins and his friends arrive at about 9:00 p.m. on Bay Ridge Avenue. Witnesses reported that the men shouted, “Let’s club the niggers.” A mob of ten to thirty people armed with baseball bats and several guns then chased the four young men. Hawkins told the crowd that he knew nothing about the girl, but, in response, one man shot Hawkins twice in the chest with a semiautomatic .32-caliber pistol.
Al Sharpton, center, leads a march protesting the murder of Yusef Hawkins in 1989. (Christian Razukas/cc-by-sa-2.0)
Of the other three African Americans in the group, Troy Banner was slightly wounded, apparently by a gunshot, while Luther Sylvester and Claude Stanford escaped without injury. Hawkins died shortly afterward.
Police quickly arrested Mondello, Joseph Fama, and Steven Curreri, all eighteen years of age; Charles Stressler, twenty-one years old; Pasquale Raucci, nineteen; and James Patino, twenty-four. All of the men lived in Bensonhurst. The suspects were charged with a variety of crimes, including first-degree felony assault, first-degree rioting, aggravated harassment, and two misdemeanor counts of menacing the victims and violating their civil rights. In addition, Mondello and Curreri were charged with conspiracy as a result of their previous discussions about attacking African Americans. All but Patino were charged with criminal possession of weapons—two-by-fours and golf clubs. The most serious of the charges was the felony assault charge, which carried a maximum penalty of fifteen years’ imprisonment.
In May, 1990, a jury convicted Fama of being the triggerman. He was sentenced to thirty-two years to life in prison. Two others were convicted of felony charges, two more were convicted of misdemeanor counts, and three were acquitted, including Mondello. Federal civil rights charges were not brought against Hawkins’s attackers, because New York State courts convicted five of the defendants. The Justice Department had a dual-prosecution policy limiting the circumstances under which a federal prosecution could be brought following a state trial for the same act.

Impact The killing of Yusef Hawkins represented the latest outbreak of racial violence in a city already tense from a spate of similar crimes during the decade. The murder caused particular outrage in the African American community, with protesters parading through Bensonhurst the weekend after the attack. Mayor Ed Koch, running behind African American challenger David Dinkins in the mayoral race, criticized the city’s African Americans for inflaming tensions. He did not condemn the whites of Bensonhurst for chanting racist slogans at the marchers. Dinkins later won the election, becoming New York City’s first African American mayor.

Further Reading
DeSantis, John. For the Color of His Skin: The Murder of Yusuf Hawkins and the Trial of Bensonhurst. New York: Pharos, 1991.
Caryn E. Neumann
See also African Americans; Brawley, Tawana; Central Park jogger case; Crime; Gangs; Goetz, Bernhard; Howard Beach incident; Racial discrimination.
■ Health care in Canada
Definition Delivery of medical services to the Canadian public
Canada consolidated its public health system in the 1980’s. Canadians expressed basic satisfaction with their health care, preferring their system to the U.S. private-enterprise-based health system.

In the 1980’s, Canadian governments financed and regulated a health care system operated by independent physicians and hospitals. Each of Canada’s ten provinces and two territories managed its own system, which was partially funded by the federal government and subject to principles set by the central authority. The federal government provided health care for the armed forces and their veterans, the Royal Canadian Mounted Police, inmates of federal prisons, and First Nations peoples living on reservations.

Public Health Care in Canada
The Canadian national health care system began to develop in 1947, when Saskatchewan established a hospital insurance plan controlled by the provincial government. Residents paid annual premiums entitling them to have all basic hospital bills paid. Imitated by other provinces, the practice proved so successful and popular that the federal government in 1957 undertook to pay half the costs. By 1961, all provinces operated plans that collectively covered 99 percent of the population. In 1962, Saskatchewan introduced insurance covering doctors’ fees, leading to the 1966 federal Medical Care Insurance Act; by 1971, every province had a similar plan. In the 1980’s, two provinces still collected premiums, but by and large, general tax revenue funded the system.
Provincial plans universally covered all medically necessary hospital and physician services but differed on other costs. In 1984, only Saskatchewan and Manitoba had universal drug insurance, but all provinces covered those on social assistance, and all except Prince Edward Island provided prescription benefits for persons over sixty-five. Provisions varied regarding payment for ambulances, cosmetic surgery, eyeglasses, hearing aids, and dentistry outside hospitals.
Personal expenditures for drugs, dental work, and other services amounted to one-quarter of health costs, partially paid by personal or employer-provided private-enterprise insurance.

The 1984 Act
In 1984, the federal government consolidated previous health legislation in the Canada Health Act, setting out five principles that provinces had to observe. Each plan had to (1) be run by a public authority accountable to the provincial government, (2) cover all necessary physician and hospital services, (3) be open to all residents, (4) be portable, covering residents wherever they needed services, and (5) be accessible regardless of income or ability to pay. In support of the fifth principle, the 1984 act threatened to reduce federal grants dollar for dollar if provinces permitted extra billing by doctors or hospitals beyond approved insurance payments. On June 12, 1986, the Ontario Medical Association reacted to legislation banning extra billing with a strike, but negative public reaction and dissent from within the association’s own ranks caused the association to abandon the job action on July 6, after twenty-five days.
Criticisms
Physician associations opposed government insurance from the beginning, staging a bitter strike in Saskatchewan when the province started its 1962 insurance plan and a brief work stoppage in Quebec in 1970. Doctors resented having their income limited by government fee schedules. Although fees were negotiated by provincial medical associations on behalf of doctors, Canadian physicians’ incomes regularly ran about two-thirds those of American physicians.
Some patients loudly protested excessive waiting times for elective surgery. The longest waits were for orthopedic or eye surgery, averaging nearly six months—in Manitoba, the average wait for a hip replacement was over a year. Emergency heart surgery was rarely delayed, but waiting lists for elective coronary artery bypass surgery stretched to nearly a year in the late 1980’s. Public anger led the provinces to devote more resources to the procedure, markedly reducing wait times. Pressured to hold down costs, Canadian hospitals purchased few technologically advanced medical machines. When magnetic resonance imaging (MRI) machines became available, wait times to access them greatly exceeded wait times in the United States.
Economists criticized lengthy wait times as a form of nonmonetary rationing and insisted that private enterprise could more efficiently ration limited resources by price. Proponents of Canada’s system were more impressed with studies examining the use of open heart surgery in the United States and Canada: In the United States, people from low-income neighborhoods benefited from such procedures much less often than those from high-income areas, but no such disparity existed in Canada.

Impact By the 1980’s, Canadians had constructed a health care system that, despite some criticisms by economists and popular discontent with its unsatisfactory aspects, won the overwhelming support of the majority of its citizens. The plans were popular with employers, who liked having government provide health care for their employees. Defenders of the system recalled earlier times, when it was not unusual for a private insurance company to cancel policies if an insured’s health worsened, and boasted that such cancellations were no longer possible. Critics correctly predicted costs would rise if medical services were freely available to users. Whereas total health expenditures represented 6.1 percent of Canada’s gross domestic product (GDP) in 1970, by 1987 they reached 8.6 percent. However, U.S. expenditures went from 6.0 percent to 11.2 percent of that nation’s GDP in the same period. Canadians were also pleased to learn that their system produced slightly better results in terms of life expectancy than did private enterprise in the United States. Canadian life expectancies in the mid-1980’s were 73.04 years for men and 79.73 years for women; in the United States, they were 71.5 years for men and 78.4 years for women. A health care system that guaranteed universal access to medical services had the proud support of the overwhelming majority of Canadian citizens.

Further Reading
Bennett, Arnold, and Orvill Adams. Looking North for Health: What We Can Learn from Canada’s Health Care System. San Francisco: Jossey-Bass, 1993. Writing in clear, simple language, the authors praise the Canadian system.
Chernomas, Robert, and Ardeshir Sepehri, eds. How to Choose? A Comparison of the U.S. and Canadian Health Care Systems. Amityville, N.Y.: Baywood, 1998. Collection of heavily statistical essays examining comparative costs, access, and satisfaction in Canada and the United States.
Crichton, Anne, David Hsu, and Stella Tsang. Canada’s Health Care System: Its Funding and Organization. Rev. ed. Ottawa, Ont.: CHA Press, 1994. Evaluates the organization and evolution of the Canadian health care system.
McFarland, Lawrie, and Carlos Prado. The Best-Laid Plans: Health Care’s Problems and Prospects. Montreal: McGill-Queen’s University Press, 2002. Critical analysis of past and present plans to reform Canada’s health care system.
Sutherland, Ralph W., and M. Jane Fulton. Health Care in Canada: A Description and Analysis of Canadian Health Services. Ottawa, Ont.: Health Group, 1988. Detailed description of Canada’s health system in the 1980’s.
Milton Berman
See also Business and the economy in Canada; Business and the economy in the United States; Canada Health Act of 1984; Health care in the United States.
■ Health care in the United States
Definition Delivery of medical services to the U.S. public
The Eighties in America
early twentieth century, when the so-called epidemiological transition or the impact and prevalence of communicable diseases (measles, mumps, smallpox, syphilis, yellow fever, malaria, and polio) began to be surpassed by the burden and high mortality rates from chronic diseases (stroke, cancer, cardiovascular disease, and diabetes). Further milestones occurred during the 1940’s and 1950’s, spurred by the needs of wounded American soldiers during World War II (1939-1945); the establishment of medical boards to certify and employ physicians and advance the code of ethics in medicine, including the American Board of Pediatrics (1933), the American Board of Internal Medicine (1936), the American Board of Surgery (1937), and the American Board of Neurological Surgery (1940); and the creation of the Centers for Disease Control and Prevention (CDC) during the 1950’s. The current state of health care in the United States is a result of the medical revolution of the 1980’s, with a vast number of physicians, nurses, hospitals, nursing homes, and health maintenance organizations (HMOs); sophisticated and available drugs, antibiotics, and hormone therapy; scanning and diagnostic equipment such as magnetic resonance imaging (MRI); and advanced procedures including kidney dialysis, mechanical ventilation, open heart surgery, organ transplant surgery, and hip and knee replacement. Age-adjusted death rates were reduced from 308 per 100,000 in 1950 to 184 per 100,000 in 1984. With improvements in health care delivery, however, came high health care costs, rising from $40 billion in 1965 to $250 billion in 1980 and almost $700 billion in 1990, representing 6 percent, 9 percent, and 12 percent of the U.S. gross domestic product (GDP), respectively. The expenditures would rise to more than $1 trillion by 2000, accounting for about 14 percent of the country’s GDP. Thus, it is no wonder that, during the 1980’s, 60 percent of Americans grew alarmed by the high rise in health care costs, while 10 percent were dissatisfied with the quality of care, preventing many from switching jobs in fear of losing their expensive health benefits. At least onethird of the American population were left at the periphery of the most successful medical advances of modern times. Organization The health care system in the United States can be characterized as complex and diffuse
The Eighties in America
Organization
The health care system in the United States can be characterized as complex and diffuse in that it is embedded in the federal government, the states, and local government. On the federal level, the Department of Health and Human Services is supported by federal agencies that deal with one or another aspect of citizens’ health, such as the Veterans Administration, the Department of Labor’s Occupational Safety and Health Administration (OSHA), the Bureau of Mines in the Department of the Interior, and the Department of Agriculture, which protects farmers and animals from harm in the field. The federal government essentially provides grants and loans to state and local governments and to universities, and it issues guidelines on health, such as the code of ethics for physicians. Thus, states and local governments are the main implementers of health care laws in the United States, administering Medicare to senior citizens and Medicaid to the poor through their health and social work departments and local boards.
Parallel to the federal, state, and local institutions and agencies exist voluntary organizations such as the American Cancer Society, the American Red Cross, the American Medical Association, and the American Public Health Association. All of these organizations promote health awareness and recommend policies. Private businesses also perform various health functions. These include physicians’ clinics and industrial plants with their own health care facilities designed to protect their workers (such as in the mining, railroad, and lumber industries). Since the 1980’s, the health care system has been overwhelmed by for-profit, private market-driven corporations, powerful pharmaceutical companies, health insurance companies, and a wide spectrum of hospital ownership and management. Administrators of programs for the poor, such as Medicaid, and an increased number of physicians’ and nurses’ groups have become directly involved in lucrative health care.

The Health Revolution of the 1980’s
During the 1980’s, the health care system experienced an expansion determined mostly by profit-minded corporations, government efforts to cut the cost of health care, and a capitalist philosophy that the private sector was better suited than the government to provide advanced, effective, and financially cheap health care. The move forced many states to experiment with several new health plans that relegated health care to private or proprietary companies and agencies.
This led to the emergence of powerful HMOs, which had been authorized by Congress through the Health Maintenance Organization Act of 1973. In the 1980’s, HMOs had a membership of 28 million Americans.
The number of medical colleges for the training of physicians rose to 127, or 1 per 1,920,000 people (or to 142, if osteopathic schools are included). Medical students completed their training in four years following a baccalaureate degree and were required to complete a paid residency in a hospital, with specialization in one of the health fields. By 1985, 85 percent of new physicians had specialties. In 1986, there were 218 physicians per 100,000 population, compared to 150 per 100,000 during 1900 to 1910. Also, the number of nurses, many of whom had to complete at least two years of academic training, skyrocketed to 666 per 100,000 among affluent Americans. During the decade, women and minorities began to enroll in medical schools in relatively large numbers: women made up only 5 percent of the medical field in 1950, but by 1983 this figure had risen to 33 percent.
The number of drugstores had reached 51,000 in 1980, estimated as one per 4,700 Americans, with many experts claiming that the number was excessive. The number of pharmacists stood at 55 per 100,000 in 1985. By 1986, the number of dentists, a fast-growing health care group working with for-profit organizations, reached the ratio of 57 per 100,000.
In 1981-1982, there were 6 hospital beds for every 1,000 population, with hospital admissions of 158.5 per 1,000 on the average and an average hospital stay of eight days. Eventually, the number of hospital beds reached 6,500 as a result of demographic growth and aggressive drug advertising by pharmaceutical companies and physicians. During that decade, among the 75 percent of hospital beds privately owned, 15 percent were administered by not-for-profit religious or nonsectarian organizations and 10 percent were proprietary or for-profit. The government virtually gave up control of nursing homes, convinced that relegating them to the private sector would result in substantial cost reductions and improved overall management and accountability. In 1980, moreover, the number of government-run hospitals accounted for only 4 percent of the total, while 15 percent were under religious and not-for-profit organizations, leaving 81 percent in the hands of proprietors, the smallest ones having an average of 68 percent of the total beds in the country.
Thus, in 1986, there were 16,000 nursing homes, 90 percent of which had 100 beds on the average. Yet 56 percent of the nursing homes received government subsidies, in contrast to only 20 percent in 1960, a rise spurred by the baby-boom generation.
During the 1980’s, the federal government provided 56 percent of the health budget, the remaining 44 percent coming from state and local governments. Beginning in 1982, the government imposed strict limits on Medicare reimbursements to hospitals. It also insisted on using diagnosis-related groups (DRGs) with minimal governmental interference as the best way to cut costs. Through 1989, however, health care costs and doctors’ fees continued to rise, despite a law mandating that charges be based on the “customary, prevailing, and reasonable” practices of the health care system. (Physicians, pharmaceutical companies, health facilities, and insurance companies, seen as part of the market, determined what was “customary, prevailing, and reasonable.”) At the same time, physicians and nurses primarily determined all referral pathways and were encouraged to channel patients to the supposedly most cost-effective care available—making doctors, in fact, case managers. In addition, health care malpractice suits increased, forcing insurance companies to increase their premiums and thus making it difficult for people to afford health insurance. The situation was exacerbated by an oversupply of 25,000 doctors, 50 percent above the commonly agreed need during the late 1970’s and the 1980’s.
In 1986, 2 million poor people and the severely disabled relied on Medicaid. The number rose to 31 million soon thereafter. Concomitantly, in 1984, there were 15,000 clinics, but only between 15 and 20 percent of them were government-run. Health centers had increased tremendously since the 1920’s, but only 1,000 of them received special subsidies from the government. Unlike in other developed countries, public hospitals in the United States are primarily designed to serve the poor; the rich go to private clinics or to specialized health facilities owned by groups of physicians or by private corporations. Since the 1980’s, the pharmaceutical companies have become the major sponsors of drug research, along with some universities that have a medical school or a school of public health.
Pharmaceutical companies are often the source of the research dollars that go into the manufacture of expensive brand-name drugs, with 25 percent of resources dedicated to advertisement. This situation accounts for the fact that, during the 1980’s (and thereafter), most patients’ expenditures went into drug purchases. During that decade, twenty corporations controlled the whole drug manufacturing industry.

Impact The 1980’s were a time of major changes in the health care system. In 1940, only 9 percent of Americans were medically protected, but, at the dawn of the 1980’s, close to 80 percent were protected through insurance, mainly by Blue Cross/Blue Shield. By 1986, however, the number of uninsured had grown to 37 million, and states began to implement health care policies that increased the burden on patients. In the name of free market forces, deregulation, and a social and political environment that espoused the philosophy that medicine should not be government business, the United States witnessed the commercialization of the health care system, in which drug prices and insurance premiums were solely determined by the pharmaceutical complex and by physicians and lobbyists in Washington, D.C. For some analysts, the introduction of HMOs resulted in misery for the sick public, especially the poor and lower middle class, and in unscrupulous profiteering. The outcry against HMOs grew so loud that, by 1987, less than 15 percent of the population was covered by the new health plans.

Further Reading
Fuchs, Victor R. The Health Economy. Cambridge, Mass.: Harvard University Press, 1986. One of the best sources illustrating how health care systems are affected by economic systems and policies. Issues of deregulation, cost effectiveness, competitiveness, and the powerlessness of the population are highlighted.
Ludmerer, Kenneth M. Time to Heal: American Medical Education from the Turn of the Century to the Era of Managed Care. New York: Oxford University Press, 1999. A historical account of the evolution of medical education and the impact of managed care entrusted to market forces.
Maillet, Pete, and Steve Halterman. “The Consumer-Driven Approach: Defining and Measuring Success.” Benefits Quarterly, 2004, 7-15. Highlights the impact of consumerism in health care, which most often does not benefit the consumers but the private health care sector.
Roemer, Milton I. The Countries. Vol. 1 in National Health Systems of the World. Oxford, England: Oxford University Press, 1991. One of the best available comparative approaches to world health systems.
U.S. Bureau of the Census. Statistical Abstract of the United States. 112th ed. Washington, D.C.: Government Printing Office, 1992. Official statistics from the federal government.
Mario J. Azevedo
See also Alternative medicine; Baby Fae heart transplantation; Fetal medicine; Health care in Canada; Health maintenance organizations (HMOs); Medicine; Plastic surgery; Transplantation.
■ Health maintenance organizations (HMOs)
Definition Organizations that coordinate and provide group prepaid medical services
HMOs represented an attempt by the U.S. government to use free-market, private-enterprise principles to control skyrocketing health care costs. During the 1980’s, this experiment failed, as costs kept skyrocketing, largely because HMOs needed to realize profits, and they focused more attention on doing so than on providing accessible quality health care to their patients.

Health maintenance organizations (HMOs) were officially introduced by a 1973 act of Congress and emerged during the late 1970’s, making their greatest impact in the following decade. The resulting prepaid health plans sought to replace the predominant model of fee for service, in which patients paid their health care providers for the services they received. Prior to the 1973 legislation, there were a few HMOs in the country, such as Kaiser Permanente, a nonprofit organization founded in the 1930’s. Up until the 1970’s, however, they had virtually no impact on the U.S. health care system, and they struggled constantly against opposition by organized medical groups, public skepticism, and court challenges.

Competition and the For-Profit Motivation
The rise of HMOs during the 1970’s and 1980’s was prompted by soaring costs related to health care.
Experts and politicians speculated that HMOs would provide the government with a means to contain the upward spiral of medical costs while still maintaining a competitive, liberalized, market-driven system. A business-imposed system, managed care is a “variety of reimbursement plans in which third party payers [the government, for example] attempt to control costs by limiting the utilization of medical services,” rather than through fee-for-service payments.
Starting from the premise that competition and deregulation by government lower the cost of doing business, the proponents of HMOs argued that the cost of medical care would fall dramatically once they were implemented. They asserted that doctors and hospitals for the first time would be compelled to think about cost when determining fees; that patients would have the rights of consumers; and that health care would benefit from modern information technologies and business practices that could limit costs. Moreover, the unnecessary use of other, more expensive medical technologies, such as magnetic resonance imaging (MRI), would be limited, and HMOs would force doctors to move expensive treatments and procedures from hospitals to private practices or ambulatory clinics, thereby lowering the burden on health care insurers and reducing premiums.
The general goals of HMOs were certainly noble. It was alleged that they not only would be less costly but also would provide a higher quality of health care and make it more affordable to millions of Americans. However, by the early 1980’s, it became clear that these goals could not be attained simultaneously as long as profit was the driving motive behind the system. In general, HMO owners began devoting about 5 percent of their investors’ capital to health care while pocketing and sharing with stockholders the remaining 95 percent. It was claimed that, as high administrative costs and unnecessarily costly tests were reduced or eliminated, more funds would be freed to develop new drugs and new therapies. Unfortunately, that is not what happened. The cost of health care continued to rise. Furthermore, doctors who cooperated with the HMOs were viewed with suspicion by the public as “double agents,” as the new plans precluded patients from choosing their preferred doctor or hospital. HMOs also restricted the length of patients’ hospital stays and required pre-authorization to check in at a health care facility. In addition, physicians could prescribe drugs only from a pre-approved list or “formulary.”
Constraints and Failures of the HMO Industry
It also became clear that the HMOs were failing to register the poor, the elderly, and inner-city populations, shattering the federal and state governments’ hopes that the private sector would relieve them of the burden of providing care to at-risk and underserved Americans. Moreover, employers were at times exerting pressure on their employees to enroll in an HMO rather than a more flexible, but more expensive, traditional health insurance plan. All these shortcomings undermined the prediction that, by the end of the 1980’s, 90 percent of Americans would be covered by the emerging for-profit HMO plans.
In small towns, where doctors worked alone, both patients and physicians at first resisted enrolling in the new plans, until the HMOs allowed people to choose their physicians from a small pool, making it possible for doctors to dedicate a portion of their practice to what were called individual practice associations (IPAs). Because of this arrangement with physicians, IPAs increased their number by 87 percent, from 97 to 181, between 1980 and 1985. During the same period, the number of prevailing HMOs increased by only 17 percent, from 132 to 162. Other plans, called preferred provider organizations (PPOs), brought groups of doctors and hospitals together to serve certain public and private beneficiaries at competitively low prices. This variation on HMOs allowed patients to patronize other doctors if they were willing to pay higher fees to do so. In addition, diagnosis-related groups (DRGs) entailed “prospective” (advance) payments rather than “retrospective charges for each unit of services” being made to hospitals for each patient covered under Medicare. In some states, DRGs encompassed all insurance payers.
The impending failure of the HMOs was reflected in the continuing rise of health care costs. For example, in constant 1982 dollars, U.S. spending on health care rose from $333.47 billion in 1980 to $385.74 billion in 1987, and it climbed another 16 percent in 1988. The increase in spending was spurred by administrative overheads, which rose by 186 percent, from $9.2 billion to $24 billion, during the 1980-1986 period. The consequent losses affected the running of the HMOs. Indeed, in 1987 alone, the overall HMO industry lost $692 million. Some 179 of the 243 HMOs (almost 75 percent) registered a loss that year, according to the National Underwriter.
Unhealthy competition among the HMOs resulted in some falling by the wayside, while others were swallowed by the largest. As the quality of services became imperiled and the public complained that enrollees' access to treatment and medication had rapidly eroded, the situation was exacerbated by the fact that several HMOs were going bankrupt or were on the brink of bankruptcy, including the largest HMO, Maxicare. Maxicare collapsed and declared bankruptcy in 1989, having lost $225 million of its $1.8 billion in assets. Hospitals thus began to resist their associations with HMOs, fearing that they would be left with the costly burden of uncollected fees for services rendered. A loss of trust in physicians, medical schools, and medical education ensued, and the growing frequency of malpractice suits shook the very foundations of the entire U.S. health care system as the end of the 1980's approached. HMOs were also criticized for a lack of transparency in providing purchasers, especially those linked with big institutions, with data on the "cost, use, and quality" ratings of their health care plans. The problem became so great that in 1988, Congress amended the 1973 act to ban employers from forcing their employees to enroll in a specific HMO. The end result of the decade's struggle between HMOs and the public was the organizations' unpopularity and near demise, which resulted in states entrusting their health care plans to such insurance companies as Blue Cross/Blue Shield.
Impact The wholesale introduction of HMOs during the 1970's and 1980's failed because it focused primarily on cost cutting and profiteering rather than on making quality health care affordable and accessible, thereby neglecting those Americans who could not afford it. The new system continued to favor the wealthy, who demanded high-cost services, such as complex tests, diagnoses, and tertiary treatments that required expensive equipment such as MRIs. As most experts note, the rising cost of health care was fueled in the 1980's by administrative and bureaucratic practices that unnecessarily siphoned billions of dollars away from medical workers and services—as well as by an excessive number of physicians aggressively competing against one another, since the market was glutted by an excess of twenty-five thousand physicians, or 25 percent more than were needed. Rising costs were also attributable to new, expensive testing devices and treatments, such as transplants and dialysis machines. They were exacerbated, moreover, by the downturn of the American economy, an inflation spiral, the aging of the baby-boom generation—which was living longer and required an unprecedented level of care and medical services—and the perennial problem of people refusing to change their unhealthy behavior. Further Reading
Anderson, O., T. Herold, and B. Butler. HMO Development: Patterns and Approaches. Chicago: Pluribus Press, 1985. Required reading for those interested in the history of health care and its problems during the 1970’s and 1980’s. Harris, Louis, et al. American Attitudes Toward Health Maintenance Organizations. New York: Louis Harris, 1980. Interesting overview of the pros and cons of HMOs from their inception. Rodwin, Marc A. Medicine, Money, and Morals: Physicians’ Conflicts of Interest. New York: Oxford University Press, 1993. Indictment of the medical profession, including doctors’ ambivalent attitudes toward HMOs’ profit-driven philosophy. Roemer, Milton I. The Countries. Vol. 1 in National Health Systems of the World. Oxford, England: Oxford University Press, 1991. Comparative analysis of the global health care system and a useful source for assessing the U.S. system in relation to others. Mario J. Azevedo See also
Alternative medicine; Health care in Canada; Health care in the United States; Medicine.
■ Heat wave of 1980 The Event Severe heat and drought Date June-September, 1980 Place Southern and southeastern United States
The high temperatures and low rainfall afflicting a large part of the United States in the summer of 1980 cost many human lives and led to enormous agricultural losses. In late June, 1980, high atmospheric pressure developed over the southern plains, deterring rain and driving seasonably high temperatures significantly higher. On June 23, the temperature in Dallas/Fort Worth reached 104 degrees Fahrenheit, and the temperature reached at least 100 degrees Fahrenheit every day thereafter through August 3. In Wichita Falls,
Texas, the high temperature each day from June 24 through July 3 was 110 degrees Fahrenheit or more, with a maximum of 117 degrees Fahrenheit on June 28—the highest temperature ever recorded there. All told, Wichita Falls had highs of 100 degrees Fahrenheit or above on seventy-nine days in 1980. The heat wave also hit Oklahoma and Kansas. Oklahoma City recorded highs of at least 100 degrees Fahrenheit fifty times that summer and, with an average July temperature of 88.3 degrees Fahrenheit, tied the mark it had set in 1934 for the hottest July in the city's history. Tulsa had thirty days on which its minimum temperature tied or set a record high, with a low on July 16 of 87 degrees Fahrenheit, the highest minimum ever for the city. Across the Kansas state line, Wichita also had a severe summer, with three highs of 110 degrees Fahrenheit and a high of 112 degrees Fahrenheit on July 12. During July, the unusual heat spread east. The high in Little Rock, Arkansas, in the seven days from July 12 through July 18, was never less than 106 degrees Fahrenheit. In Tennessee, Nashville set a record high for July 16 with a temperature of 104 degrees Fahrenheit. On the Gulf Coast, the temperature in Pensacola, Florida, set record highs for the days from July 7 through July 14, with a reading of 106 degrees Fahrenheit on the 14th. On the Atlantic coast, Charleston, South Carolina, had a high of 100 degrees Fahrenheit on July 13. Even areas north of those with the highest temperatures fell victim indirectly to the heat wave because of the giant derecho—a cluster of thunderstorms with strong winds—it produced from the night of July 4 well into the afternoon of the next day. This storm cluster moved from southwestern Iowa and northwestern Missouri through several states, until it reached Virginia and Maryland on the Atlantic coast. In all, it killed six persons and injured sixty-seven others. Impact With the drought that accompanied it, the heat wave produced enormous agricultural damage, hurting such crops as corn, soybeans, and cotton and killing or debilitating huge numbers of cattle and chickens. In all, farms, ranches, and related businesses lost twenty billion dollars. The direct human toll, however, was the worst effect. Although determining the number of deaths produced by a heat wave is difficult, estimates for the 1980 heat wave range from 1,250 to 10,000 dead.
Further Reading
National Oceanic and Atmospheric Administration. National Weather Service Web site. http://www.weather.gov. Stein, Paul. The Macmillan Encyclopedia of Weather. New York: Macmillan Reference, 2001. Victor Lindsey See also
Agriculture in the United States; Business and the economy in the United States; Cold Sunday; El Niño; Environmental movement; Farm Aid; Farm crisis; Natural disasters.
■ Heaven's Gate Identification American film Director Michael Cimino (1939- ) Date Released November 19, 1980
Significance Heaven's Gate became one of the most famous box-office flops in film history. The film's well-publicized demise helped end the 1970's trend of young directors being given significant control of their films. It also contributed to the demise of the United Artists studio.
Heaven's Gate dramatized the 1892 Johnson County, Wyoming, range war, in which rich cattle ranchers, with the alleged approval of the U.S. government, hired a mercenary army to kill immigrant farmers homesteading on their grazing lands. Kris Kristofferson played Jim Averill, a Harvard-educated marshal vainly trying to prevent the bloodshed. With a cast featuring Christopher Walken, John Hurt, Sam Waterston, Jeff Bridges, and French actress Isabelle Huppert, the gritty and violent film discarded many standard Western conventions yet featured breathtaking sequences like the Harvard graduation ball, a frontier dance on roller skates, and the climactic battle between immigrants and mercenaries. Director Michael Cimino, who had won an Academy Award for The Deer Hunter (1978), was obsessed with detail, sending production costs soaring to $36 million and triggering prerelease press denouncing the large amount of money being spent on a single film. When the film finally premiered, American critics condemned the three-hour, thirty-nine-minute epic as pretentious, overlong, incoherent, self-indulgent, and even un-American. United Artists, which had financed the film, immediately withdrew it from release. While Cimino reedited the film in an attempt to salvage it, the print and television media relentlessly ridiculed him and his movie, creating a permanent impression in the public's mind that it was one of the worst films ever made. Rereleased in early 1981 in a two-and-one-half-hour version, Heaven's Gate earned only $1.5 million at the box office and made Westerns unfashionable for a decade. By 1981, Americans, deflated by recession and the traumas of Watergate and Vietnam, no longer wanted films that questioned the myths that had built America. Escapist hits like Star Wars (1977) and Raiders of the Lost Ark (1981) presented a more appealing and comforting world of clear good and evil, presaging the jingoistic machismo and outsized patriotism of the Reagan era.
Director Michael Cimino at the Berlin Film Festival in 1979. (AP/Wide World Photos)
Impact The failure of Heaven's Gate prompted Transamerica, United Artists' parent company, to sell the studio to Metro-Goldwyn-Mayer, which incorporated the studio's properties but did away with its name, thus creating a myth that one film had destroyed a movie studio. It also ended the so-called second golden era of Hollywood filmmaking, during which directors were allowed to make big-budget films with minimal studio interference. Instead, movie companies shifted creative control away from directors and toward studio executives, who were often less willing to produce films whose subject matter was either unfamiliar or controversial. Further Reading
Bach, Steven. Final Cut: Art, Money, and Ego in the Making of “Heaven’s Gate,” the Film That Sank United Artists. New York: Newmarket Press, 1999. Wood, Robin. Hollywood from Vietnam to Reagan . . . and Beyond. New York: Columbia University Press, 2003. Richard Rothrock See also
Bridges, Jeff; Epic films; Film in the United States.
■ Heavy metal Identification Rock music genre
Heavy metal achieved mainstream success in the 1980's. It both appealed to a wide range of the decade's listeners and influenced a generation of musicians. The term "heavy metal" was derived from Steppenwolf's 1968 hit song "Born to Be Wild," but Black Sabbath's 1970 eponymous album served as the starting point of the new music genre. The album's dark tone, guitar riffs, and haunting vocals captured not only Black Sabbath's industrial roots in working-class Birmingham, England, but also the morose mood of many Americans at the end of the 1960's. Subsequent bands expanded Black Sabbath's sound, but at the end of the 1970's, many listeners abandoned hard rock music. Years of overproduction, unrealized expectations, and internal discord had led to the declining popularity of bands such as Deep Purple, Black Sabbath, and Kiss. These influential bands survived the decade in one form or another, but the following generation of musicians in the 1980's crafted a harder and more powerful sound than their predecessors. Growth and Maturity
In the early 1980’s, American listeners embraced British heavy metal groups such
as Judas Priest, Iron Maiden, Motörhead, and Def Leppard. This second wave led to the founding and success of fledgling American heavy metal groups, including Metallica, Mötley Crüe, Van Halen, and Quiet Riot. Exposure on cable television channel MTV, especially through its weekly program Headbangers Ball, and extensive touring allowed these groups to achieve commercial success. Alienated youth, working-class Americans, and white teenage males in particular found refuge in the themes and sounds of heavy metal. Long hair, often in the form of mullets, made heavy metal fans easy to identify. Throughout the 1980's, as the genre proliferated, it also fragmented. The styles, sounds, and subject matter of the various heavy metal groups diverged. Groups such as Van Halen and Def Leppard typified pop, or light, metal, which mixed heavy guitar riffs and melodic lyrics with themes of love, happiness, and sexual gratification. Glam, or hair, metal bands such as Poison and Mötley Crüe mixed these themes with androgyny, initially dressing themselves to resemble women. Speed, or thrash, metal bands such as Metallica and Slayer accelerated the tempo of the music and focused on themes of destruction, religion, and death. Speed metal groups appeared in concerts wearing simple street clothes, thus better connecting to their fans. Finally, despite their changing sounds, 1970's groups such as Judas Priest and Deep Purple became known as classic metal groups. Some other groups were harder to classify, including perhaps the most controversial heavy metal group of the 1980's, Guns n' Roses. The group's 1987 debut album, Appetite for Destruction, sold over 15 million copies, surpassing all other heavy metal albums, except for AC/DC's 1980 album Back in Black, which sold over 21 million records. Criticism and Decline Heavy metal music was not well received by fundamentalist religious groups, concerned parents, and politicians. Many Americans saw it as a corrupter of youth; it was blamed for inciting murder, encouraging suicide, and promoting Satanism. In 1985, Tipper Gore, the wife of freshman senator Al Gore of Tennessee, cofounded the Parents' Music Resource Center to protect youth from offensive music. The group charged that listening to heavy metal music led teens to sex, violence, drug and alcohol use, and the occult. Following congressional hearings, record companies agreed to put warning stickers on potentially offensive albums.
456
■
The Eighties in America
Heavy metal
Selected 1980's Heavy Metal Albums
1980: British Steel (Judas Priest); Iron Maiden (Iron Maiden); On Parole (Motörhead); On Through the Night (Def Leppard); Women and Children First (Van Halen); Back in Black (AC/DC)
1981: Blizzard of Ozz (Ozzy Osbourne); Point of Entry (Judas Priest); Killers (Iron Maiden); High 'n' Dry (Def Leppard); Too Fast for Love (Mötley Crüe); Fair Warning (Van Halen); For Those About to Rock We Salute You (AC/DC); Diary of a Madman (Ozzy Osbourne)
1982: Screaming for Vengeance (Judas Priest); The Number of the Beast (Iron Maiden); Iron Fist (Motörhead); Diver Down (Van Halen); Speak of the Devil (Ozzy Osbourne)
1983: Piece of Mind (Iron Maiden); Another Perfect Day (Motörhead); Pyromania (Def Leppard); Kill 'Em All (Metallica); Shout at the Devil (Mötley Crüe); Metal Health (Quiet Riot); Look but Don't Touch (Poison); Show No Mercy (Slayer); Flick of the Switch (AC/DC); Bark at the Moon (Ozzy Osbourne)
1984: Defenders of the Faith (Judas Priest); Powerslave (Iron Maiden); Ride the Lightning (Metallica); 1984 (Van Halen); Condition Critical (Quiet Riot); Perfect Strangers (Deep Purple)
1985: Theatre of Pain (Mötley Crüe); Fireworks (Deep Purple); Hell Awaits (Slayer); Fly on the Wall (AC/DC)
1986: Turbo (Judas Priest); Somewhere in Time (Iron Maiden); Master of Puppets (Metallica); 5150 (Van Halen); QR III (Quiet Riot); Look What the Cat Dragged In (Poison); Reign in Blood (Slayer); Who Made Who (AC/DC); The Ultimate Sin (Ozzy Osbourne); Orgasmatron (Motörhead)
1987: Rock 'n' Roll (Motörhead); Hysteria (Def Leppard); Girls, Girls, Girls (Mötley Crüe); The House of Blue Light (Deep Purple); Appetite for Destruction (Guns n' Roses)
1988: Ram It Down (Judas Priest); Seventh Son of a Seventh Son (Iron Maiden); . . . And Justice for All (Metallica); OU812 (Van Halen); Quiet Riot (Quiet Riot); Open up and Say . . . Ahh! (Poison); Nobody's Perfect (Deep Purple); South of Heaven (Slayer); Blow up Your Video (AC/DC); No Rest for the Wicked (Ozzy Osbourne); G n' R Lies (Guns n' Roses)
1989: Dr. Feelgood (Mötley Crüe)
The press and critics also attacked heavy metal. The 1984 mockumentary film This Is Spinal Tap spoofed heavy metal musicians. More seriously, in 1985, Los Angeles police caught Richard Ramirez, the so-called Night Stalker serial killer. He had allegedly left an AC/DC hat at one of his crime scenes, and newspapers quickly reported that the heavy metal group might have inspired the murders. During the mid-1980's, both Judas Priest and Ozzy Osbourne faced accusations that their songs contained subliminal messages that had led to the suicides of several troubled teenagers. In both instances, the musicians were cleared. In 1989, the National Academy of Recording Arts and Sciences created a Grammy Award category for heavy metal, but the inaugural award in that category went to British rock group Jethro Tull instead of favored first-time American nominee Metallica.
The excesses of the heavy metal musicians themselves were just as damaging to the genre's reputation. Music that celebrated drug and alcohol use, violence, and sexual promiscuity often was mirrored in the offstage self-destructive behavior of band members. An overabundance of heavy metal bands also saturated the market, decreasing album sales for individual bands. By the end of the decade, heavy metal had receded from the high-water mark of its mainstream popularity. The arrival of grunge music a few years later forced many groups back into the underground music scene.
Impact During the 1980’s, heavy metal blossomed into a number of unique musical styles and sounds. Their fast-paced rhythms, guitar solos, and provocative lyrics proved to be powerful alternatives to the decade’s pop music. Heavy metal influenced music, culture, and fashion of the 1980’s and beyond. Further Reading
Christe, Ian. Sound of the Beast: The Complete Headbanging History of Heavy Metal. New York: HarperCollins, 2004. Valuable charts and insightful comments make this an exhaustive and accessible account of 1980’s heavy metal. Konow, David. Bang Your Head: The Rise and Fall of Heavy Metal. New York: Three Rivers Press, 2002. Numerous interviews give this overview of 1980’s heavy metal a first-person perspective. Popoff, Martin. The Eighties. Vol. 2 in The Collector’s Guide to Heavy Metal. Toronto: Collector’s Guide, 2005. Reviews hundreds of 1980’s heavy metal albums, with commentary and cross-references. Strong, Martin C. The Great Metal Discography. Edinburgh: Canongate Books, 1998. The most detailed listing of band histories, recordings, and chart positions for heavy metal groups. Walser, Robert. Running with the Devil: Power, Gender, and Madness in Heavy Metal Music. Hanover, N.H.: Wesleyan University Press, 1993. Scholarly study of heavy metal’s appeal, significance, and meaning. Weinstein, Deena. Heavy Metal: The Music and Its Culture. Rev. ed. New York: DaCapo Press, 2000. Sophisticated analysis of heavy metal culture, focused on the fans, musicians, lyrics, and myths. Aaron D. Purcell See also
Androgyny; Guns n' Roses; Mötley Crüe; MTV; Mullet; Music; Music videos; Night Stalker case; Osbourne, Ozzy; Pop music; This Is Spinal Tap; Van Halen.
■ Heidi Chronicles, The Identification Award-winning play Author Wendy Wasserstein (1950-2006) Date Produced in 1988
Wendy Wasserstein’s award-winning and best-known play explores women of the baby-boom generation, the women’s movement in the 1980’s, and how many women’s views of success differ from those of many men.
Playwright Wendy Wasserstein. (Courtesy, Dartmouth College)
The Heidi Chronicles follows the life of Heidi Holland from her promising 1960’s high school days to her experiences with the consciousness-raising women’s groups of the 1970’s to her life as a single woman trying to have it all in the 1980’s. Throughout, Heidi’s understanding of and search for equality and feminist ideals are tested. Well-educated yet unsure how to make herself happy, Heidi has a no-commitment affair with magazine editor Scoop Rosenbaum while leaning heavily on her best friend, gay pediatrician Peter Patrone. Getting older and feeling alone, she decides to adopt a baby and attempt single parenting, hoping it will bring her the fulfillment and companionship for which she longs. The most telling discussion of Heidi’s views on feminist principles—often taken to be those of Wasserstein as well—occurs toward the latter part of the play in a speech Heidi gives discussing both her work as an art historian and her view of her own life. Heidi became an independent woman, as the feminist movement of the 1960’s and 1970’s encouraged, but
by doing so, she has come to feel isolated and alone. She admonishes the crowd by telling them that she feels stranded, when she thought the whole point was that women wouldn't feel stranded; rather, they would support one another and feel closer as a gender. The 1980's may have brought liberation, but that liberation did not bring contentment. While many praised Wasserstein's play for its unflinching examination of mainstream American feminism and Heidi's struggle to define herself, others felt her attack on the women's movement and joke-ridden dialogue betrayed the importance of women moving forward as both mothers and workers. Others felt that by creating a character that is lost and often voiceless, Wasserstein put too much credence in the points of view expressed by Scoop and Peter, two men. Wasserstein herself countered that Heidi, and indeed most women in the 1980's, were lost and voiceless and that the definition of "success" was not the same for women as it was for men. Whatever disagreements the play presented among critics, audiences were very receptive to a contemporary feminist play and agreed that Wasserstein's Heidi was a fitting representation of baby-boom women and their fears that success in the workplace would not be enough for a happy life.
Impact Wasserstein's play opened starring Joan Allen, Boyd Gaines, and Peter Friedman and won the Pulitzer Prize for drama, the Tony Award, the Drama Critics Circle Award, and scores of other honors. It was eventually made into a television movie starring Jamie Lee Curtis, Tom Hulce, and Peter Friedman that was seen by—and influenced—a much wider audience. Further Reading
Austin, Gayle. "The Heidi Chronicles (Review)." Theatre Journal, March, 1990, 107-108. Balakian, Jan. "The Heidi Chronicles: The Big Chill of Feminism." South Atlantic Review 60, no. 2 (May, 1995): 93-101. Ciociola, Gail. Wendy Wasserstein: Dramatizing Women, Their Choices, and Their Boundaries. Jefferson, N.C.: McFarland, 1998. Tom Smith See also Big Chill, The; Feminism; Theater; Women in the workforce; Women's rights.
■ Henley, Beth Identification American playwright Born May 8, 1952; Jackson, Mississippi
Beth Henley. (AP/Wide World Photos)
After receiving the Pulitzer Prize for Crimes of the Heart, her first professionally produced play, Beth Henley emerged in the early 1980's as a significant new playwright whose work enlarged upon the contributions of earlier writers from the American South. In addition to their treatment of traditional Southern Gothic themes, her plays have been noted for presenting resilient women characters. Crimes of the Heart (pr. 1979, pb. 1982), which opened Off-Broadway in 1980, met with both critical and popular success. Set in a small southern town, it depicts three adult sisters who struggle against social repression in the form of gossipy relatives and neighbors, violent chauvinists, and racists—all of whom became familiar elements in Henley's successive plays. With Crimes of the Heart, Henley became the first woman in twenty-two years to win the Pulitzer Prize; the play also earned the New York Drama
Critics Circle Award. In 1986, Henley contributed the screenplay for a film adaptation of Crimes of the Heart, directed by Bruce Beresford and starring fellow playwright Sam Shepard. Henley's The Miss Firecracker Contest (pr. 1980, pb. 1985) continued in the same vein as the playwright's previous work—tragicomedy with bizarre characters set in small-town Mississippi—but with a stronger concentration on the grotesque. The longing for spiritual fulfillment that was touched upon in Crimes of the Heart was amplified in the later play. The film version, Miss Firecracker, was produced in 1988, with Holly Hunter re-creating for the screen her Off-Broadway performance as Carnelle, the beauty pageant contestant in search of "eternal grace." Although Henley contributed the screenplays for the film adaptations of both plays, Miss Firecracker is generally regarded as the better film, while Crimes of the Heart is considered the superior play. Henley has admitted the influence of Anton Chekhov, whom she has cited as a favorite playwright, and Crimes of the Heart has been compared to Chekhov's Tri sestry (pr., pb. 1901; The Three Sisters, 1920). Also, because of the presence of the decadent and grotesque in her work, Henley has been compared to fellow southerner Tennessee Williams. Some critics have also noted a resemblance between Henley's writing and that of Flannery O'Connor, though Henley has said she had never read O'Connor until after those comparisons had been drawn. Henley is frequently mentioned alongside contemporary playwrights such as Shepard, Marsha Norman, and Tina Howe. Henley's other plays include The Wake of Jamey Foster (pr. 1982, pb. 1983), The Debutante Ball (pr. 1985, pb. 1991), The Lucky Spot (pr. 1986, pb. 1987), and Abundance (pr. 1990, pb. 1991). Impact Crimes of the Heart and The Miss Firecracker Contest were Henley's most significant works of the 1980's. They made her a powerful voice of southern women's experience, bringing a viewpoint that was both regionally rooted and uniquely her own to the American theater. Further Reading
Bigsby, C. W. E. Modern American Drama, 1945-2000. Cambridge, England: Cambridge University Press, 2000. Bryer, Jackson R., ed. The Playwright’s Art: Conversations with Contemporary American Dramatists. New Brunswick, N.J.: Rutgers University Press, 1995.
Harbin, Billy J. "Familial Bonds in the Plays of Beth Henley." The Southern Quarterly 25, no. 3 (Spring, 1987): 81-94. Henley, Beth. "Interview with Beth Henley." Interview by John Griffin Jones. In Mississippi Writers Talking, edited by John Griffin Jones. Jackson: University Press of Mississippi, 1982. Thad Cockrill See also Film in the United States; Literature in the United States; Shepard, Sam; Theater; Women's rights.
■ Heritage USA Definition Complex housing recreational, commercial, and residential facilities, as well as the PTL television ministry offices and studios Date In operation from 1978 to 1989 Place Outside Fort Mill, South Carolina
Heritage USA's twenty-three hundred acres included a theme park, a water park, campgrounds, a shopping center, and homes. For many critics of televangelism and the abuses that attended it during the 1980's, the complex epitomized the fraudulent excesses of evangelical Christian preachers who had been gradually shifting their attentions from traditional ministry to valueless consumerism.
Jim Bakker's vision for Heritage USA represented the pinnacle of his dreams for a ministry he had been developing, together with his wife Tammy Faye Bakker, since the 1960's. The two former Bible college students were pioneers in the development of televised ministries with Christian networks. They began with Pat Robertson's Christian Broadcasting Network (CBN), then helped create the Trinity Broadcasting Network (TBN) with Paul and Jan Crouch, and finally started their own PTL (Praise the Lord, or People That Love) network. Viewers of the Bakkers' show on PTL became enthusiastic supporters of the likable and friendly couple, sending checks to help achieve the ministry's fund-raising goals. When Bakker offered incentives connected with the theme park and resort he was constructing, supporters upped their donations and became "partners" in the potentially lucrative investment. In exchange for a donation of $1,000, for example, investors were to receive free nights at one of the park's hotels and discounts at its shops.
The completed project, Heritage USA, included office and studio space for the Bakkers’ television production company; a campsite; hotels, motels, and time-share condos; a water park; and the popular theme park, complete with shops, activities, and biblically themed games. The park featured a Main Street area meant to evoke small-town American life, a place set apart from the increasingly secularized world in which visitors spent the rest of the year. The park also included the Upper Room building, where visitors could view models arranged to resemble famous portrayals of the Last Supper. However, the complex did not include a sufficient number of hotel rooms for the Bakkers to make good on their promise of free stays for every thousand-dollar donor. Angry donors filed lawsuits when the ministry reneged on its promises. By the end of the 1980’s, it had been discovered that Bakker was stealing money from his own ministry. He was convicted of fraud and imprisoned, and Heritage USA was closed. Impact
Heritage USA was marketed to a large group of American Christians who sought alternatives to traditional vacation packages that would better align with the values they espoused. It represented an early realization of the potential for the growing industry of Christian entertainment. The demise of both the theme park and the larger entertainment complex contributed to ongoing criticism not only of televangelism but also of the greed that often accompanied it.
Further Reading
O’Guinn, Thomas C., and Russell W. Belk. “Heaven on Earth: Consumption at Heritage Village, USA.” Journal of Consumer Research 16 (1989): 227ff. Presents a compelling study of visitors to the theme park and looks at the interesting intersections of sacred and secular values. Shepard, Charles E. Forgiven: The Rise and Fall of Jim Bakker and the PTL Ministry. Rev. ed. New York: Atlantic Monthly Press, 1991. Biographical study by the award-winning Charlotte Observer reporter whose reporting contributed to Bakker’s downfall. Jennifer Heller See also
Bakker, Jim and Tammy Faye; Religion and spirituality in the United States; Televangelism.
■ Herman, Pee-Wee Identification Comedic children's character Creator Paul Reubens (1952- ) Date Introduced in 1981
The creation of comedian Paul Reubens, Pee-Wee Herman was a successful character onstage, in television specials and guest appearances, and on film in the first half of the 1980's. In the decade's second half, he was featured on the only live-action Saturday morning children's network television show. Sporting a tight, gray, houndstooth suit, a red bow tie, a fastidiously manicured crew cut, and an infectious laugh, the character Pee-Wee Herman became a tremendous success in the 1980's. Herman was the creation of Paul Reubens, a comedian working with the Groundlings improvisational troupe in Hollywood, California. The character appealed to both adults and children, and he was featured onstage, in film, and on television during the decade. Joined by Phil Hartman and other Groundlings, Reubens created and performed The Pee-Wee Herman Show (pr. 1981). The show featured Pee-Wee, an adult man who acted like a five-year-old as he welcomed the audience to his whimsical and kitschy playhouse. The show successfully ran for five sold-out months at Hollywood's Roxy Theater, attracting the attention of several producers. Cable television channel Home Box Office (HBO) aired one of the performances on September 11, 1981, and Reubens and Hartman sold the screenplay for a film, Pee-Wee's Big Adventure, shortly thereafter. Pee-Wee's Big Adventure (1985) reached the big screen under director Tim Burton. It followed Pee-Wee as he searched the country for his beloved stolen bike. Along the way, he met several eccentric people whom he helped through crises, and he performed the Pee-Wee Herman dance in platform shoes to "Tequila," a performance that became an instant hit across America. With the movie's success, the Columbia Broadcasting System (CBS) took a gamble on the lovable, bizarre, and somewhat annoying character, airing the first episode of Pee-Wee's Playhouse in 1986, at a cost of $325,000 per episode. Pee-Wee's Playhouse was set in a colorful playhouse that contained talking furniture, appliances, animals, and a wish-granting genie. The show featured cartoon and claymation segments, puppets, and live guests, including Laurence Fishburne and Hartman.
Pee-Wee Herman, right, appears in a scene from Pee-Wee’s Playhouse. (AP/Wide World Photos)
The show was a huge hit with children and adults, running five years, earning fifty-nine Emmy nominations, and winning sixteen of the awards. Pee-Wee's popularity continued with a second film, Big Top Pee-Wee (1988), which was set on Planet Pee-Wee. In the movie, Pee-Wee, a farmer whose talking animals make pancakes and get tucked into beds at night, suddenly finds himself hosting a circus that lands on his farm after being swept up by a hurricane. Pee-Wee acquires a love interest and participates in the longest cinema kiss of its time. Some criticized Reubens for allowing Herman to partake in an adult relationship, while others claimed that it was what the character needed to attract a larger audience. Impact Pee-Wee Herman charmed children and adults alike throughout the 1980's by allowing children to relate to him on a personal level, while adults
lived vicariously through his character, who refused to grow up. In an era obsessed with achievement and advancement, Pee-Wee Herman offered an escape into absurdity, fantasy, and eternal childhood. Further Reading
Burke, Timothy, and Kevin Burke. Saturday Morning Fever: Growing Up with Cartoon Culture. New York: St. Martin’s Griffin, 1999. Neuwirth, Allan. Makin’ Toons: Inside the Most Popular Animated TV Shows and Movies. New York: Allworth Press, 2003. Rettenmund, Matthew. Totally Awesome 80’s: A Lexicon of the Music, Videos, Movies, TV Shows, Stars, and Trends of That Decadent Decade. New York: St. Martin’s Griffin, 1996. Sara Vidar See also
Children’s television; Comedians; Film in Canada; Film in the United States; Television.
■ Hershiser, Orel Identification American baseball pitcher Born September 16, 1958; Buffalo, New York
Orel Hershiser was one of the top starting pitchers in the major leagues in the last half of the 1980's. Nicknamed "Bulldog" by Los Angeles Dodgers' manager Tommy Lasorda, mild-mannered Orel Hershiser developed into a consistent, determined, and formidable pitcher. In 1984, his first full year playing Major League Baseball, the right-hander made the conversion from relief work to starting assignments. The following year, his record was a spectacular nineteen wins to only three losses, the best in the majors. Hershiser, along with the rest of the fine Dodger pitching staff, helped his team win the Western Division of the National League. In 1988, Hershiser had one of the best seasons in major-league pitching history. With twenty-three wins that season, he led the National League. He also tied for the most complete games and the most shutouts in either league. Hershiser ended the regular season amid great fanfare, because he had a chance to beat Don Drysdale's seemingly untouchable record of fifty-eight and two-thirds consecutive scoreless innings pitched. After his fifth straight shutout, Hershiser was only nine and two-thirds innings shy of the record, but he had only one regular-season starting opportunity left. Remarkably, on September 28, 1988, Hershiser held the San Diego Padres scoreless for ten innings to break Drysdale's record. In his first playoff appearance, moreover, he pitched eight more shutout innings. Hershiser's overall playoff performance in 1988 was equally remarkable. He saved one game against the New York Mets, then shut the Mets out in the climactic seventh game to win the National League pennant. The 1988 World Series became known for Kirk Gibson's game-winning home run in game one. Hershiser, however, shut out the Oakland A's in game two, holding them to only three hits. He went on to win the clinching game of the series, yielding only four hits in that contest. He was named the most valuable player of both the National League Championship Series and the World Series. In addition to winning the 1988 Cy Young Award and a Gold Glove, Hershiser was selected as the Associated Press (AP) Professional Athlete of the Year, as well as being named Sports Illustrated's Sportsman of the Year.
In three seasons in the 1980's, Hershiser led the National League in innings pitched, testifying to his durability. Three times, his earned run average was the third best in the National League, and once it was second. Only teammate Fernando Valenzuela and the Toronto Blue Jays' Dave Stieb hurled more shutouts in the 1980's than did Hershiser. The Dodgers' hitting was often mediocre, a fact that is reflected in Hershiser's win/loss record, but his consistency made him a feared opponent. Impact Orel Hershiser's pitching performance in 1988 was one of the best on record. He helped lead the Los Angeles Dodgers to their second World Series triumph of the decade. His talent and his gentlemanly respect toward the press and fans made him a role model in a decade during which professional baseball experienced much turmoil and role models were in demand. Further Reading
Hershiser, Orel, and Jerry B. Jenkins. Out of the Blue. Rev. ed. New York: Berkeley, 1990. Hershiser, Orel, and Robert Wolgemuth. Between the Lines: Nine Principles to Live By. New York: Warner Books, 2001. Stout, Glenn. The Dodgers: 120 Years of Dodgers Baseball. Boston: Houghton Mifflin, 2004. M. Philip Lucas See also
Baseball; Baseball strike of 1981; Gibson, Kirk; Sports; Valenzuela, Fernando.
■ Hill Street Blues Identification Television drama series Creators Steven Bochco (1943- ) and Michael Kozoll (1940- ) Date Aired from January 15, 1981, to May 12, 1987
Hill Street Blues built its success on a large ensemble cast. Portraying an interconnected set of dramas involving the lives of its many regular characters—based in and around an urban police station—the show captured many of the complexities and contradictions of the 1980’s, which it represented as an age of economic recession and inequality; institutional corruption, indifference, and inadequacy; racism; and personal frustration.
Hill Street Blues was a classic example of what has come to be known as "quality television," featuring serious (although also frequently comic and even absurd) plotlines, multilayered characters, and unresolved story arcs. Expanding far beyond the conventions of the action-oriented and often conservative formula of "cops versus robbers" that characterized most police dramas, Hill Street Blues attempted to portray a thick slice of contemporary life. The police station at the heart of the show became a meeting place for all sectors of society and a microcosm of the human struggle for love, dignity, justice, and, at the very least, safety. At the heart of many of the show's episodes was an experience that some consider to have been a defining mark of the 1980's, that of being overwhelmed, whether by criminal assaults and robberies, by unruly social forces and prejudices, by bureaucratic foul-ups, by institutionalized indignities, or by uncontrollable personal impulses. Given this thematic backdrop, the successes experienced by the show's characters were often modest, especially compared to the happiness quotient portrayed on most television programs. However, each assertion of order over chaos, charity over hate and indifference, security over danger, or dignity over dehumanization, was hard-earned and precious. From beginning to end, Hill Street Blues stayed true to the stoic advice of Alcoholics Anonymous that it often invoked, dramatizing the wisdom of living day by day. It also embodied the essence of the "blues" referred to by the show's title, telling stories of tremendous hardship and pain. Each week, however, the progress from morning roll call to evening wrap-up celebrated the resiliency of the show's characters and the bittersweet triumph of survival. There were no unsullied heroes in Hill Street Blues, as even the most admirable of the show's characters were flawed. This narrative and thematic truth was emphasized by the distinctive and original visual style of the series, which relentlessly presented characters in candid views. Throughout the series, the camera often moved through space to follow the main characters, making them seem more like atoms than icons and creating a disorienting sensation much different from the visual stability of typical television dramas. The characters were shown not only in public, with their best face on, but also in private, revealing their insecurities, personality quirks, faulty judgments, and all-too-human weaknesses. In the world of Hill Street Blues, though, being all-too-human is not a stigma but a virtue; recognizing this quality is a necessary step in finding the all-important balance that keeps one from being too hard on oneself, as well as being too hard on the people one is surrounded by, almost all of whom are victims as well as victimizers.
Individuals stood out in Hill Street Blues, each embodying issues and tensions frequently debated in 1980's culture. Among the most memorable of these characters were Frank Furillo (Daniel J. Travanti), the precinct captain trying to be a figure of fairness and sensitivity as well as of law and order; Joyce Davenport (Veronica Hamel), a woman of stunning beauty, poise, and privilege, apparently out of her element in her professional role as a public defender; and Mick Belker (Bruce Weitz), a detective and growling loner who was constantly drawn into compassionate relationships with doomed characters. Hill Street Blues, though, was ensemble drama at its best, a format well suited to the show's vision of the world as an ecological system of intimately related parts sharing not only vulnerability and pain but also strength, joy, and responsibility for one another. The show's vision stood in stark contrast to the emphasis of Ronald Reagan's White House on rugged self-reliance, justifiable inequality, and stern authority. Impact Hill Street Blues was not popular in its first season. However, at the end of that season, it won a record-setting eight Emmy Awards, ensuring both its renewal and an influx of new viewers. It remained a critical success throughout its run, although it was never a top-rated hit. Still, Hill Street Blues became extremely influential as a model for other shows. Its visual style, emphasis on a large ensemble, use of music, and narrative structure were all imitated by other shows during the rest of the decade and beyond. The decision of the National Broadcasting Company (NBC) to keep the show on the air for six seasons demonstrated the cultural potential of a medium more often than not dominated by stale conventions, censorship, consumerism, and an unambitious definition of what is entertaining and provocative. Further Reading
Gitlin, Todd. “Hill Street Blues: Make It Look Messy.” In Inside Prime Time. Rev. ed. Berkeley: University of California Press, 2000. Schatz, Thomas. “Hill Street Blues: U.S. Police Procedural/Melodrama.” Museum of Broadcasting.
www.museum.tv/archives/etv/H/htmlH/hillstreetb/hillstreetb.htm. Thompson, Robert J. Television's Second Golden Age: From "Hill Street Blues" to "ER." Syracuse, N.Y.: Syracuse University Press, 1997. Sidney Gottlieb
See also
African Americans; Cagney and Lacey; Crime; Demographics of the United States; Domestic violence; Gangs; L.A. Law; Latinos; Marriage and divorce; Organized crime; Reaganomics; St. Elsewhere; Television; Women in the workforce.
■ Hip-hop and rap
Definition
African American cultural forms
Beginning in the late 1970's, hip-hop culture and rap music began to diversify, influencing dress, music, language, and film in the United States. At the same time, the culture and music began to make inroads in European and Asian countries. Although the first major rap hit single was "Rapper's Delight," released by the Sugar Hill Gang in 1979, hip-hop culture in the form of baggy pants and sweatshirts, artistic graffiti, and break dancing had been a part of teenage life in the South Bronx of New York City since the early 1970's. While break dancing and graffiti would soon become passé within the United States, all the other elements of hip-hop would be widely adopted and perceived as definitive aspects of the culture. Neither hip-hop culture in general nor rap in particular, however, would have become the international phenomenon they both are without a number of technological and cultural developments that facilitated their growth beyond their local place of origin. Chief among the technological developments was the boom box, or "ghetto blaster," an enlarged version of the earlier transistor radio. Unlike the Sony Walkman, which hit the market in 1979, the boom box, a creation of the mid-1970's, permitted teenage pedestrians to emphasize the public, if not communal, nature of the music. More important, the boom box became the primary mode of publicity for rap singles; it was the technological equivalent of word-of-mouth news. Among the cultural developments, the debut of the cable channel MTV (Music Television) in 1981 was the decisive factor in the national and international emergence of hip-hop culture and rap music as dominant economic and social forces. Yet, even as hip-hop culture and rap music began to garner national attention in the 1980's, divisions within the communities began to emerge, leading to the so-called East Coast/West Coast split in the late 1980's.
Although many commentators on hip-hop culture see the release of Yo! Bum Rush the Show (1987) and It Takes a Nation of Millions To Hold Us Back (1988) by East Coast-based Public Enemy as defining the political stance against which Straight Outta Compton (1988) by the West Coast-based N.W.A. was the defiant retort, this is only partly the case. From its inception, rap music was riddled by rapas-politics versus rap-as-entertainment divisions. On one hand, the most popular form of rap in the early 1980’s (and beyond) was “party” rap, exemplified by “Rapper’s Delight.” The Sugar Hill Gang began a trend that DJ Jazzy Jeff and the Fresh Prince (Will Smith), the Beastie Boys, Run-D.M.C., Kid ‘n’ Play, and others would follow: rap as pure entertainment. At the same time, Grandmaster Flash and the Furious Five’s smash hit “The Message” indicated that another segment of the rap public was willing to engage social and political issues. KRS-One, Eric B. and Rakim, and others would join Public Enemy in insisting on socially responsible music and lyrics. The emergence of gangsta rap in the mid-1980’s, signaled by N.W.A., Schoolly D, Ice-T, Dr. Dre, and others, opened up a third “thug” front even as Queen Latifah, Roxanne Shanté, and Salt-n-Pepa were formulating a “feminist” fourth front in the music. The analogues to these divisions in rap music were also evident in break dancing and graffiti. Both had utilitarian and aesthetic poles. Break dancing contests were originally used by street gangs as an alternative to violence. Graffiti was once used to convey public messages to other gangs, usually in the form of boasts, threats, and elegies. Once break dancing was popularized as an aesthetic form of expression, however, largely via rap videos, its connection to gangs was jettisoned; it simply became another dance trend. Meanwhile, graffiti became a mode of artistic expression, and many graffiti artists—including some former gang members— found themselves being invited to display their work in the art galleries of Manhattan. Among artists
466
■
The Eighties in America
Hip-hop and rap
Selected 1980's Rap Songs
1982: "Planet Rock" (Afrika Bambaataa); "The Message" (Grandmaster Flash)
1983: "Looking for the Perfect Beat" (Afrika Bambaataa)
1984: "Rock Box," "It's Like That" (Run-D.M.C.)
1985: "I Can't Live Without My Radio" (LL Cool J); "King of Rock," "Rock the House" (Run-D.M.C.); "The Show" (Doug E. Fresh)
1986: "Fight for Your Right," "Hold It Now," "Hit It" (Beastie Boys); "The Source" (Grandmaster Flash); "Walk This Way" (Run-D.M.C., with Steven Tyler and Joe Perry)
1987: "I Need Love" (LL Cool J); "Yo! Bum Rush the Show," "Public Enemy #1" (Public Enemy); "It's Tricky" (Run-D.M.C.); "Boyz-n-the Hood" (N.W.A.); "Tramp" (Salt-n-Pepa)
1988: "Stop the Violence" (Boogie Down Productions); "Follow the Leader," "Move the Crowd" (Eric B. and Rakim); "Going Back to Cali" (LL Cool J); "Bring the Noise," "Prophets of Rage," "Don't Believe the Hype" (Public Enemy); "Gangsta, Gangsta" (N.W.A.); "We Want Eazy" (Eazy-E); "I'm Your Pusher" (Ice-T); "Shake Your Thang" (Salt-n-Pepa); "Lyte as a Rock" (MC Lyte); "Parents Just Don't Understand," "Nightmare on My Street" (Jazzy Jeff and The Fresh Prince); "I'll House You" (Jungle Brothers)
1989: "Jack of Spades," "Why Is That?" (Boogie Down Productions); "Me Myself and I," "Say No Go" (De La Soul); "I'm That Type of Guy" (LL Cool J); "Fight the Power" (Public Enemy); "High Roller" (Ice-T); "Expression" (Salt-n-Pepa); "Come into My House," "Ladies First" (Queen Latifah); "Cha Cha Cha" (MC Lyte); "I Think I Can Beat Mike Tyson" (Jazzy Jeff and The Fresh Prince); "Bust a Move" (Young MC); "Rap Summary (Lean on Me)," "Smooth Operator" (Big Daddy Kane)
Among artists clearly influenced by graffiti in the 1980's, Jean-Michel Basquiat is perhaps the best known. Hip-Hop Film
The emergence of gangsta rap was reinforced by director Brian De Palma's remake and release of Scarface (1983). This blood-splattered melodrama meshed well with the sector of rap music that reveled in violence. The film became such a widespread favorite of real and would-be gangsters that it showed up in comedy skits and in several black gangster films of the following decade, including New Jack City (1991). By contrast, hip-hop films of the early 1980's—such as Krush Groove (1985), Wild Style (1983), Style Wars (1983), and Beat Street (1984)—were relatively simple, rags-to-riches celebrations of hip-hop culture. More important, these films tended to be regional or "black-only" hits. By the end of the decade, however, things had changed. The national commercial success of Spike Lee's Do the Right Thing (1989) and, two years later, John Singleton's Boyz n the Hood (1991), had been foreshadowed by the sleeper hit Colors (1988). All three films offered a blunt, unflinching assessment of African American life delivered over hard-core gangsta rap sound tracks. It was significant that the East Coast/West Coast divisions seemed to play out in the endings of Do the Right Thing and Boyz n the Hood: Lee's Brooklyn-based film focused on racial/ethnic clashes, while Singleton's South Central L.A.-based film dealt with intraracial rivalries. For Lee, the most dangerous threat to African Americans was black-white conflict; for Singleton, it was black-on-black violence. The differences between the two directors seemed to echo the East Coast's emphasis on political consciousness understood as African American cultural nationalism and the West Coast's emphasis on "getting over" by whatever means (with drug-dealing, gun violence, or education understood as equally viable, consequences notwithstanding). Hip-Hop Fashion The baggy pants, sweatshirts, and jackets so prevalent in hip-hop communities have at least two origins. On the East Coast, these clothes were worn by the first break dancers because they facilitated complex gyrations, turns, and spins. On the West Coast, the same clothes referred to prison uniforms, especially the trend of wearing pants below the hip lines since prisoners were not permitted to wear belts. For both East and West Coasters, the
clothes signaled a defiant rejection of the attire of popular music in general and disco glitter in particular, although the showcasing of gold teeth, largely from Southern and Midwestern rappers, seemed to undermine the antimaterialism posture. Performers such as Public Enemy, Kid ‘n’ Play, and others also wore bright, neon-colored clothes as mocking commentaries on high fashion. Combat fatigues and military boots were popular among the more socially conscious members of the hip-hop community, suggesting that they were literally at war with the “system.” On the West Coast, white T-shirts and gym shoes were the norm, along with hooded sweatshirts (to hide one’s face from the police), as features of the growing drug-and-gang wars. All these clothing styles would themselves eventually become demographic markets for the fashion industry as new lines (Timberland, Fubu, Pelle Pelle) competed with old standbys (Converse, Adidas) for hip-hop patronage. Impact Hip-hop culture changed not only the music world of the 1980’s but also the worlds of film and fashion. It drove mainstream rock music and most pop music to the bottom ends of the radio dial; its only serious rival was (and continues to be) “new” country music. Run-D.M.C.’s collaboration with Aerosmith on a remake of the song “Walk This Way” revived the rock band’s lagging career and made stars out of the rappers. Rap music sampling— present even in “Rapper’s Delight”—led to a number of copyright violation lawsuits as performers and songwriters worked through the complicated concepts of authorship. Just as important, hip-hop and rap offered an avenue to money and material success for those without any noteworthy athletic or musical instrument skills, a development that only exacerbated the apparent irrelevance of formal education to both the working poor and the middle classes. Like the punk rock movement of the previous decade, hip-hop and rap elevated immediate success, personality, and flair over the delayed gratification ethos of education in general and musical literacy in particular. Like the punk movement, hip-hop culture shocked the middlebrow members of respectable society, unleashing an ongoing debate within mainstream popular music communities about whether rap should even be considered music. Within African American communities, the debate focused on the extent to which
hip-hop culture and rap music played—or did not play—into racial stereotypes. The influence of hip-hop on film was enormous, especially in the late 1980’s. Not only did more rap songs start showing up in sound tracks but also more films based primarily on the gangsta rap ethos of violence were made in Hollywood, though the first commercially successful hip-hop comedies, House Party (1990) and House Party II (1991), would appear shortly after the end of the decade. Subsequent Events
The emergence of laptop computers, Napster (the first "free" online site where fans could share and trade music, bypassing the traditional record companies and retail outlets), and Yo! MTV Raps (an offshoot of MTV, dedicated exclusively to hip-hop culture and rap videos) in the 1990's made hip-hop culture an international force. A number of developments indicated that this particular form of African American culture had gained unparalleled appeal to the youth of such countries as Belgium, France, Japan, Russia, Egypt, Spain, South Africa, and Brazil: the ubiquity of hip-hop lingo in advertising and everyday conversation; its creation of a whole new mode of fashion known as urban gear; the evolution of rap into "cowhop" (U.S. Southwest), "trip hop" (England), and "New Jack" (New York), among others; and the merging of American gang violence ethos with Chinese and Japanese martial arts in films, making stars of veteran Asian actors Jackie Chan, Jet Li, and Chow Yun-Fat. During the 1990's and into the twenty-first century, two more facets of hip-hop culture emerged: hip-hop literature and spoken word poetry (sometimes called "floetry"). While there has been a tradition of "street literature" in the urban centers of the United States since at least the early twentieth century, the fascination with the gritty realism of these novels turned into a multimillion-dollar industry. Citing legendary urban crime fiction writers such as Iceberg Slim and Donald Goines as their heroes, young hip-hop activists/writers churned out pulp fiction that, like its film counterparts, viewed the often-brutal world of drugs, gang wars, and the rap music industry through the lens of ordinary young men and women. These writers included Erica Kennedy, Renay Jackson, Vickie Stringer, Bertice Berry, and Sister Souljah, whose novel The Coldest Winter Ever (1999) was a runaway best seller.
Berry, and Sister Souljah, whose novel The Coldest Winter Ever (1999) was a runaway best seller. At the same time, rap music rhythms were adapted to poetic meters, resulting in the revitalization of poetry as oral performance. So successful was this kind of poetry that local and national poetry slam contests emerged, drawing on the talent developed in local coffeehouses and clubs. Because performance poetry circulated primarily on compact discs (CDs), it made best-selling stars of performers such as Saul Williams and Patricia Smith. Clearly, the widespread appeal of hip-hop culture and rap music across the world has constituted the most viable alternative to the “straight” life, however defined in any particular culture or nation, since the countercultural movements of the late 1950’s and the 1960’s. Further Reading
Hebdige, Dick. Cut 'n' Mix: Culture, Identity, and Caribbean Music. New York: Methuen, 1987. Hebdige examines the Caribbean sources of rap music in "toastin'" parties in the late 1950's and early 1960's, focusing in part on DJ Kool Herc, a Jamaican immigrant widely considered the father of rap music.
Rose, Tricia. Black Noise: Rap Music and Black Culture in Contemporary America. Hanover, N.H.: Wesleyan University Press, 1994. Rose's comprehensive analysis of hip-hop culture focuses on the social and political forces in the United States (such as Reaganomics) that led to the rapid development and expansion of hip-hop culture. Her critique of the sexism and machismo within rap music remains unparalleled.
Ross, Andrew, and Tricia Rose, eds. Microphone Fiends: Youth Music and Youth Culture. New York: Routledge, 1994. This collection of essays from various authors examines the cultural, gender, and aesthetic impact of hip-hop culture and rap music. The most outstanding essays take a broad international perspective on the culture and music.
Tyrone Williams
See also
African Americans; Break dancing; Cable television; Do the Right Thing; Fashions and clothing; Film in the United States; MTV; Music; Music videos; Public Enemy; Television.
■ Hobbies and recreation
Definition: Leisure-time activities and pastimes
Americans engaged in a wide variety of leisure-time pursuits during the 1980's, finding many different ways to spend their increased disposable income, even as the amount of leisure time available to them shrank rapidly. When the 1980's began, the Iran hostage crisis was two months old, and memories of the Vietnam War were fading. With the election of Ronald Reagan as president and his promise of "Morning in America," Americans were ready to find new means of escape from their work schedules. In 1980, total personal expenditures for recreation were $149 billion; by 1989, that figure had risen to $250 billion. Overall, there was a shift in the number of hours worked, from 40.6 hours per week in 1973 to 48.4 hours per week, a figure that did not include unpaid overtime or the second jobs that many Americans held. This increase in work hours generated more income, and Americans embarked on a new spending spree, many buying on credit. A large portion of the consumer market was made up of teenagers and college students.
Sports
Throughout the decade, sports and games continued as the most popular American recreational activities. The major organized team sports—baseball, basketball, football—represented opportunities both for participation and for spectatorship. In addition, sporting events that centered on gambling—such as horse racing, dog racing, and jai alai—achieved some popularity. Indeed, gambling became one of the most common forms of entertainment outside the home, and off-track betting centers sprang up in which gamblers could bet on events taking place around the country, following them all via live television broadcasts. The most popular form of gambling took place at gaming casinos, which expanded well beyond Las Vegas and included riverboats. These modern duplicates of nineteenth century showboats were revived along the nation's waterways to exploit the ability of water vessels to ignore many antigambling statutes. In 1985, annual spectatorship at all professional sporting events reached 223.2 million. Baseball recorded an all-time high of 47.7 million spectators, a figure that further increased to 56.8 million by 1992. Professional basketball attendance also rose from
11.5 million spectators in 1985 to 18.6 million in 1992. Annual college basketball attendance remained steady at 28.7 million for men's teams, but attendance at women's college basketball games experienced an impressive increase from 2 million in 1985 to 3.4 million in 1992. Fans attending professional hockey games in 1985 numbered 13.9 million. During the late 1980's, baseball-card collecting became a lucrative business, as did trade in autographed memorabilia. Collectors sought mint-condition baseball cards, while both current and retired athletes, for a fee, appeared at card collector conventions, where fans waited in long lines for autographs. Prices varied depending on the status of the player and the rarity of the card and autograph.
Television
In 1983, a media poll reported that over 74 percent of Americans watched an average of more than two and one-half hours of television a day. Other leisure activities within the home were carryovers from earlier decades, including reading a book or newspaper, listening to music, talking on the telephone, exercising, spending an evening talking with family and friends, and working on hobbies. In 1980, Dallas was the top-rated television show, and its "Who Shot J. R.?" episode was viewed by 106 million people, making it the most-watched individual television episode to date. More popular was the two-hour series finale of M*A*S*H (February 28, 1983), which was seen by more than 120 million people. The Day After (1983), which dealt with the aftermath of a nuclear attack on the United States, was viewed by more than 100 million people. Other popular shows of the decade included The Golden Girls, 60 Minutes, Dynasty, The Cosby Show, and Roseanne. In addition to these network juggernauts, the advent of cable television introduced a plethora of choices for television viewers, creating dozens of modest successes on lesser-seen channels, including channels devoted to specific sports and hobbies.
Video- and Other Games
The first coin-operated video arcade game, Computer Space, appeared in 1971. As the number of video games increased, video arcades opened in shopping malls and other public places, but their success began to fade after 1983, a decline attributed to growing concerns that they contributed to juvenile delinquency. Some parents and mall owners started a cleanup campaign to rid the malls of the video arcades. Others simply dismissed the games as merely a fad that would eventually disappear.
The main factor in the arcades' decline was the development of the Nintendo Entertainment System (NES), which quickly became the top-selling toy or game, surpassing such classic favorites as Barbie, GI Joe, and Monopoly. By the end of the decade, Nintendo controlled over 85 percent of the home video-game market, with Nintendo systems in more than 30 million American homes. In 1986, a board game was introduced that quickly became a home favorite: Pictionary, a variation on charades in which teams had to guess a word based on drawings rather than pantomime. By 1988, Pictionary was one of the best-selling games in the nation. Other popular toys and games of the 1980's included Cabbage Patch Kids, unique soft-body dolls with homely plastic faces that came with adoption papers; Teenage Mutant Ninja Turtles, a major merchandising franchise spun off from a low-budget independent comic book; Hacky Sack; Wacky Wall Walkers; Dungeons and Dragons; Transformers, toy robots that could be turned into toy vehicles; Trivial Pursuit, a card-based, question-and-answer game invented in Montreal but introduced at the International Toy Fair, New York, in 1982; and Rubik's Cube, a multicolored, handheld puzzle that became one of the decade's defining fads.
Cinema
In the early 1980's, break dancing emerged in New York City and quickly spread to the streets of Los Angeles. Break dancing was improvised dancing to hip-hop music that incorporated acrobatic spins. It was a series of Hollywood films, however, that most influenced contemporary dance styles. In 1980, Urban Cowboy kicked off a renewed interest in country two-step dancing and cowboy boots, while Flashdance (1983) triggered a different fashion trend that included leg warmers, tank tops, and workout clothes. Dirty Dancing (1987), set in a Catskills resort during the early 1960's, became a cultural phenomenon. It renewed interest in older dances, such as the cha-cha and mambo; sparked new interest in the lambada, a sensual Latin dance involving close pelvic contact between partners; and spurred a Broadway show, a nationwide tour, a short-lived television series, and a sound-track album that sold more than ten million copies. Nine of the top ten movies of the decade in terms of their box-office rental fees were either science-fiction or fantasy movies, including the film with the highest rentals, E.T.: The Extra-Terrestrial (1982), at $228 million; followed by Return of the Jedi (1983) at $168 million; Batman (1989) at $150 million; The
Empire Strikes Back (1980) at $141 million; Ghostbusters (1984) at $130 million; Raiders of the Lost Ark (1981) at $115 million; Indiana Jones and the Last Crusade (1989) at $115 million; Indiana Jones and the Temple of Doom (1984) at $109 million; and Back to the Future (1985) at $104 million. (These films' domestic grosses were all substantially higher than their rentals alone, and considering their total revenues would produce a different list.)
Music and Theater
In 1984, A Chorus Line (pr. 1975) broke the Broadway endurance record set by Grease (pr. 1972), and when it finally closed in 1990, after 6,137 performances, it was the longest-running show in Broadway history. Three other shows opened during the decade that would go on to surpass the record set by A Chorus Line: Cats (pr. 1982), Les Misérables (pr. 1987), and The Phantom of the Opera (pr. 1988). Two musical artists who achieved phenomenal success in the music video era of the 1980's were pop singers Madonna and Michael Jackson. In 1982, Jackson released his album Thriller, which he supported with elaborate music video productions. Sales of the album were boosted by the resulting MTV exposure, and Thriller sold more than 25 million copies, becoming the top-selling individual album of all time. Another musical artist who benefited from music video exposure was Bruce Springsteen, who sold more than 13 million copies of his Born in the U.S.A. (1984). His subsequent nationwide concert tour attracted more than five million fans. In 1987, the heavy metal-hard rock hybrid band Guns N' Roses released Appetite for Destruction, which sold more than 12 million copies. Musical artists also worked to raise awareness of the hungry and destitute and for many other charitable causes. On April 5, 1985, radio stations worldwide simultaneously broadcast "We Are the World," a song recorded by musical artists from all genres to promote the USA for Africa program, which helped provide food and supplies for starving people in Africa. The song sold more than seven million copies, and it, along with its companion video, was in continuous rotation on MTV and on radio stations around the globe.
Healthy Participation
A 1987 study published in the Journal of Physical Education, Recreation, and Dance revealed that in the early 1980's, participation rates at health clubs and in activities such as bicycling, fishing, hiking, skiing, and swimming were higher among those twenty-five to fifty-five years of age than
they were for teenagers and those in the eighteen-to-twenty-four-year-old range. Running and jogging claimed more than 40 million active participants nationwide, and many of these people were marathon runners. During the 1980's, marathons in places such as Boston and New York City regularly attracted more than twenty thousand runners. Even more spectators lined up along the twenty-six-mile routes to watch these events. Women-only running events, such as the L'eggs Mini Marathon in New York's Central Park, regularly attracted more than six thousand participants. By 1987, Time magazine reported that there were more than twenty thousand health clubs nationwide, with over $5 billion spent on memberships. Meanwhile, Americans spent over $700 million on exercise equipment for their homes, and more than 33 million people trained with weights, did aerobics, and walked for fitness. In conjunction with this trend, many fitness books and videos were best sellers. Jane Fonda's Workout Video (1983) was the top-selling video in the mid-1980's; she followed this success with five other best-selling workout videos, spawning a series of exercise books and exercise videos by celebrities including Raquel Welch, Richard Simmons, and Angela Lansbury.
Impact Leisure habits in the 1980's were increasingly dominated by a developing electronic technology that would later give the decade much of its nostalgic appeal and that transformed the manner in which Americans spent their leisure time. This domination began a trend that would continue, as Americans came to devote more time to sedentary electronic pursuits and less time to physical activities.
Further Reading
Cross, Gary S., ed. Encyclopedia of Recreation and Leisure in America. 2 vols. Farmington Hills, Mich.: Charles Scribner's Sons, 2004. Includes entries on many activities popular in the 1980's.
Gartner, William C., and David W. Lime, eds. Trends in Outdoor Recreation, Leisure, and Tourism. New York: CABI, 2000. A study of outdoor pursuits, both near home and in the context of travel.
Gelber, Steven M. Hobbies: Leisure and the Culture of Work in America. New York: Columbia University Press, 1999. Cultural-studies approach to hobbies and recreation that looks at the development of a work-centered culture and its detrimental effects on leisure-time activities.
Giordano, Ralph G. Fun and Games in Twentieth-Century America: A Historical Guide to Leisure. Westport, Conn.: Greenwood Press, 2003. Overview of the twentieth century that contextualizes trends of the 1980's.
Goodale, Thomas L., and Peter A. Witt, eds. Recreation and Leisure: Issues in an Era of Change. 3d ed. State College, Pa.: Venture, 1991. Another study emphasizing changing trends in leisure and the effects of the economy on hobbies and recreational pursuits.
Hoffmann, Frank W., and William G. Bailey. Sports and Recreation Fads. New York: Harrington Park, 1991. A look at fads and other flash-in-the-pan trends in American leisure.
Munson, Robert S. Favorite Hobbies and Pastimes: A Sourcebook of Leisure Pursuits. Chicago: American Library Association, 1994. Handbook of hobbies that explains their nature and appeal to prospective hobbyists.
Nasaw, David. Going Out: The Rise and Fall of Public Amusements. New York: Basic Books, 1993. Study of leisure activities in the public sphere and their downfall in the new economy.
Williams, Stephen. Tourism and Recreation. New York: Prentice Hall, 2003. Analyzes the relationship between travel and leisure.
Martin J. Manning
See also Action films; Advertising; Baseball; Basketball; Cable television; Children's television; Film in the United States; Football; Golf; Hip-hop and rap; Hockey; MTV; Music; Music videos; Pop music; Science-fiction films; Sports; Television; Toys and games; Video games and arcades.
■ Hockey
Definition: Team sport
Hockey remained the national game of Canada during the 1980’s. The sport also garnered substantial worldwide attention in 1980, when the United States surprised observers by winning the gold medal in the Winter Olympics. This victory, combined with the acquisition by the Los Angeles Kings of superstar player Wayne Gretzky in 1988, resulted in a dramatic increase in U.S. hockey fandom. Hockey experienced increased popularity in the 1980’s. The rise in interest occurred for two reasons.
At the amateur level, the surprisingly strong performance of the United States in the 1980 Winter Olympics led many Americans who would not otherwise have done so to watch hockey matches. Professionally, the sport became known for dominant teams built around great individual players, as two National Hockey League (NHL) teams established dynasties during the decade. The New York Islanders formed the first such dynasty in the early 1980's, while the Edmonton Oilers superseded them as the reigning team during the remainder of the decade.
Miracle on Ice
Internationally, the Soviet Union had dominated hockey for decades, especially at the Winter Olympics. Entering the 1980 Winter Olympics in Lake Placid, New York, the Soviets were predicted to win another gold medal. Just prior to the Olympics, the Soviet Union defeated the NHL All-Stars and followed that victory by crushing the U.S. national team 10 to 3 in an exhibition game. The U.S. team was composed of amateurs from various colleges and coached by Herb Brooks. The average age of the players was twenty-two. The team was seeded seventh out of twelve teams competing in the Olympics. Thus, they were not expected to do well. The United States began the Olympics by playing successive games against the two teams favored to advance to the medal round from their division. In its first game, the United States tied Sweden 2 to 2, scoring a goal in the last minute of the game. The U.S. team followed that performance by easily defeating Czechoslovakia 7 to 3. With victories over Norway,
Edmonton Oiler Wayne Gretzky (left) passes Pittsburgh Penguin Gregg Sheppard during the first period of a 1981 game. The Edmonton Oilers dominated the National Hockey League during the late 1980's. (AP/Wide World Photos)
Romania, and West Germany, the United States advanced to the medal round undefeated, with a record of 4 wins, 0 losses, and 1 tie. The Americans’ first game in the medal round was against the Soviet Union. The Soviets had won their division with a perfect 5-0 record. Despite falling behind early, the United States quickly tied the game. Indeed, each time the Soviet Union scored, the United States responded with a goal, always keeping the game close. With ten minutes remaining, the United States took the lead, 4 to 3, with a goal by team captain Mike Eruzione. The score held, and the United States achieved perhaps the biggest upset victory in the history of the Olympics, leading to the game being called the “Miracle on Ice.” One game remained to be played. As it had in every previous game but one in the Olympics, the United States surrendered the first goal to its opponent, Finland. The Americans trailed 2 to 1 entering the final period. During the final twenty minutes, however, the United States dominated. The result was a 4-2 victory and a U.S. gold medal. Many Americans viewed the U.S. hockey team as heroes. Their shockingly impressive performance was considered a welcome escape from reality, as the country was experiencing economic problems and tension abroad. Thus, the U.S. hockey team’s achievements provided some Americans with a source of national pride. Decade of Dynasties
Professional hockey also drew notable attention during the 1980's. The top professional league in North America entering the decade was the NHL, which became even more significant at the beginning of the 1980's, when it expanded by annexing four teams from the World Hockey Association: the Edmonton Oilers, the Hartford Whalers, the Quebec Nordiques, and the Winnipeg Jets. The NHL began the decade with the end of one dynasty and the beginning of another. After winning four consecutive Stanley Cups to end the 1970's, the Montreal Canadiens failed to win a fifth straight championship in 1980. Instead, the New York Islanders won the championship. Featuring such stars as Mike Bossy, Denis Potvin, Billy Smith, and Bryan Trottier, the team could play a variety of styles of hockey quite well. It had a great combination of speed, quickness, power, and defensive skill, and the Islanders' talents led them to four consecutive
Stanley Cup victories from 1980 to 1983. The Islanders' dynasty ended in 1984, and a new one began. The Edmonton Oilers won their first Stanley Cup that season. They were led by Wayne Gretzky, whom many experts consider the greatest hockey player of all time. By the end of his career, Gretzky had won several Hart Trophies (awarded to the league's most valuable player), and he led the league in points in many seasons. The Oilers won the Stanley Cup three more times during the decade, in 1985, 1987, and 1988. When Gretzky was traded to the Los Angeles Kings following the 1988 season, he brought a new level of excitement to U.S. West Coast hockey fandom.
Impact Hockey affected people in multiple ways during the decade. The U.S. gold medal at the 1980 Olympics provided a patriotic moment for a country whose national mood was generally somber. It also led to a sharp increase in the popularity of the sport in the United States, as colleges experienced more interest in their hockey programs and the quality of the game at the university level improved. Professionally, the sport attracted more spectators, who appeared interested in seeing great teams featuring many individual stars.
Further Reading
Boyd, Bill. All Roads Lead to Hockey: Reports from Northern Canada to the Mexican Border. Lincoln: University of Nebraska Press, 2006. Includes case studies of the popularity of hockey in selected towns in the United States and Canada.
Coffey, Wayne. The Boys of Winter: The Untold Story of a Coach, a Dream, and the 1980 U.S. Olympic Hockey Team. New York: Three Rivers Press, 2005. Focuses on the United States' upset of the Soviet Union and looks at the lives of the U.S. players and coach Herb Brooks after the 1980 Winter Olympics.
Fischler, Stan, and Shirley Walton Fischler. The Hockey Encyclopedia. New York: Macmillan, 1983. Thorough coverage of the history of hockey, including the evolution of the game's rules, the history of the NHL, and a list of champions and individual award-winners in the sport.
Morrow, Don, et al. A Concise History of Sport in Canada. Toronto: Oxford University Press, 1989. Covers the evolution of various sports in Canada, including professional leagues. Describes great teams and individual stars.
Wallechinsky, David. The Complete Book of the Winter Olympics. 1998 ed. Woodstock, N.Y.: Overlook Press, 1998. A brief overview and order of finish for all sports in the Winter Olympics in the modern era.
Kevin L. Brennan
See also
Gretzky, Wayne; Miracle on Ice; Olympic Games of 1980; Sports.
■ Hoffman, Dustin
Identification: American actor
Born: August 8, 1937; Los Angeles, California
Already a major film star and an Academy Award winner, Dustin Hoffman starred in two of the most popular films of the 1980's, as well as one of its most notorious flops, and won another Oscar. After years of struggle to establish himself, Dustin Hoffman finally got his big break with The Graduate (1967), solidified his stardom with Midnight Cowboy (1969), and flourished during the 1970's, ending the decade with an Academy Award-winning performance in Kramer vs. Kramer (1979). Hoffman worked less frequently in the 1980's, preferring to spend more time with his growing family. His first film of the decade, however, proved to be one of his biggest critical and commercial successes. In Tootsie (1982), Hoffman played frustrated New York actor Michael Dorsey. As the film opens, Michael has such a reputation for being difficult that he has become unemployable. (The character was said to resemble the young Hoffman.) Determined to prove his worth to his longtime, exasperated agent (Sydney Pollack, who also directed the film), Michael disguises himself as a woman—"Dorothy Michaels"—and wins a role on a television soap opera. Soon, Dorothy's no-nonsense portrayal of a tough hospital administrator makes "her" a star. Michael's subsequent identity crisis is complicated by his affection for one of his co-stars (Jessica Lange) and the attraction of her father (Charles Durning) to Dorothy. Hoffman balanced his portrayal of Dorothy with an expert blend of ironic humor and poignancy, never going for cheap laughs or sentimentality. As the film
progresses, Michael, previously a chauvinist, learns to appreciate women as his equals, as Tootsie considers gender issues without didacticism or condescension. Nominated for an Oscar, Hoffman won a British Academy Award and a Golden Globe. Hoffman received a second Oscar for his other major success of the decade, Rain Man (1988). In that film, self-centered Charlie Babbitt (Tom Cruise), representing the era's mania for greed, discovers that his father's $3 million fortune is being held in trust for Raymond Babbitt (Hoffman), the older brother he never knew he had. Charlie takes the autistic Raymond from a Cincinnati nursing home and, as they drive to California, learns to care about someone other than himself. Much as Peter Sellers had in Being There (1979), Hoffman found subtle nuances within the character's limited emotions. Rain Man also won Academy Awards for Best Picture, Best Director (Barry Levinson), and Best Original Screenplay (Ronald Bass and Barry Morrow). Hoffman's other two 1980's films were not met with enthusiasm. Ishtar (1987), with Hoffman and Warren Beatty as bumbling songwriters, was the decade's biggest financial disaster other than Heaven's Gate (1980), and in Family Business (1989), Hoffman, Sean Connery, and Matthew Broderick failed to convince anyone that they were a family of criminals. In 1984, Hoffman returned to Broadway as an actor for the first time since 1969, starring in Arthur Miller's Death of a Salesman (1949). The production was filmed for television in 1985.
Impact With his two hit films, Hoffman continued to show an amazing range, delighting audiences and inspiring innumerable other actors.
Further Reading
Bergan, Ronald. Dustin Hoffman. London: Virgin, 1993.
Dworkin, Susan. Making "Tootsie": A Film Study with Dustin Hoffman and Sydney Pollack. New York: Newmarket Press, 1983.
Freedland, Michael. Dustin: A Biography of Dustin Hoffman. London: Virgin, 1989.
Michael Adams
See also Academy Awards; Cruise, Tom; Film in the United States; Murray, Bill.
■ Holmes, Larry
Identification: World heavyweight boxing champion from 1978 to 1985
Born: November 3, 1949; Cuthbert, Georgia
Holmes dominated the heavyweight division in professional boxing during the first half of the 1980's, reigning as world champion until his defeat by Michael Spinks.
Larry Holmes began his boxing career in 1973 and achieved a string of twenty-seven consecutive victories before winning the World Boxing Council heavyweight title from Ken Norton on June 19, 1978. He successfully defended his title four times during the remainder of the 1970's; during the 1980's, he made a remarkable sixteen additional defenses of the title before finally losing to Michael Spinks on September 21, 1985. Holmes, who was nicknamed the Easton Assassin after his home base of Easton, Pennsylvania, fought most of the top fighters in what was at that time a rather mediocre heavyweight division. His opponents included an aging Muhammad Ali, whom he defeated by technical knockout (TKO) in eleven rounds in 1980, as well as Trevor Berbick, Leon Spinks, Renaldo Snipes, Gerry Cooney, Tim Witherspoon, James Smith, and Carl Williams. Several of these fighters either had held or would later hold portions of the then-fragmented heavyweight title. The bout with Cooney, which took place on June 11, 1982, and which Holmes won by a TKO in the thirteenth round, may well stand as his greatest performance of the decade. The bout also carried strong racial symbolism, because Cooney was billed as the "Great White Hope"—the hope expressed by many white Americans at various points in boxing history of overcoming African American dominance in the heavyweight division. Although Holmes technically held the World Boxing Council title from 1978 to 1983 and then gave it up for the International Boxing Federation title, which he held until 1985, he was generally recognized as the world heavyweight champion during this entire period. At the time of his defeat by Michael Spinks, Holmes was just one fight short of the undefeated career record held by earlier champion Rocky Marciano, and his number of title defenses was second only to that of Joe Louis. After losing a close decision to Michael Spinks in a rematch in 1986, Holmes retired from boxing. He
came out of retirement in 1988, at the age of thirty-eight, to fight Mike Tyson, losing by a fourth-round TKO. Tyson rose to dominate the division during the latter years of the decade.
Impact Holmes ended his career in 2002 with an overall record of sixty-nine wins—forty-four of them by knockout—and only six losses, just one of which, the defeat by Tyson, came by knockout. He ranks among the top heavyweight champions of boxing, and he was clearly the preeminent boxer in that weight class during the first half of the 1980's.
Further Reading
Brunt, Stephen. "Larry Holmes." In Facing Ali. Guilford, Conn.: Lyons Press, 2002.
Holmes, Larry, with Phil Berger. Larry Holmes: Against the Odds. New York: St. Martin's Press, 1998.
Scott Wright
See also
African Americans; Boxing; Sports; Tyson, Mike.
■ Home shopping channels
Definition: Television channels dedicated to selling products or services
Television home shopping gained national prominence in the 1980's, eventually growing into a multibillion dollar business. Dedicated twenty-four-hour shopping channels, mostly on cable or satellite systems, developed new markets and marketing techniques, changing the way some products were bought and sold, especially apparel, jewelry, and cosmetics. Home shopping channels represent a form of retailing known as direct response marketing. In this retail format, potential customers watch television programs that demonstrate or market commodities, then respond to those programs by telephone, placing orders directly with the marketing company. The format includes dedicated channels, such as Home Shopping Network (HSN) and QVC; infomercials, program-length commercials featured on general television channels, usually late at night; and shorter direct response commercials that can last for as little as thirty seconds but encourage immediate action with such phrases as "operators are standing by to take your order."
With the advent of twenty-four-hour home shopping channels in the 1980's, the sleep-deprived viewer with a credit card became an important new market for direct sales products. (PhotoDisc)
In 1981, HSN began marketing products on a local-access cable television channel. By 1985, it was broadcasting nationally. The channel, a subsidiary of IAC/InterActiveCorp headquartered in St. Petersburg, Florida, had its origins in direct response advertising aired on a small AM radio station in 1977. The concept developed by accident, when one of the station's advertisers could not pay a bill. The owner of the station took 112 can openers as payment and then auctioned them off to listeners. The auction, billed as Suncoast Bargaineers, led to a regularly scheduled radio show, from there to local-access cable, and from there to a home shopping empire. HSN evolved into a global multichannel retailer, with product offerings of thousands of unique items in fashion, beauty, home, jewelry, and electronics. It also acquired competitors, particularly QVC (whose initials stand for quality, value, and convenience), based in West Chester, Pennsylvania, and the Shop at Home channel of Knoxville, Tennessee. During the 1980's, these channels primarily marketed moderately priced, mass-produced merchandise. Often, they marketed commodities that were not available in conventional retail stores, either because their manufacturers had chosen not to incur the expenses necessary to secure shelf space or because they could not produce enough items to stock national chains. Later, the channels would expand their offerings to include merchandise provided by upscale retailers such as Saks Fifth Avenue, Bloomingdale's, and Nordstrom and designers such as Donna Karan, Calvin Klein, Todd Oldham, and Marc Jacobs.
Impact The availability of cheap, targeted advertising time on cable television transformed the practice of advertising and marketing in the United States. One of its consequences was the rise of home shopping channels, which provided small manufacturers or specialty firms with the wherewithal to market their products to a national audience. It also introduced American consumers to a form of consumption that provided instant gratification in the privacy of one’s own home. In later years, the Internet would take advantage of this same model of privacy and convenience. Further Reading
Berman, B., and J. R. Evans. Retail Management: A Strategic Approach. 9th ed. Upper Saddle River, N.J.: Prentice Hall, 2004. Dickerson, K. G. Inside the Fashion Business. 7th ed. Upper Saddle River, N.J.: Prentice Hall, 2003. Leigh Southward See also Advertising; Business and the economy in Canada; Business and the economy in the United States; Cable television; Television.
■ Home video rentals Definition
Short-term rentals of prerecorded videocassettes to be viewed at home
Small stores began renting videocassettes, and sometimes the equipment on which to play them, in the 1980’s. The rentals made it affordable for Americans to watch—and control the playback of—full, uncut motion pictures on their own televisions at home, significantly changing U.S. movie-viewing habits and forcing Hollywood to modify its filmmaking and marketing practices. The development of the videocassette recorder (VCR) dates back to the 1950’s, when an inventor
The Eighties in America
named Charles Ginsburg began leading a research team at the Ampex Corporation. Two years before his death, Ginsburg would be inducted into the National Inventors Hall of Fame for “one of the most significant technological advances to affect broadcasting and program production since the beginning of television itself.” The VCR became available to the public in the 1970’s and 1980’s. Movie studios at first resisted the VCR, as they had resisted television some three decades earlier. Congress debated a number of bills attempting to regulate public use of VCRs, but it failed to pass any of them. An appeals court initially ruled against Sony’s use of VCRs to record movies, but in 1984 the U.S. Supreme Court reversed that decision on the grounds that home recording fell under “fair use” copyright provisions. Developing the Marketplace At first, two different videocassette standards competed for consumers’ dollars, JVC’s VHS and Sony’s Betamax, or Beta, format. Betamax dominated the market at first, but VHS eventually took over the market. In addition, video discs sought a share of the prerecorded film marketplace. Most of the early VCRs were manufactured in Japan; the video disc was created in the United States. Initially VCRs were expensive, costing in the range of $1,000, and so were prerecorded movies. They were seen in the early 1980’s as luxury items. It was more economic for home viewers to rent both the player and the videocassette from small neighborhood stores that sprang up in communities throughout the United States and Canada. The rented devices usually only played prerecorded tapes; they generally did not offer renters the option of recording other programming from television. Three years before the start of the 1980’s, a Detroit, Michigan, businessman named Andre Blay obtained permission from Twentieth Century-Fox to sell on videocassette fifty films from the studio’s library through the Video Club of America. Few people owned VCRs at that time, but some nine thousand consumers joined the club. The Video Club of America may have constituted the initial impetus for the development of the video rental market. It was the first corporation to release movies on VHS, and it soon expanded and acquired imitators. Seeing its success, Twentieth Century-Fox bought Blay’s video company and, in 1982, reorganized it as Twentieth Century-Fox Video. Later that
Home video rentals
■
477
year, Fox merged its video operations with CBS Video Enterprises. It was possible to purchase prerecorded videocassettes rather than renting them, but they, too, were prohibitively expensive—often costing one hundred dollars or more. Given those costs, it made more sense for most people to pay around five dollars to rent a video and a player for a limited period—ranging from overnight to a few days—than it did for them to buy their own tapes and players. Video rental stores would sell their renters a membership, which had to be purchased before any movies could be taken home. Some rental stores were independently owned and operated; others, like Video Stop and Adventureland Video, were franchised. The stock of available movie titles often varied from one store to another. In the 1980’s, hardly any of the rental video players came with a remote control. A viewer had to move to the VCR to turn the player on or off or to stop or pause the tape. The earliest so-called remote control was physically connected by a plug-in wire to the front of the VCR. Even then, such devices were generally available only to purchasers, rather than renters, of VCRs. The movie and television industries encouraged consumers to rent rather than buy videos during much of the 1980’s. Another factor affecting the rise of video rentals was that, by the mid-1980’s, television viewers paying for cable or other viewing packages were becoming dissatisfied with the viewing options provided by commercial television. This dissatisfaction coincided with a gradual decrease in the price of VCRs to the point where many families could afford to buy one. Just as gradually, the mom-and-pop video stores found it less necessary to rent VCRs to their customers and began simply to rent the videocassettes themselves. By 1983, most of the movie studios had dropped their attempts to license their movies on video and began instead to sell videotapes to both rental stores and home users with their own VCRs. By the mid-1980’s, rental outlets had become so widespread that increasing numbers of customers felt the lower prices justified their own purchase of a home VCR. The first movie significantly to exploit the new demand for videocassettes was Lady and the Tramp, a popular 1955 Walt Disney animated film focusing on the adventures of a pampered pooch (Lady) and a roving mongrel (Tramp). Walt Disney Productions
478
■
released the movie on video in 1987 at a price of $29.95 and, by 1988, had sold 3.2 million copies—a record that would last until the video of E.T.: The Extra-Terrestrial sold some 14 million copies by 1993. Impact By the late 1980’s, the home video rental business had started to migrate from small momand-pop stores to huge video rental chains such as Blockbuster. Video rentals began to account for a significant portion of a motion picture’s total revenue, and Hollywood began to plan for videocassette revenues in setting the budgets of films. Moreover, the studios began to modify the format of their films in anticipation of the home rental market, because theatrical widescreen films would not fit unmodified on television screens. Further Reading
Levy, Mark R., ed. The VCR Age: Home Video and Mass Communication. Newbury Park, Calif.: Sage, 1989. Focuses on patterns of home video use. Levy, Mark R., and Barrie Gunter. Home Video and the Changing Nature of the Television Audience. London: Libbey, 1988. Explains how recording devices altered film and television viewing habits. Lyman, Peter. Canada’s Video Revolution: Pay-TV, Home Video, and Beyond. Toronto: J. Lorimer/ Canadian Institute for Economic Policy, 1983. Catalogs changes in communications technologies at the start of the 1980’s and their effects on Canadian society. The Video Age: Television Technology and Applications in the 1980’s. White Plains, N.Y.: Knowledge Industry, 1982. Overview of devices such as the VCR. Wasko, Janet. Hollywood in the Information Age: Beyond the Silver Screen. Austin: University of Texas Press, 1995. Analyzes the effects of new technologies on film marketing and distribution. Paul Dellinger See also
The Eighties in America
Homelessness
Cable television; Camcorders; Colorization of black-and-white films; Compact discs (CDs); Film in Canada; Film in the United States; Miniseries; Multiplex theaters; Music videos; PG-13 rating; Pornography.
■ Homelessness
Definition: The state of being without permanent shelter, an emerging urban social issue of the 1980's
During the 1980's, homelessness in urban areas of North America increased at a rapid rate. No longer identified with single male alcoholics, the ranks of the homeless began to include entire families and the working poor. While homelessness was not new to the United States during the 1980's, it did start taking on new characteristics as the reported number of homeless individuals began to increase during the decade. Prior to the 1980's, the homeless were frequently characterized as vagrants and were often men with alcohol or substance-abuse problems. As the number of individuals found living on the streets and in emergency shelters began to increase, this characterization started to change. In 1983, the New York Times Index began using the term "homeless persons" instead of "vagrancy and vagrants" to classify articles dealing with the homeless. This marked a significant change in attitude, as homelessness began to be seen as a problem that afflicted not only the "undeserving" but also the "deserving" poor. Moreover, the role of post-traumatic stress disorder in homelessness came to be discussed more prominently, as Americans took notice of the number of military veterans living on the streets.
Defining Homelessness
During the 1980's, the term "homeless" signified two very different but overlapping groups. In one context, the word "homeless" was used to refer to a social class that included panhandlers, bag people, the shabbily dressed, and other visibly poor individuals seen in public places and apparently lacking social ties. This common usage of the term, however, clouded the issue of who was truly without housing. Lacking a universally accepted definition of homelessness, America found itself mired in controversy regarding the number of homeless individuals and how to deal with homelessness. Homeless advocate Mitch Snyder, in the early 1980's, claimed that there were 3 million homeless people in the United States. Although Snyder later admitted that this figure was made up to signify the importance of the issue—big problems require big numbers—this figure was often quoted in the news and by those seeking to help the homeless. A study
conducted by the U.S. Department of Housing and Urban Development (HUD), released in 1984, estimated that there were between 250,000 and 350,000 homeless in the United States on any given night. Local Los Angeles County activists, however, estimated that there were between 50,000 and 75,000 homeless people in Los Angeles alone. The wide range is indicative of the difficulty of counting a population defined by its lack of fixed address. HUD defined the homeless as those residing in emergency shelters and public and private spaces not designed for shelter. Other estimates, based on various definitions of homelessness, often fell between these figures. One area of agreement was that the number of people seeking emergency shelter during the 1980's had begun to grow. In New York City, the nightly average number of individuals in city-run shelters during January increased from about twenty-seven hundred in 1981 to just over ten thousand in 1987. Homeless families in city shelters increased these figures by nearly one thousand in 1981 and nearly forty-seven hundred in 1987. Similar trends were seen in other urban areas, both in the United States and Canada.
A homeless man sits by his belongings in a public plaza in New York City. (Colin Gregory Palmer/cc-by-a-2.0)
Reasons for the Increase
As homelessness increased during the 1980's, numerous studies to uncover its causes were undertaken in the United States in an effort to find solutions. Canada had homeless trends similar to those in the United States during the 1980's, but the research effort there lagged by nearly a decade. The research that was conducted had two main objectives: One was to identify the characteristics of homeless individuals that caused them to be homeless, and the other was to identify societal or structural changes that might be causing homelessness to increase. The characteristics of homeless individuals varied in different urban areas, but they fit general patterns. Widely reported estimates found about one-third of homeless people to be substance abusers, about one-third to be suffering from severe mental illness, and about one-third of homeless males to be veterans. Since these groups are not mutually exclusive, individuals could be included in more than one classification. The average age of homeless adults was estimated to be about thirty-five; however, given the high number of homeless children, the overall average age was significantly lower. Single males accounted for approximately 50 percent of the homeless, single females about 10 percent, and families about one-third. A small percentage comprised children living on their own. About 50 percent of homeless adults had a high school diploma, and approximately 25 percent had some type of employment. During the 1980's, families represented the fastest-growing segment of the homeless population. In comparison with the pre-1980's homeless, the homeless of the 1980's were younger and more racially diverse, including increasing numbers of African Americans and Hispanics. Since roughly one-third of homeless individuals were diagnosed with severe mental illness, it was argued that one structural factor contributing to the increase in homelessness was deinstitutionalization—the release of patients from mental institutions—and changes in the legal requirements to have someone institutionalized involuntarily. In the mid-1950's,
patients in U.S. mental institutions numbered more than 550,000. In 1980, the figure was slightly less than 120,000. Since much of this reduction took place prior to 1980, however, this factor alone could not account for the dramatic increase in homelessness during that decade. The large increase in crack cocaine use in the mid-1980's was seen as another possible cause of the increase in homelessness. A study conducted by the Cuomo Commission in New York found, through urinalysis of homeless people in New York City shelters for single individuals, that 65 percent tested positive for some form of substance abuse. Some 83 percent of those testing positive were cocaine users. Substance abuse, it was argued, makes it difficult to hold on to a steady job, breaks down social ties that could prevent homelessness, and consumes a large portion of available income that could be used for housing. Although cocaine use among homeless individuals did increase during the 1980's, the percentage of substance abusers among the homeless did not, which called into question the assumption that a change in the drug of choice had led to an increase in the number of homeless people.
Social Structures and Homelessness
It was argued that studies that identified the characteristics and composition of the homeless were useful in finding out who was at the bottom of the poverty ladder and ended up homeless but were not as useful in understanding why homelessness increased during the 1980's. A more fruitful approach was to identify structural changes that took place just prior to and during the increase in homelessness. Since homelessness is, at its core, a lack of housing, this research looked at changes that affected the availability and affordability of housing. One early and persistent structural explanation for the increase in homelessness was found in policies implemented by the Ronald Reagan administration. In early 1984, President Reagan was interviewed on Good Morning America saying of the homeless that they were "homeless, you might say, by choice." Critics of the administration argued that the cutbacks in public housing and other programs designed to help those with low incomes were the problem, rather than individuals choosing to be homeless. Other structural changes often cited for the increase in homelessness included the reduction in single-room-occupancy hotels (SROs), gentrification, and the abandonment of low-income housing. These changes reduced the amount of low-income housing available. It was also posited that the increase in the quantity and quality of homeless shelters during the 1980's acted as a magnet to attract those in less desirable housing situations, thereby increasing the count of homeless individuals. In addition to the reduction in the availability and affordability of low-income housing, there was a reduction in the number of low-skill and casual jobs. Structural changes in the economy reduced the number of jobs available to day laborers. Moreover, the reduction in real income, coupled with the increase in the cost of available housing, forced more and more individuals and families out of the housing market and onto the streets. Parents in low-income jobs were forced to choose between feeding and clothing their children or providing a roof over their heads, and they chose food over shelter, contributing to the increase in homeless families. Research revealed that homelessness was a complex problem that lacked a simple solution.
Response to Homelessness
The primary response to homelessness occurred at the local level and varied considerably from city to city. In some places, such as Miami and Houston, the response was left almost entirely to charities and other private organizations. In other areas, such as Chicago and San Francisco, city governments worked with and helped fund nonprofit organizations to provide services to the homeless. Still other cities, such as New York and Philadelphia, ran their own shelters in addition to those run by nonprofit agencies. As homeless populations grew during the decade, this patchwork response faced a number of challenges. In New York City and elsewhere, homeless advocates utilized the courts to guarantee a minimum level of response. By contrast, as the size and number of homeless shelters expanded, residents living near existing or planned shelters often sought to block them. As the decade progressed, cities came under increasing pressure not only to provide a place for the homeless to sleep but also to reduce the number of homeless people in public places during the day. For much of the 1980’s, the U.S. government left the response to homelessness up to state and local governments. This changed in 1987 with the passage of the McKinney Homeless Assistance Act, which
placed some of the burden of homelessness on the federal government. This legislation provided a multi-part response that not only included emergency food and shelter for the homeless but also provided for housing and job training to transition individuals out of homelessness. Advocates for the homeless welcomed the legislation as an acknowledgment that homelessness is a national issue, but critics wished that the implemented programs would do more.
Impact Homelessness in North America during the 1980's changed from being perceived as a personal problem to being acknowledged as a public issue. As the homeless became more visible and began to include families and the working poor, homelessness no longer was viewed simply as the result of poor individual choices. It had seemingly become a problem that could affect anyone. To many, the increase in homelessness was symbolic of the failure of society to provide adequately for those unable to compete for jobs and resources. Homelessness continued to be a problem in urban areas beyond the end of the decade, but in response to the homelessness of the 1980's, government at all levels expanded its role to provide shelter and reduce the visible signs of homelessness.
Further Reading
Blau, Joel. The Visible Poor: Homelessness in the United States. New York: Oxford University Press, 1992. Provides an insightful discussion of homelessness, its causes, and the public response; outlines possible solutions. Emphasizes the experience of New York City.
Hombs, Mary Ellen. American Homelessness: A Reference Handbook. 3d ed. Santa Barbara, Calif.: ABC-CLIO, 2001. Comprehensive guide to the issues, research, and resources related to homelessness during the 1980's and after.
Jencks, Christopher. The Homeless. Cambridge, Mass.: Harvard University Press, 1994. Good survey of the research on U.S. homelessness during the 1980's that discusses important causes and offers solutions.
Kozol, Jonathan. Rachel and Her Children: Homeless Families in America. New York: Crown, 1988. The stories of homeless families' day-to-day struggles to survive in New York City welfare hotels during the 1980's are explored in detail.
O'Flaherty, Brendan. Making Room: The Economics of Homelessness. Cambridge, Mass.: Harvard University Press, 1996. In an accessible way, economic analysis is used to explain the importance of various causes of homelessness and how these causes contributed to patterns experienced during the 1980's in selected cities in North America and Europe.
Rossi, Peter. Down and Out in America: The Origins of Homelessness. Chicago: University of Chicago Press, 1989. Puts homelessness into a historical perspective and provides a detailed look at homelessness in Chicago during the 1980's.
Randall Hannum
See also
Demographics of Canada; Demographics of the United States; Gentrification; Income and wages in Canada; Income and wages in the United States; McKinney Homeless Assistance Act of 1987; Reagan, Ronald; Reaganomics; Recessions; Unemployment in Canada; Unemployment in the United States; Welfare.
■ Homosexuality and gay rights
Definition: Same-sex relationships and the struggle for legal and cultural acceptance of gay men, lesbians, and bisexual people
The 1980’s saw the gay rights movement shift its focus to include the AIDS crisis. The crisis brought a renewed attack on alternate sexualities from the Religious Right, as well as a new sense of purpose and renewed efforts by activists fighting for gay rights. The history of gay and lesbian rights in the 1980’s is closely tied to the history of the acquired immunodeficiency syndrome (AIDS) crisis. By the time the gay liberation movement reached the 1980’s, it was beginning to lose momentum. However, when AIDS was discovered in the early part of the decade, gay rights activists had a new rallying point. Especially in the disease’s early years in North America, the majority of its victims were homosexual men. Political conservatives therefore exploited the AIDS crisis to attack gays, lesbians, and bisexuals. The effect of this backlash was to draw the gay, lesbian, bisexual, transgender (GLBT) community closer together. The AIDS Crisis and the Conservative Backlash
Some Significant Events Affecting Gay, Lesbian, Bisexual, and Transgender People in the 1980's
1980: The Human Rights Campaign Fund is founded as a political action committee to raise funds for candidates for public office who support gay and lesbian civil rights. The Canadian Union of Postal Workers ratifies a contract with Canada Post and the Canadian Treasury that includes a nondiscrimination clause protecting gay and lesbian employees—the first time federal employees in any country receive such protection.
1981: Parents, Families, and Friends of Lesbians and Gays is founded in the United States. Police in Toronto conduct a massive raid on gay bathhouses, arresting almost three hundred men—the largest mass arrest of gay men in North America. The subsequent demonstrations and protests come to be known as the Canadian Stonewall. The New York Times and a medical newsletter report cases of Kaposi's sarcoma and Pneumocystis carinii pneumonia appearing in gay men—the first recognition in the mainstream media of acquired immunodeficiency syndrome (AIDS).
1982: The Institute for the Protection of Lesbian and Gay Youth is founded in New York City. The group helps establish the first high schools for gay and lesbian teens and confronts accusations that gay and lesbian teachers are a threat to children. Wisconsin becomes the first state to enact a gay, lesbian, and bisexual civil rights law, adding the term "sexual orientation" to the state's list of prevailing civil rights statutes. Gay-related immunodeficiency, or GRID, is renamed AIDS by medical researchers to better reflect that the disease is not exclusive to gay men.
1983: Gerry Studds, a Democratic member of the House of Representatives representing southeastern Massachusetts, comes out after he is accused of sexual misconduct with a male congressional page. Studds is the first gay member of Congress to come out and the first gay or lesbian officeholder at the national level to acknowledge his sexual orientation.
1984: French and American officials announce that scientists in their respective countries have isolated what is believed to be the virus, or pathogen, that causes AIDS: human immunodeficiency virus, or HIV. The head of the San Francisco Health Department closes fourteen gay bathhouses, bookstores, movie theaters, sex clubs, and other businesses after investigators who went to these businesses saw what they believed were sex acts with a high risk for HIV transmission. West Hollywood, California, incorporates as a city and elects a majority gay and lesbian city council, becoming the first "gay city" in the United States. Berkeley, California, becomes the first American city to extend health and other employee benefits to the gay and lesbian domestic partners of city employees.
1985: Actor Rock Hudson announces he has AIDS, an announcement that leads to increased public awareness of the disease. He dies on October 2.
1986: Ruling in Bowers v. Hardwick, the U.S. Supreme Court upholds the state of Georgia's power to criminalize private, consensual, adult sexual relations between men. The U.S. Food and Drug Administration (FDA) ends clinical trials of and releases the experimental drug AZT so it can be prescribed to people with AIDS. California voters reject Proposition 64, a measure supported by Lyndon LaRouche, which would have allowed health officials to quarantine people with AIDS.
1987: ACT UP, a radical and confrontational street-action group demanding public and governmental attention to the AIDS-HIV epidemic, is founded in New York City. Old Lesbians Organizing for Change (OWL) is created to address the invisibility of older lesbians within society in general and within the women's and lesbian rights movements in particular. Barney Frank, a Democrat representing Massachusetts, is the second member of the House of Representatives to come out—and the first to do so willingly. The Second National March on Washington for Lesbian and Gay Rights brings more than 500,000 marchers and protesters to the nation's capital.
1988: The Canadian parliament amends a section of the criminal code to decriminalize sodomy and anal intercourse between consenting adults age eighteen or older. The Presidential Commission on the Human Immunodeficiency Virus Epidemic releases a report recommending how the United States should address HIV-AIDS. President Ronald Reagan, however, rejects most of the recommendations. Oregon voters repeal a 1987 executive order by Governor Neil Goldschmidt that banned discrimination on the basis of sexual orientation. The first World AIDS Day is commemorated on December 1.
1989: The U.S. Supreme Court, ruling in Price Waterhouse v. Hopkins, affirms that the prohibition in Title VII of the Civil Rights Act of 1964 against discrimination because of sex or gender extends to discrimination based on gender-role stereotypes. Some legal scholars and others maintain the ruling widens the rights of lesbians and gays because sexuality and sexual expression are kinds of gender-role stereotypes. A federal court of appeals orders the reinstatement of U.S. Army sergeant Perry Watkins, who had been dismissed from the military because he was gay.
Because so many early sufferers from AIDS were gay men, the disease was initially labeled gay-related immune deficiency (GRID). Public scrutiny fell onto the group most traumatized by the disease, and much of the attention came in the form of opprobrium and condemnation. While Rock Hudson's 1985 death from AIDS increased sympathy for the disease's sufferers, plenty of anti-gay politicking still surrounded the issue. Since sexual transmission was identified early in the disease's spread through North America, one important factor in containing the outbreak involved sex education.
In the early 1980's, gay men were flocking to New York, San Francisco, Toronto, and other large cities. For many of these men, increased sexual contact was part of the point of the gay revolution. Rather than being closeted and secretive, they were at last fully free to explore their sexuality. Many had numerous anonymous lovers, sometimes several over the course of one evening. Bathhouses designed to facilitate such contact flourished in the early part of the decade. These same bathhouses became the targets of public officials attempting to corral the spread of AIDS, once its sexually transmitted nature had been identified. The facilities tried to remain open by agreeing to post notices about the importance of condoms and the dangers of unprotected anal sex, but by 1985, most had been closed for public health reasons, even as promiscuity among gay men declined and the disease's spread in that group leveled out.
The gay and lesbian revolution of the 1970's had sparked changes in favor of gay rights. However, a large percentage of the U.S. population still objected to what was widely perceived as immoral behavior. Conservative politicians used AIDS against the GLBT community with growing success throughout the decade. Many, like California legislator William Dannemeyer, masked homophobia under the guise of concern for the public health. The AIDS crisis exacerbated this perspective and encouraged those who might otherwise have remained neutral to join the backlash. Violence was not uncommon, with hate crimes against gays tripling in the United States between 1982 and 1985. The "patient zero" myth, which theorized that all AIDS cases in North America could be traced back to one gay Canadian airline attendant, Gaetan Dugas, did not help the GLBT cause, particularly as the myth was long accepted as medical truth, even by such strong GLBT advocates as journalist and author Randy Shilts.
The Religious Right and Politics
Particularly vocal in the antigay backlash was the right-wing fundamentalist Protestant movement known as the Religious Right. Led by such televangelists as Jimmy Swaggart and Jim and Tammy Faye Bakker, the Religious Right argued that homosexuality was a perversion and that AIDS was God's punishment for sexual deviancy. Furthermore, by equating homosexuality with pedophilia, the Religious Right attempted to make it difficult or impossible for gays and lesbians to achieve workplace equality. Other efforts to demonize gays and lesbians extended as far as opposition to hate-crime laws that would protect them. Some conservative religious organizations opposed to homosexuality chose to believe that homosexual behavior could be changed in the correct environment. Using reparative and aversion therapy, both available since the 1960's but not popularized until the 1980's, these groups set up centers where gays could come to be "cured" of their condition. The centers gained wide support among conservative Christians; however, they generally served as meeting places for gays, and most graduates returned to their homosexual behavior. The American Psychiatric Association declared this form of therapy unsuccessful and dangerous.
In 1979, the Reverend John Kuiper became the first gay man to adopt a child. Only one state allowed such adoptions, however, and those adoptions legally applied only to one individual: Same-sex partners could not both adopt the same child. Gains in this area were limited in the 1980's, and the idea of both partners adopting a child was still considered unlikely throughout the decade. Thus, when one member of a same-sex couple was able to adopt a child, the non-adopting partner had no legal tie to the child and, in the event of tragedy or the end of the relationship, often lost custody rights entirely.
However, not all Americans were opposed to gay and lesbian rights, and social, religious, and political gains came throughout the decade as well. For example, in 1980, Sergeant Leonard Matlovich finalized his victory over the U.S. Army. After being discharged for homosexuality in the 1970's, he had fought for reinstatement. In September of 1980, a federal judge ordered Matlovich reinstated and given back pay. The army instead worked out a costly settlement, which Matlovich accepted, fearing that further pursuit of the case would result in a loss at the Supreme Court level. Sergeant Miriam Ben-Shalom, who had won a similar court case before Matlovich's and pressed the army through a lengthy series of lawsuits to allow her to return, was finally permitted to resume her post and finish her tour of duty in 1988. (The army did not, however, allow her to reenlist the following year.) Other gains in the decade included the federal government's 1980 decision to drop its ban on hiring gay employees and Wisconsin's 1982 decision to outlaw all discrimination based on sexual orientation. The first Gay Games, modeled after the Olympics, were held in San Francisco in 1982. In 1984, the city of West Hollywood incorporated, and the majority of its elected city council was gay or lesbian. In 1987, Massachusetts congressman Barney Frank publicly acknowledged his homosexuality.
Bowers v. Hardwick
In 1986, a case of extreme importance to gay and lesbian rights came before the U.S. Supreme Court. Michael Hardwick had been arrested for sodomy when police found him with a lover in his own home. He was never actually prosecuted under the law, but he was deliberately humiliated and harassed by police. Hardwick sued the state, insisting the law against sodomy was unconstitutional. However, the Supreme Court ruled, in Bowers v. Hardwick, that the law was acceptable. (That decision stood until it was overruled in 2003.) Although two states, New York and Pennsylvania, had declared their own antisodomy laws unconstitutional at the beginning of the decade, other states were allowed to outlaw sodomy as they saw fit. This was widely considered to be a setback for gay and lesbian rights, especially in the area of the right to privacy, as it meant states could continue criminalizing homosexual behavior. However, in spite of this significant blow, there were some positive signs, as no new states passed sodomy laws and no such laws that had been removed were reinstated.
Canada
Though events in the United States got most of the popular press coverage in the 1980's, gay and lesbian rights also made steady gains in Canada throughout the decade. In February, 1981, before the AIDS crisis, four Toronto bathhouses frequented by gay men were raided. Police ultimately arrested more than three hundred gay men for their sexual orientation. Nearly three thousand protesters demonstrated against these actions, and this protest formed the basis of Toronto's annual Gay Pride Week. Thus, the raids represented both a visible sign of the backlash against the gay and lesbian revolution and, ultimately, a victory for gay and lesbian rights.
Of even greater significance, in 1982, the Canadian constitution was revised and patriated, and the Charter of Rights and Freedoms became a central portion of it. This charter had a strong impact on the lives of the gay and lesbian community. The charter did not specifically list sexual orientation as a protected category of identity, and activists throughout the decade and beyond fought to have the phrase added to the document. (In 1996, the Canadian Supreme Court declared that freedom from discrimination based on one's sexuality was protected by the charter.) The Canadian military specifically refused to employ gays and lesbians for many years, and enlisted homosexuals were removed from the military throughout the decade if their sexuality was discovered.
Elsewhere in the political realm, in 1986, the group Equality for Gays and Lesbians Everywhere (EGALE; later Égale) formed in Canada. The group became one of Canada's most active legal groups, seeking not just to win victories in the courtroom, but also to offer a support network to the nation's GLBT community. In 1986 and 1987, the provinces of Ontario and Manitoba, as well as the Yukon Territory, added sexual orientation to their codes of human rights, making employment, housing, and some other forms of discrimination against gays, lesbians, and bisexuals illegal. On the religious front, in 1988, the United Church of Canada publicly stated that homosexuals could be ordained.
Impact
The Religious Right had an immeasurable effect on politics in the United States, as it backed President Ronald Reagan for the entirety of both his terms as president and gave financial support to President George H. W. Bush as well. This meant much of the political power during the decade rested in the hands of the anti-gay movement. However, AIDS had the greatest impact on gay and lesbian rights in the 1980's. Besides engendering a sometimes violent backlash, it also knit gay and lesbian rights activists together by giving them a common cause to fight against just when a schism was beginning in their movement.
Further Reading
Cruikshank, Margaret. The Gay and Lesbian Liberation Movement. New York: Routledge, 1992. History of the movement beginning with the Stonewall riots and moving through the decade of AIDS.
Engel, Jonathan. The Epidemic. New York: Smithsonian Books/HarperCollins, 2006. Discussion of the human immunodeficiency virus (HIV) and of AIDS, including their social and political ramifications. Intertwined with discussion of the GLBT community in the 1980's and the effects of the disease and conservative backlash on, particularly, gay men.
Herman, Didi. Rights of Passage: Struggles for Gay and Lesbian Legal Equality. Toronto: University of Toronto Press, 1994. Examines the legal struggles of Canadian gays and lesbians in the 1980's, with the inclusion of some feminist theory.
Hunt, Gerald. Laboring for Rights: Unions and Sexual Diversity Across Nations. Philadelphia: Temple University Press, 1999. Focuses on gains within the labor movement in the United States and Canada, including domestic partner benefits.
Shilts, Randy. And the Band Played On: Politics, People, and the AIDS Epidemic. New York: St. Martin's Press, 1987. Examines the AIDS crisis, including the disease, its origins, and the social opprobrium faced especially by gay men.
Jessie Bishop Powell
See also
ACT UP; AIDS epidemic; AIDS Memorial Quilt; Bakker, Jim and Tammy Faye; Bowers v. Hardwick; Falwell, Jerry; Moral Majority; Reagan, Ronald; Swaggart, Jimmy; Televangelism; Torch Song Trilogy; White, Ryan.
■ Horror films Definition
Movies featuring violence or the supernatural whose purpose is to shock and frighten viewers
During the 1980's, horror films completed their transition away from being considered children's fare, as they were targeted instead at young-adult audiences. An emphasis on graphic gore and special effects was at the center of this repackaging, and it also resulted in significant controversy in an era often characterized by cultural conservatism.
Horror films had been considered a children's genre for most of their existence; in 1969, the first full year of the Motion Picture Association of America (MPAA) rating system, the gory, sexy Dracula Has Risen from the Grave received a G rating. The 1970's were a transitional period, with films such as The Exorcist (1973) and The Texas Chainsaw Massacre (1974) establishing the idea of R-rated, adult horror. Still, the majority of American horror films appealed to youth: as late as 1979, the makers of Alien hoped to win a PG rating.
The turning point for the horror genre came in 1980 with Friday the 13th, a low-budget "slasher" movie. The film had been inspired by the phenomenal success of 1978's Halloween, but it featured added gore effects by Tom Savini, a makeup artist highly influential throughout the decade. Its slit throats, ax in a head, and decapitation scenes thrilled audiences, but they also brought criticism to the MPAA ratings board for being too lenient in assigning the film only an R rating. From then on, the board watched slasher movies carefully and, for a time, all but the smallest amounts of bloodletting garnered an X rating. Nevertheless, slasher films were so inexpensive and successful that the formula was endlessly copied in films such as Maniac (1980), The Burning (1981), and The Prowler (1981), all featuring Savini's special makeup effects. These films were generally either recut to pass the MPAA board with an R rating or, as in the case of Maniac, released "unrated." The Friday the 13th franchise continued to flourish, and copycat "holiday horror" movies followed in its wake, including Mother's Day (1980), My Bloody Valentine (1981), Prom Night (1980), Happy Birthday to Me (1981), and Graduation Day (1981); it appeared no holiday would remain unscathed.
These various stalker and slasher films—in which teenagers engage in illicit behavior before being slaughtered—are traditionally viewed as a product of the cultural conservatism of the 1980's. The "bad kids" get killed in such films, while the virgin lives to fight another day (or in the next sequel). Another explanation for the films' formula (one logically compatible with the first) is that it was the least expensive, easiest way to get as much sex, nudity, bloodshed, and horror on screen as possible on a limited budget. These, it was believed, were the elements a teenage audience would want to see.
While the stalker and slasher films threaten to dominate discussions of 1980's horror, it is important to consider the many creative, innovative horror films of the period as well. Brian De Palma directed the stylish horror-thrillers Dressed to Kill (1980), Blow Out (1981), and Body Double (1984), arousing the ire of feminist protesters who objected to his artfully staged murder scenes. Stanley Kubrick offered his highly anticipated and controversial version of Stephen King's The Shining in 1980: It divided critics and enthusiasts more than did any other adaptation of King's work. Other such King adaptations in the 1980's included well-received versions of Christine (1983) and Cujo (1983), as well as King's own collaboration with George A. Romero, Creepshow (1982), an homage to the horror comics of the 1950's. Meanwhile, Canadian David Cronenberg, one of the most prolific and creative talents in the genre, continued directing fascinating and wildly erratic science-fiction horror films such as Scanners (1981), with its infamous "exploding head" scene; Videodrome (1983), which challenged the boundaries of reality and illusion; The Dead Zone (1983), a straightforward adaptation of
King’s novel; The Fly (1986), a radical reinvention of the Vincent Price film; and Dead Ringers (1988), the disturbing tale of twin gynecologists who descend into a hell of identity loss, drug addiction, and bizarre ideas that lead to suicide and murder. The 1980’s also saw the rise of new visionaries in horror cinema. Wes Craven’s A Nightmare on Elm Street (1984) returned horror to the realm of the fantastic by placing a killer inside young people’s dreams. Neil Jordan’s The Company of Wolves (1984) was an innovative and stylish coming-of-age tale set amid werewolf legends. Sam Raimi’s tale of demoniac possession, The Evil Dead (1981), compensated for a low budget with stunning style, leading Raimi and star Bruce Campbell to more prosperous careers. Hellraiser (1987) introduced Clive Barker’s talents as director to the screen, adding to his already
impressive literary accomplishments. Stuart Gordon's H. P. Lovecraft adaptation, Re-animator (1985), was among the best and most outrageous horror films to emerge from the 1980's. Meanwhile, the old guard held up its end, as George A. Romero directed the third in his Dead series, Day of the Dead (1985). Joe Dante directed both The Howling (1981) and Gremlins (1984), the latter contributing to the creation of the PG-13 rating, which would open a completely new market for horror films to come.
Impact
The 1980's introduced the low-budget slasher fad to the big screen. The decade also, however, saw the debut of many genre filmmakers who would go on to prominence. These horror auteurs directed intelligent, upstart films that would influence horror and supernatural cinema in future years.
Selected 1980's Horror Films
1980: The Shining (Stanley Kubrick); Friday the 13th (Sean S. Cunningham); He Knows You're Alone (Armand Mastroianni); You Better Watch Out, also known as Christmas Evil (Lewis Jackson); Prom Night (Paul Lynch)
1981: The Evil Dead (Sam Raimi); Halloween II (Rick Rosenthal); Hell Night (Tom DeSimone); The Incubus (John Hough); Ghost Story (John Irvin); The Hand (Oliver Stone); Wolfen (Michael Wadleigh); The Howling (Joe Dante); Scanners (David Cronenberg)
1982: The Thing (John Carpenter); Poltergeist (Tobe Hooper); Creepshow (George A. Romero); Swamp Thing (Wes Craven)
1983: Cujo (Lewis Teague); Night Warning (William Asher); The Dead Zone and Videodrome (David Cronenberg); The House on Sorority Row (Mark Rosman); Psycho II (Richard Franklin)
1984: A Nightmare on Elm Street and Invitation to Hell (Wes Craven); Scream for Help (Michael Winner)
1985: Re-Animator (Stuart Gordon); The Return of the Living Dead (Dan O'Bannon); Cat's Eye (Lewis Teague); Fright Night (Tom Holland); Day of the Dead (George A. Romero)
1986: Aliens (James Cameron); The Fly (David Cronenberg); Invaders from Mars (Tobe Hooper); Psycho III (Anthony Perkins); Critters (Stephen Herek)
1987: Bad Taste (Peter Jackson); Evil Dead II (Sam Raimi); Hellraiser (Clive Barker); Creepshow 2 (Michael Gornick); Hello Mary Lou: Prom Night II (Bruce Pittman)
1988: Halloween 4: The Return of Michael Myers (Dwight H. Little); Child's Play (Tom Holland); Critters 2: The Main Course (Mick Garris); Fright Night 2 (Tommy Lee Wallace)
1989: The Return of Swamp Thing (Jim Wynorski)
Further Reading
Gingold, Michael. "History of Horror: The 1980's." In Fangoria's Best Horror Films, edited by Anthony Timpone. Avenel, N.J.: Crescent Books, 1994. An intelligent, concise chapter by a writer who covered the films as they were released.
Harper, Jim. Legacy of Blood: A Comprehensive Guide to Slasher Movies. Manchester, England: Headpress, 2004. A thematic guide to this subgenre.
McCarty, John. Splatter Movies. New York: FantaCo Enterprises, 1981. Places gore films of the 1980's in context with previous films.
Stell, John. Psychos! Sickos! Sequels! Horror Films of the 1980's. Baltimore: Midnight Marquee Press, 1998. Film-by-film guide to 1980's horror films.
Charles Lewis Avinger, Jr.
See also
Aliens; Blade Runner; Blue Velvet; Fatal Attraction; Film in Canada; Film in the United States; King, Stephen; PG-13 rating; RoboCop; Science-fiction films; Teen films; Terminator, The; Twilight Zone accident.
■ Horton, William
Identification Convicted murderer
Born August 12, 1951; Chesterfield, South Carolina
Horton was serving a life sentence in prison when he was released for a weekend as part of a Massachusetts state furlough program. He used this opportunity to escape, and he later raped a white woman in Maryland. His story was used by political opponents against presidential candidate Michael Dukakis, who as governor of Massachusetts supported the furlough program.
Few times in recent history has a single convicted felon been so critically linked to national affairs as was William Horton, an African American felon commonly nicknamed Willie Horton by the media. Horton's name became linked with the political destruction of Michael Dukakis through negative campaigning, the election of George H. W. Bush, and the racial hysteria of the 1980's.
Horton and two accomplices were convicted of killing a gas station attendant in 1974 during a robbery. Though it is not known if Horton stabbed the attendant, in Massachusetts any accomplice to a qualifying felony at which a murder takes place is guilty of felony murder. He was sentenced to life in prison without the possibility of parole and entered the Massachusetts prison system. That system had developed a furlough program that rewarded good behavior among inmates as an incentive to help control the inmate population. Furlough programs were common among the states, but Massachusetts was unique in that it would allow a two-day furlough for convicted murderers. Horton received one of these furloughs in 1986, and he failed to return to serve the remainder of his life sentence when his two days of freedom ended.
The following year, Horton surfaced in Maryland as the perpetrator of a violent crime. He raped a woman twice, viciously assaulted her fiancé, and then stole the fiancé's car. He was soon captured by the police, tried, and convicted in Maryland, which refused to return Horton to Massachusetts on the chance that he might again be released.
In 1988, Horton's story was used repeatedly to attack Massachusetts governor Michael Dukakis, who was running for the presidency. Dukakis had inherited the furlough system when he took office, but he believed in its effectiveness. The attacks began during the 1988 Democratic presidential primaries, when Senator Al Gore of Tennessee used the Massachusetts prison furlough program to suggest that Dukakis was weak on crime. There is some dispute as to whether or not Gore used Horton's name in these attacks, but once Dukakis won the primary contest, the seeds were sown for the general election, in which the governor faced Vice President George H. W. Bush.
Bush's campaign team included future White House Chief of Staff Andrew Card and future Fox News president Roger Ailes, and it was run by Lee Atwater, a seasoned Republican political warrior. The campaign decided to make Willie Horton a household name. Two negative campaign ads were produced regarding Horton. One showed him confronting the camera in a glaring mug shot, and the other showed a revolving door at a prison. The infamous revolving-door commercial depicted a stream of minority prisoners leaving a prison, implying that under a Dukakis administration, African American felons would stream out of correctional facilities to invade America's neighborhoods. Horton's visage is considered by pundits to have influenced the election in Bush's favor, though it is doubtful that Dukakis would have beaten Bush even if these ads had never been run. Nonetheless, Horton's name became synonymous in political circles with negative campaigning. To "go Horton" came to mean to "go negative" in a campaign.
The infamous photo of William Horton. The convicted murderer's visage featured prominently in campaign ads attacking Michael Dukakis's record on crime during the 1988 presidential campaign. (AP/Wide World Photos)
Impact
During the 1980's, racial tensions increased, especially in U.S. cities and suburbs, and the Republican Party successfully positioned itself as the party that was "tough on crime." The story the commercials told, of an African American raping a white woman because a New England liberal governor was too nice to him, therefore struck a major chord with the U.S. electorate. Dukakis's defeat, whether the direct result of the Horton ads or not, became associated with them, and the tactics those ads represented were thought of thereafter as both reprehensible and effective.
Further Reading
Anderson, David C. Crime and the Politics of Hysteria. New York: Random House, 1995.
Feagin, Joe R., and Hernan Vera. White Racism. New York: Routledge, 1995.
Gest, Ted. Crime and Politics. New York: Oxford University Press, 2001.
Pepinsky, Harold E., and Richard Quinney. Criminology as Peacemaking. Indianapolis: Indiana University Press, 1991.
R. Matthew Beverlin
See also African Americans; Atwater, Lee; Bush, George H. W.; Crime; Dukakis, Michael; Elections in the United States, 1988.
■ Houston, Whitney
Identification African American singer
Born August 9, 1963; Newark, New Jersey
Houston was one of the most popular female pop singers of the 1980's.
Born into a musical family, Whitney Houston discovered her love of music at an early age. During elementary school, Houston sang in the New Hope Baptist Church choir, and by high school she was performing professionally with her mother, Cissy Houston, and her cousins, Dionne and Dee Dee Warwick, in the gospel group the Drinkard Sisters. After her high school graduation in 1981, Houston continued to develop her professional music career by signing a contract with Tara Productions. In 1983, she sang "Eternal Love" on the album Paul Jabara and Friends, which was produced by Columbia Records.
That same year, Houston moved even closer to stardom when she signed a recording contract with the legendary Clive Davis of Arista Records. Her self-titled debut album, Whitney Houston, was released in 1985 and became an instant success. Catapulting the young singer to superstar status, the album sold over thirteen million copies at its release. Famous songs from the album included "You Give Good Love," which earned an American Music Award, and "Saving All My Love for You," which won both a Grammy Award and an American Music Award. In 1986, with her popularity continuing to soar, Houston embarked on her first worldwide tour, the Greatest Love tour.
In 1987, the artist's second album, Whitney, debuted in both the United States and the United Kingdom at number one on the Billboard pop charts, marking the first time that feat had been accomplished by a female artist. The record also sold more than twelve million copies at the time of its release. Four singles from the album reached number one on the Billboard Hot 100: "I Wanna Dance with Somebody (Who Loves Me)," which earned Houston a Grammy Award, "Didn't We Almost Have It All," "So Emotional," and "Where Do Broken Hearts Go." Houston also embarked on the very successful worldwide Moment of Truth tour to promote her album. She then recorded the 1988 Olympics theme song, "One Moment in Time." In 1990, Houston followed up with the release of her third album, I'm Your Baby Tonight.
Whitney Houston answers questions from the press after the 1986 MTV Video Music Awards ceremony in Los Angeles. (AP/Wide World Photos)
Impact
During the 1980's, Whitney Houston enjoyed enormous success as a singer. While her first two albums catapulted her to superstar status, her natural singing talents set a standard that other female singers strove to reach. Through hard work and dedication, Houston witnessed several of her singles reach number one on the pop charts around the world. She also won several Grammy Awards and American Music Awards.
Further Reading
Nathan, David. The Soulful Divas: Personal Portraits of Over a Dozen Divine Divas, from Nina Simone, Aretha Franklin, and Diana Ross to Patti LaBelle, Whitney Houston, and Janet Jackson. New York: Billboard Books, 1999.
Parish, James Robert. Whitney Houston: The Biography. London: Aurum Press, 2003.
Bernadette Zbicki Heiney
See also African Americans; Music; Music videos; Olympic Games of 1988; Pop music.
■ Howard Beach incident The Event
A racially motivated attack results in the death of an African American man
Date December 20, 1986
Place New York City
One of several high-profile incidents of hate-based violence to occur during the 1980's, the Howard Beach incident received national coverage and sparked a debate concerning the state of race relations in the United States in the 1980's.
On the evening of Friday, December 19, 1986, four African American men traveling in the vicinity of Howard Beach in the borough of Queens, New York, became stranded when their automobile broke down. Three of the four men walked to the predominantly white Howard Beach neighborhood to seek help, stopping at a local pizzeria in search of a telephone. Upon emerging from the restaurant at approximately 12:40 a.m. on Saturday morning, the three African Americans were confronted by a group of approximately ten white men, who shouted racial slurs at them and began beating them with baseball bats and other objects. In the course of the attack, one of the African American men, Michael Griffith, ran into the street to escape his attackers and was struck by a passing automobile. Griffith died at the scene as a result of his injuries.
Police later arrested twelve young male residents of Howard Beach, charging them with crimes ranging from assault to murder. In December of 1987, three of the men, Jon Lester, Scott Kern, and Jason Ladone, were convicted of second-degree manslaughter and sentenced to prison terms ranging from five to thirty years. Two of the other defendants were acquitted, and the rest received light prison sentences or community service. The attack and the failure of the judicial system to convict any of the defendants of murder provoked outrage among black residents of New York City and inspired a number of protests. On December 27, a predominantly African American group of approximately twelve hundred protesters led by the Reverend Al Sharpton marched through Howard Beach. The marchers were met by a large number of local residents who shouted racial epithets and threats. Sharpton rose to national prominence as a result of the protest.
Impact
The Howard Beach incident was one of a series of high-profile incidents highlighting racial tension in the New York City area during the late 1980's, including the murder of Yusef Hawkins in the New York City neighborhood of Bensonhurst in 1989 and the alleged racially motivated attack on teenager Tawana Brawley in 1987. Critics of American race relations and racial policy in the United States cited the incident as an example of the country's growing racism, which many blamed upon the resurgence of political conservatism under the leadership of President Ronald Reagan. The incident contributed to calls for legislation creating special penalties for hate crimes, or violent acts motivated by racism and other social biases.
Further Reading
Jacobs, James B., and Kimberly Potter. Hate Crimes: Criminal Law and Identity Politics. New York: Oxford University Press, 2000.
Pinkney, Alphonso. Lest We Forget: White Hate Crimes—Howard Beach and Other Atrocities. Chicago: Third World Press, 1994.
Michael H. Burchett
See also
African Americans; Brawley, Tawana; Central Park jogger case; Conservatism in U.S. politics; Crime; Gangs; Goetz, Bernhard; Hawkins, Yusef; Horton, William; Racial discrimination.
■ Hubbard, L. Ron Identification
American pulp fiction writer and religious leader
Born March 13, 1911; Tilden, Nebraska
Died January 24, 1986; San Luis Obispo County, California
Hubbard was the charismatic founder of a philosophy called Dianetics and of the Church of Scientology, a popular twentieth century American religion.
Controversial public figure L. Ron Hubbard, the founder of the Church of Scientology, had not been seen in public for five years when he died from a stroke at his California ranch in 1986. The Church of Scientology, which asserts that Scientology is the only major religion to have been founded in the twentieth century, claimed that Hubbard had merely discarded his mortal body to be free to travel to a higher plane where he could conduct advanced spiritual research.
Scientology's philosophy, written and developed by Hubbard, holds that humans are immortal spiritual beings that live through many lifetimes. A method of spiritual transformation whose roots lie in all the great religions, Scientology, which means "knowing how to know," seeks to help people access their unlimited potential abilities. According to the church, Hubbard strove to set people free spiritually and to design a civilization free of war and crime where all could prosper and be free to evolve to higher states of being.
During the 1980's, Hubbard achieved a high level of success, as his church's membership continued to grow. However, during this time, he also met with an enormous amount of controversy that centered on his less than stellar education, dubious military records, and cult activities, as well as bigamy charges and the continuing negative reaction of the American Psychological Association to Dianetics (1950). This best-selling book by Hubbard formed the basis of his religious philosophy. In 1984, a California judge condemned Hubbard as "a pathological liar when it comes to his history, background, and achievement." That same year, a London high-court judge decreed that Scientology was "dangerous, immoral, sinister, and corrupt."
During the 1980's, Hubbard wrote Battlefield Earth (1982) and the ten-volume Mission Earth (1985-1987), as well as an unpublished screenplay titled Revolt in the Stars. The screenplay was intended as a dramatization of Scientology's teachings. Hubbard earned a fortune from his fiction, as well as from other Scientology enterprises. Forbes magazine estimated that Hubbard's Scientology-related income exceeded $40 million in 1982 alone.
Impact
The Church of Scientology, an enterprise grossing an estimated $100 million a year, became an extremely visible religion and arguably a successful one under Hubbard's leadership. In later years, it would continue to be controversial, both because of its intimate connection between faith and funds and because adherents were quick to attempt to silence would-be critics. As the founder of the Church of Scientology, Hubbard influenced millions of people. In all, his work included more than five thousand writings, including dozens of books and three thousand tape-recorded lectures about Dianetics and Scientology that have been translated into numerous languages. Indeed, the Guinness Book of World Records says that Hubbard is the world's most-translated author.
Further Reading
Atack, Jon. A Piece of Blue Sky: Scientology, Dianetics, and L. Ron Hubbard Exposed. New York: Carol, 1990.
Hubbard, L. Ron. Scientology: The Fundamentals of Thought. Washington, D.C.: Bridge, 1997.
M. Casey Diana
See also
Book publishing; Psychology; Religion and spirituality in the United States; Scandals.
■ Hudson, Rock
Identification American movie star
Born November 17, 1925; Winnetka, Illinois
Died October 2, 1985; Beverly Hills, California
Hudson became the first major Hollywood celebrity to die of AIDS. His public announcement that he had the disease and the posthumous revelation that he had been gay dramatically changed the public perception of both homosexuality and the AIDS epidemic.
Rock Hudson was a popular romantic leading man in film and television from the 1950's through the 1970's, starring in a wide range of films including Giant (1956), Magnificent Obsession (1954), Pillow Talk (1959), and Ice Station Zebra (1968). He completed over sixty-five motion pictures and starred in several television series, including McMillan and Wife and Dynasty. During the 1980's, his health began to decline, and his career began to fade. Though he was diagnosed with acquired immunodeficiency syndrome (AIDS) on June 5, 1984, it was not until his stunning television appearance the following year on Doris Day's Best Friends that audiences began to see the remarkable decline in his health. After a brief claim that he was stricken with liver cancer, on July 25, 1985, Hudson issued a statement acknowledging that he had AIDS. After searching the globe for treatments for the disease, Hudson returned to his house in Beverly Hills. He died on the morning of October 2, 1985. He was fifty-nine.
Following his death, his former life partner, Marc Christian, sued Hudson's estate for the "intentional infliction of emotional distress." Christian, who remained HIV-negative, claimed that Hudson had continued to have sex with him after he knew of his human immunodeficiency virus (HIV) status.
Impact
Rock Hudson was instrumental in eliminating the stigma attached to AIDS. In the mid-1980's, AIDS panic was pervasive in the media and politics, as well as in society, giving rise to a ferocious homophobia. The disease was either demonized or ignored. Hudson's admission gave AIDS a human face, resulting in an immediate rise in AIDS education and compassion for those afflicted with it. Shortly before his death, Hudson stated, "I am not happy that I am sick. I am not happy that I have AIDS. But if that is helping others, I can at least know that my own misfortune has had some positive worth." Hudson's story inspired an enormous rise in celebrity activism surrounding the disease. In the late 1980's, popular actors such as Elizabeth Taylor, Carol Burnett, and others began to raise money and speak out on behalf of AIDS awareness.
In addition, Hudson helped break the stigma and stereotype of gay men. His status as a masculine leading man, and his extreme popularity with all Americans, tempered the rising homophobia. Hudson had followed the protocol for gay actors in Hollywood during his lifetime: He dated women, married Phyllis Gates in 1955, and kept his private life out of the press. He believed that if knowledge of his sexuality became public, his career would end. After Hudson's death, however, more celebrities began to challenge the studio-mandated Hollywood closet and to come out as gay, lesbian, or bisexual.
Rock Hudson and Doris Day speak at a news conference on July 18, 1985. (AP/Wide World Photos)
Further Reading
Hudson, Rock, and Sara Davidson. Rock Hudson: His Story. New York: William Morrow, 1986.
Kashner, Sam, and Jennifer MacNair. The Bad and the Beautiful: Hollywood in the Fifties. New York: W. W. Norton, 2002.
Daniel-Raymond Nadon
See also
ACT UP; AIDS epidemic; AIDS Memorial Quilt; Film in the United States; Homosexuality and gay rights; Television; White, Ryan.
■ Hughes, John Identification
American screenwriter, director, and producer
Born February 18, 1950; Lansing, Michigan
Hughes, a screenwriter, director, and film producer, was the creative force behind some of the greatest teen comedies in film history. He was involved in the production of sixteen movies during the 1980's.
John Hughes was born in Michigan in 1950 and moved to the Chicago suburbs with his family in 1963. Glenbrook North High School in Northbrook, Illinois, from which he graduated in 1968, was the model for the schools in most of his films. A college dropout, Hughes left school for advertising. His first screenplay, National Lampoon's Class Reunion (1982), resulted from Hughes's position as editor at National Lampoon magazine. He went on to write three highly successful National Lampoon's Vacation movies (1983-1989). However, it was the genre of teen angst comedies that made Hughes one of the most famous and successful filmmakers of the 1980's.
Named the "philosopher of adolescence" by Chicago film critic Roger Ebert, Hughes became synonymous with teenage comedies beginning with 1984's Sixteen Candles, which was followed by The Breakfast Club (1985), Weird Science (1985), and Ferris Bueller's Day Off (1986). In addition to those films, which he both wrote and directed, Hughes wrote two teen films for director Howard Deutch: Pretty in Pink (1986) and Some Kind of Wonderful (1987), the latter of which was not primarily a comedy. All of these films treated adolescence with dignity and understanding; they also featured New Wave sound tracks, grounding them in the teen culture of the 1980's. Hughes is credited with establishing the Brat Pack, a group of young actors, all of whom went on to adult success in the entertainment industry.
In the late 1980's, Hughes shifted to making films about adults, including the very popular Planes, Trains, and Automobiles (1987) and Uncle Buck (1989). These films were not as well received by critics as were his earlier films, however. Shortly after the 1980's ended, Hughes ceased to direct films, but he remained a prolific writer and producer into the twenty-first century. He is best known, however, for his 1980's teen comedies.
Screenwriter and director John Hughes in 1984. (AP/Wide World Photos)
Impact
John Hughes's teen films are characterized by a deep understanding of the pain and fears of the high school experience. Despite the films' comic nature, they treat their characters seriously. They deal with the difficulties teens experience fitting in and with issues of class difference, social identification, and self-determination. He portrayed characters who struggled with identity issues, usually resolving his films by having disparate groups realize that they had more in common than they believed. The thoughtful handling of teen characters and their problems set these movies above the usual teen sex comedies and farces that preceded and followed them.
Further Reading
Bernstein, Jonathan. Pretty in Pink: The Golden Age of Teenage Movies. New York: St. Martin's Griffin, 1997.
Clarke, Jaime, ed. Don't You Forget About Me: Contemporary Writers on the Films of John Hughes. New York: Simon Spotlight Entertainment, 2007.
Leslie Neilan
See also
Brat Pack in acting; Breakfast Club, The; Fast Times at Ridgemont High; Film in the United States; New Wave music; Teen films.
■ Hurricane Hugo
Identification Disastrous storm
Date September 9-25, 1989
Place Formed off the west coast of Africa; struck the northern Caribbean and the East Coast of North America
When it struck North America and the Caribbean in 1989, Hurricane Hugo became the most devastating and costly hurricane then on record. It reached Category 5 on the Saffir-Simpson scale and wrought havoc throughout the Caribbean before striking the United States. The storm killed at least seventy people and caused an estimated $10 billion in damages.
In 1959, Hurricane Gracie, a Category 3 storm, made landfall near Beaufort, South Carolina, wreaked some havoc but swiftly weakened, and disappeared into Georgia. At that time, the Sea Islands of coastal South Carolina were thinly populated, inhabited primarily by African American descendants of freed slaves.
From the 1960's onward, virtually all of coastal South Carolina—except for an area north of the Isle of Palms to Pawley's Island—witnessed massive development: golf courses, gated communities, and tourist destinations. For three decades, hurricane season brought a brush or two, but nothing of significance to remind natives or visitors of the dangers posed by the powerful storms. Preservation of nature's defenses against storms—sand dunes and sea oats, for example—was often overlooked in the building frenzy.
Thirty years after Gracie, following several years of drought, good summer rains refilled lakes, and September brought more of the same. On September 9, a tropical wave moved off Cape Verde, Africa, developing into a tropical storm two days later. Hugo became a hurricane on September 13 and intensified rapidly, becoming a Category 5 storm—a storm with sustained winds of at least 156 miles per hour—while it was still one thousand miles from the North American continent. A National Oceanic and Atmospheric Administration (NOAA) reconnaissance aircraft flew into Hugo on September 15 and discovered sustained wind speeds of 190 miles per hour and a barometric pressure of 918 millibars. Weakening slightly to Category 4 when its highest sustained wind speeds dipped to 140 miles per hour, between September 17 and September 19 Hugo passed over the Caribbean islands of Guadeloupe, Montserrat, Dominica, the British and U.S. Virgin Islands, and Puerto Rico with devastating fury.
Puerto Rico left Hugo much diminished in strength, but the Gulf Stream quickly restored its power to Category 4. Doppler radar made hurricane tracking easier, but it was clear only that Hugo would strike somewhere along the Georgia-South Carolina coast. The residents of Savannah, Georgia, were ordered to evacuate, but a northward hitch by Hugo turned the storm directly toward Charleston. During the night of September 21-22, packing sustained winds of 138 miles per hour, Hugo passed over the city. On the windward side of the hurricane, a storm surge in excess of twenty feet inundated the tiny fishing village of McClellanville, among others. The winds virtually destroyed most of the mature longleaf pines and palmettos in the Francis Marion National Forest, snapping their tops off about twenty feet above ground. Sadly, despite having deep root systems, the live oaks of the forest were also devastated, because the soggy ground resulting from earlier rainfall allowed their roots simply to pop out of the ground. Hugo moved swiftly through South Carolina and was still a Category 1 storm by the time it reached Charlotte, two hundred miles inland. Racing northward, the storm finally disappeared over eastern Canada on September 25, but the effects of its devastation would be felt for years.
Ben Sawyer Bridge in South Carolina, following the onslaught of Hurricane Hugo. (NOAA/National Hurricane Center)
Impact
Hugo's destruction was, at the time, the most costly in recorded history, resulting in its name being permanently retired. Had the storm continued on its original course, damages would have been far greater, given the enormous buildup along the coast from Charleston to Savannah. As it was, the worst of the storm pummeled small towns and a national forest. Caribbean islands in Hugo's path suffered terribly, and agriculture in eastern South Carolina was essentially wiped out. Significantly, the Federal Emergency Management Agency (FEMA), which U.S. senator Fritz Hollings called a "bunch of bureaucratic jackasses," was woefully unprepared for managing assistance following the disaster. The hurricane thus resulted in a greater concern for disaster preparedness, especially at the local level. It also helped bring about passage of proposed legislation to protect barrier islands, and it convinced more people living in hurricane-prone regions to take seriously evacuation orders for their communities.
Further Reading
Boone, C. F. Frank. . . . and Hugo Was His Name. Charleston, S.C.: Boone, 1989. An account by a reporter for Charleston's News and Courier regarding the impact of Hugo.
Elsner, James B., and A. Birol Kara. Hurricanes of the North Atlantic: Climate and Society. New York: Oxford University Press, 1999. Provides climatology data for the twentieth century, analysis of hurricane climate research, and discussion of hurricane information, including dangers posed to catastrophe insurance.
Fraser, Walter J. Lowcountry Hurricanes: Three Centuries of Storms at Sea and Shore. Athens: University of Georgia Press, 2006. A history of more than eighty hurricanes and tropical storms along the Georgia-South Carolina seaboard from 1686.
Golden, Joseph H., Riley M. Chung, and Earl J. Baker. Hurricane Hugo: Puerto Rico, the U.S. Virgin Islands, and South Carolina—September 17-22, 1989. Washington, D.C.: National Academy Press, 1994. Important coverage of Hugo's impact from formation to termination.
Moore, Jamie W., with Dorothy P. Moore. Island in the Storm: Sullivan's Island and Hurricane Hugo. Charleston, S.C.: History Press, 2006. A College of Charleston history professor's account of Hugo, focusing on its social, economic, and environmental effects.
Simon, Seymour. Hurricanes. New York: HarperCollins, 2003. Succinct, general introduction to hurricanes.
Tait, Lawrence S., ed. Beaches: Lessons of Hurricane Hugo. Tallahassee: Florida Shore & Beach Preservation Association, 1990. Discusses the problem of and possible solutions to beach erosion.
Trimnal, Katherine J. Photographer's Notebook: Hurricane Hugo, September 21-22, 1989. Limited ed. Charleston, S.C.: n.p., 1991. Collection of photographs documenting the hurricane's destruction.
U.S. Department of Agriculture. Hurricane Hugo: South Carolina Forest Land Research and Management Related to the Storm. Asheville, N.C.: Southern Research Station, 1996. Hugo's high winds destroyed most of the mature trees in a swath from the coast to north of Charlotte. This report evaluates the impact of that loss.
William S. Brockington, Jr.
See also
Doppler radar; El Niño; Environmental movement; Natural disasters.
■ Hurt, William
Identification American actor
Born March 20, 1950; Washington, D.C.
Hurt became a major film star at the beginning of the decade and continued doing challenging work, winning an Academy Award. After several years of stage and television work, William Hurt made his film debut in 1980 with Altered
States. Though Paddy Chayefsky’s psychological drama was a critical and commercial failure, Hurt fared better with his second film, the romantic thriller Eyewitness (1981), as a janitor infatuated with a television anchorwoman played by Sigourney Weaver, with whom he had starred in the PBS series The Best of Families (1978). Hurt became a full-fledged star with writer-director Lawrence Kasdan’s Body Heat (1981), a very loose remake of Double Indemnity (1944). Hurt played a none-too-bright Florida lawyer tricked into killing the wealthy husband of femme fatale Kathleen Turner, in her film debut. The role was the first to allow Hurt to display his unique blend of self-deprecating humor, vulnerability, and sex appeal. Body Heat is generally considered one of the best film noirs since the genre’s heyday in the late 1940’s and early 1950’s. Hurt’s most popular film of the decade was Kasdan’s The Big Chill (1983). As seven college friends unite in the film to ponder their failures and disappointments, Hurt stands out as the group’s most philosophical member, a disillusioned, impotent Vietnam War veteran. Though he would become known for such intimate dramas, Hurt followed The Big Chill with another thriller, Gorky Park (1983). In this adaptation of Martin Cruz Smith’s best seller, Hurt played a Moscow police inspector investigating a complex triple murder. After starring as a sleazy Hollywood casting director in his first Broadway play, David Rabe’s Hurlyburly (1984), Hurt received the first of his three Academy Award nominations of the 1980’s for Kiss of the Spider Woman (1985), adapted from Manuel Puig’s 1976 novel. As a homosexual sharing a South American prison cell with a political prisoner (Raul Julia), Hurt created one of his most sensitive portrayals and won the Academy Award for Best Actor. He was also nominated for playing another sensitive character in Children of a Lesser God (1986), adapted from Mark Medoff’s Broadway play. Hurt portrayed a teacher at a school for the deaf who falls in love with one of his students, played by Oscar winner Marlee Matlin, who had a romance with Hurt in real life. Hurt displayed his comic skills in James L. Brooks’s Broadcast News (1987), earning another Oscar nomination for his portrayal of a dim-witted television anchor. He ended the decade by reuniting with Kasdan and Turner for The Accidental Tourist (1988), an adaptation of Anne Tyler’s novel. Hurt
played another of his melancholy characters, a travel writer who slowly recovers from the death of his son and the breakup of his marriage to Turner by falling for an eccentric dog trainer played by Geena Davis, who received an Academy Award for her performance.
Impact
Hurt quickly established himself as one of the decade's most popular leading men. Playing a series of confused, emotionally wounded characters with an understated Method approach earned him critical acclaim and box-office success.
Further Reading
Karger, Dave. "A World of Hurt." Entertainment Weekly, no. 865 (February 24, 2006): 26.
Linfield, Susan. "Zen and the Art of Film Acting." American Film 11, no. 6 (July/August, 1986): 28-33.
Millea, Holly. "The Star Who Walked Away." Premiere 11 (October, 1997): 120-130, 141.
Michael Adams
See also
Academy Awards; Big Chill, The; Close, Glenn; Film in the United States; Kiss of the Spider Woman; Turner, Kathleen; Weaver, Sigourney.
■ Hustler Magazine v. Falwell
Identification U.S. Supreme Court decision
Date Decided on February 24, 1988
The Supreme Court ruled that the First Amendment's free speech clause prevented Jerry Falwell, a prominent preacher and television personality, from successfully suing Hustler magazine for libel and intentional infliction of emotional distress for publishing a parody that portrayed Falwell as having engaged in sexual intercourse with his mother in an outhouse.
The U.S. Supreme Court has interpreted the free speech clause of the First Amendment to restrict the ability of public figures to sue for libel without demonstrating that the person sued had willfully made false statements of fact about the public figure. In Hustler Magazine v. Falwell, the Court concluded that the parody published by Hustler magazine could not be reasonably construed as making a statement of fact about Jerry Falwell. Since Falwell was a public figure, his libel claim against the magazine was rejected. The remaining issue was whether the constitutional protections of the freedoms of speech and the press would also apply in cases where a party claimed that he had suffered intentionally inflicted emotional distress because of another's speech.
Chief Justice William H. Rehnquist announced the Court's unanimous decision, in which Justice Byron White concurred in a separate opinion. Justice Anthony Kennedy did not participate in the case. Chief Justice Rehnquist concluded for the Court that such interests as a state might have in protecting public figures from intentionally inflicted emotional distress could not trump the First Amendment's protection of free speech. Speech about public figures could not always be expected to be temperate, and, so long as it did not constitute false statements of fact made deliberately or made with reckless disregard for whether the statements were false, such speech received the full protection of the First Amendment. Falwell had argued that although this standard was appropriate for claims of libel, in which the harm suffered by an individual was one to the individual's reputation, it ought not to apply to claims that the speech had inflicted emotional distress upon a victim. The Court rejected this argument, concluding that even speech that caused emotional distress to public officials was protected by the First Amendment, so long as it did not contain deliberately made false statements of fact.
Impact
Jerry Falwell, a prominent Baptist minister and television personality, achieved prominence during the 1980's for his leadership of the Moral Majority, a conservative political action group. However outrageous Hustler's parody may have been, though, Falwell's position as a public figure meant that freedom of speech would protect even this speech from liability. As a parody, the magazine's article could not reasonably be interpreted as a statement of fact, whether false or not. Though Falwell included not only a libel claim but also one asserting intentionally inflicted emotional damage, the Court, with essentially one voice, ruled that the First Amendment prevented his recovery. The victory of Larry Flynt in this case lent credence to his claim to be a crusader for civil liberties rather than merely a pornographer, and it helped establish that even pornographic magazines could function as legitimate organs of cultural commentary deserving of First Amendment protections.
Further Reading
Russomanno, Joseph. Speaking Our Minds: Conversations with the People Behind Landmark First Amendment Cases. Mahwah, N.J.: Lawrence Erlbaum Associates, 2002.
Stein, Laura. Speech Rights in America: The First Amendment, Democracy, and the Media. Urbana: University of Illinois Press, 2006.
Timothy L. Hall
See also Falwell, Jerry; Flynt, Larry; Supreme Court decisions.
■ Hwang, David Henry
Identification Chinese American playwright, screenwriter, and librettist
Born August 11, 1957; Los Angeles, California

Hwang became one of the most significant Asian American voices in the theater, as he explored the distinctive experience of Chinese Americans and produced groundbreaking work at the intersection of experimental and mainstream theater.

David Henry Hwang received a B.A. in English from Stanford University in 1978 and attended the Yale University School of Drama from 1980 to 1981. Hwang's first play, F.O.B. (1978), was initially produced at Stanford, but subsequently an Off-Broadway production won an Obie Award in 1980. F.O.B., whose title is an acronym for "fresh off the boat," reveals an immigrant's struggle to learn new customs without abandoning his cultural heritage. This theme recurred in several of Hwang's other plays, including The Dance and the Railroad (1981), Family Devotions (1981), The House of Sleeping Beauties (1983), The Sound of a Voice (1983), As the Crow Flies (1986), and My American Son (1987). These works earned ample recognition for Hwang, including a Rockefeller Fellowship (1983), a Guggenheim Fellowship (1984), and a National Endowment for the Arts Fellowship (1985).

Hwang's most influential play of the decade was M. Butterfly, which was produced on Broadway in 1988. The play won the Outer Critics Circle Award, the Drama Desk Award, and the Tony Award. The play was based on an article in The New York Times (May 11, 1986) that reported on a French diplomat in China whose twenty-year relationship with a Chinese opera singer ended following his conviction in Paris for espionage. The French diplomat had mistaken the male opera singer for a woman and believed that the singer had given birth to a child fathered by the diplomat. The singer had been cooperating with the Chinese authorities and had obtained confidential diplomatic information from the diplomat and delivered it to the Chinese.
With this fantastic yet historical basis, M. Butterfly also borrowed from Giacomo Puccini's Madama Butterfly (1905; Madame Butterfly). Musical themes from that opera heighten the effect of Hwang's play, and the story of Pinkerton's betrayal of Cio-Cio-San in Madame Butterfly underlies Hwang's story of betrayal involving Gallimard and Song. From the perspective of the late 1980's, members of Hwang's audience can see the effects of the Cultural Revolution in China; the diplomatic tensions among the United States, France, and China during the Vietnam War; the misunderstandings and stereotypical impressions that contributed to the West's view of the East; and the dangers of basing a love relationship on fantasy rather than reality.

Impact
M. Butterfly was one of the major plays of the late twentieth century, and it elevated the already successful Hwang to the heights of the American theater. The play was at once an emotionally engaging narrative and an exploration of Orientalism, particularly of the intersection of traditional Western representations of Asians and of gender. It was therefore significant both as drama and as cultural criticism. Hwang ended the 1980's at the height of his career, and he continued to expand his creative horizons thereafter.
Further Reading
Hwang, David Henry. Afterword to M. Butterfly. New York: New American Library, 1990.
______. Introduction to F.O.B., and Other Plays. New York: New American Library, 1990.
Shimakawa, Karen. "Who's to Say? Or, Making Space for Gender and Ethnicity in M. Butterfly." Theatre Journal 45 (October, 1993): 345-367.
William T. Lawlor
See also Asian Americans; China and the United States; Homosexuality and gay rights; Multiculturalism in education; Theater.
I

■ Iacocca, Lee
Identification Automobile industry executive
Born October 15, 1924; Allentown, Pennsylvania

A business turnaround specialist, Iacocca brought the Chrysler Corporation back from financial ruin through strategic business planning and creative advertising practices. Iacocca represented the consummate business executive, and his name became associated both with corporate success and with the American automobile industry.

By 1980, Lee Iacocca, the chief executive officer of Chrysler Corporation, had already pulled off a major business coup. Iacocca had secured a guaranteed loan of $1.5 billion from the United States to stave off the near bankruptcy of the world's third largest automobile company. During the decade, Iacocca developed and executed a series of business strategies to return Chrysler from the brink of financial disaster. He introduced new products such as the K-car, a compact front-wheel-drive automobile that was hailed as both economical and efficient in a time of painfully high gasoline prices for American consumers.

Iacocca also sought financial relief and cooperation from Chrysler's major stakeholders. To that end, he successfully negotiated $1 billion in wage and benefit givebacks with the leadership of the United Auto Workers (UAW), a major labor union. He also restructured Chrysler by downsizing the workforce considerably, by eliminating managerial positions, and by closing company operations and factories that he deemed inefficient. To further buttress Chrysler's financial position, he sought and received better business terms and conditions from external stakeholders, such as financial institutions and corporate suppliers. He led by example when he reduced his salary as chairman from $360,000 per year to $1.

In an unprecedented move by a corporate executive, Iacocca appeared personally in television commercials to promote Chrysler automobile sales. Ever the consummate salesman, Iacocca appealed to the American public in two ways: first, by asking Americans to buy an American car, and second, by challenging the public to purchase a Chrysler product with his famous mantra: "If you can find a better car, buy it!" His appearance in television advertising spots thrust Iacocca into the public limelight, and he became a major American personality. Iacocca received further public recognition when President Ronald Reagan appointed him to lead the Statue of Liberty-Ellis Island Centennial Commission in May, 1982.

Chrysler chairman Lee Iacocca stands in front of a Lamborghini Countach, celebrating his company's acquisition of Lamborghini in 1987. The deal served to confirm Chrysler's ascendancy in a decade defined by mergers and acquisitions. (AP/Wide World Photos)
Under Iacocca’s leadership, Chrysler reversed its poor fortune. In 1983, Chrysler repaid its guaranteed loan in full, and shortly thereafter the once nearly defunct automobile company realized a $2.4 billion profit. In 1984, Iacocca cowrote Iacocca: An Autobiography, which became an instant success. With the company out of debt, Iacocca put some of Chrysler’s profits to work when in 1985, he spent more than $750 million to acquire Gulfstream Aerospace Corporation and E. F. Hutton Credit Corporation. His success at Chrysler and his national visibility and celebrity status hurled him into the spotlight as a potential U.S. presidential candidate in 1988. During the last few years of the 1980’s, Iacocca’s iconic status was slightly tarnished, after his $18 million compensation package created a stir among critics who believed that executive compensation was excessive, especially in the light of the threat to the U.S. industry posed by Japanese imports. Impact Lee Iacocca restored pride in American automobiles. He returned Chrysler to a position of prominence as an American car manufacturer and convinced the American public to purchase Chrysler products. Further Reading
Iacocca, Lee, with William Novak. Iacocca: An Autobiography. New York: Bantam, 1984.
Levin, Doron P. Behind the Wheel at Chrysler: The Iacocca Legend. New York: Harcourt, Brace, 1995.
Joseph C. Santora
See also Business and the economy in the United States; Chrysler Corporation federal rescue; Reagan, Ronald; Reaganomics; Statue of Liberty restoration and centennial.
■ Immigration Reform and Control Act of 1986
Identification Federal legislation
Date Signed into law on November 6, 1986
The Immigration Reform and Control Act was enacted to control illegal immigration to the United States, especially that from Mexico. Among its other provisions, it created sanctions against anyone employing illegal immigrants.

During the second half of the twentieth century, the number of people immigrating to the United States, both legally and illegally, increased.
Mexican immigration contributed significantly to this pattern. The United States had begun officially encouraging Mexican laborers to enter the country in 1942, when it instituted the bracero program to ensure that there were enough agricultural laborers to work U.S. farms during World War II. The program continued to bring Mexican contract workers to the United States until it was discontinued in 1964, and it helped establish a pattern in which most Mexican immigrants to the country became agricultural laborers. Despite the bracero program's official end, Mexicans continued to come to the United States. In fact, the 1970's and the 1980's saw a sharp increase in Mexican immigration. Many people continued to come from Mexico specifically to work in agricultural jobs, even those who, in the absence of the bracero program, could no longer do so legally.

The 1980's saw a shift in immigration patterns. Mexican immigrants started to move to U.S. cities, becoming urban dwellers. Previously, their agricultural occupations had led them to settle in rural areas.

The way that Mexican immigrants were viewed by the U.S. government also began to change through the 1970's and into the 1980's. As more immigrants bypassed U.S. customs stations and border guards to enter the country illegally, the government became more concerned. Because Mexico and the United States share a long border, Mexico became a primary focus of that concern. Many people were able to cross the border without encountering a U.S. immigration officer or undergoing any sort of inspection. Such illegal immigration increased sharply in the 1970's and the 1980's.

The White House, Congress, a select commission, and numerous task forces began seriously to address illegal immigration as early as the 1970's. The Immigration Reform and Control Act (IRCA) of 1986 represented a culmination of this activity. The IRCA was passed in the closing days of the 1986 congressional session. The primary purpose of the act was to remove from the U.S. labor market those immigrants not legally entitled to participate in it. The primary sponsors of the IRCA were Republican senator Alan Simpson of Wyoming and Democratic representative Romano Mazzoli of Kentucky. Known in its early years as the Simpson-Mazzoli Act, the IRCA enacted two primary policy instruments. First, it granted legal status to certain illegal immigrants—those who could prove to the Immigration and Naturalization Service (INS) that they had been living continuously in the United States since 1982.
Second, it imposed financial and legal penalties, known as employer sanctions, on employers who knowingly hired illegal immigrants. This measure was designed to reduce the demand for illegal labor, which was cheaper than legal labor. To accomplish this goal, the act created the I-9 form, a document attesting to one's legal employability in the United States, which all potential employees in the nation were required to file.

One of the goals of the IRCA was to increase the number of independent immigrants—those with no family connections—in the United States. Previously, potential immigrants with siblings already residing in the United States were given preference in obtaining visas to enter the country. The IRCA eliminated this "sister-brother" preference.

The IRCA was signed into law by President Ronald Reagan on November 6, 1986.

Impact
The Immigration Reform and Control Act of 1986 sought to curb a trend toward the exploitation of illegal labor in the United States. This trend had created an underground market for cheap labor that affected the "legitimate" labor market and caused a great many people, both Mexican and of other nationalities, to enter the United States illegally. The pattern continued after the act's passage, so it did not seem to accomplish its goal. However, as a result of the act, more than 2 million immigrants were removed from the underground labor market by being made legal residents of the United States.

Further Reading
Hufbauer, Gary Clyde, and Jeffrey J. Schott. "The Immigration Reform and Control Act of 1986." In NAFTA Revisited: Achievements and Challenges. Washington, D.C.: Institute for International Economics, 2005.
Magaña, Lisa. Straddling the Border: Immigration Policy and the INS. Austin: University of Texas Press, 2003.
Zolberg, Aristide R. "Reforming the Back Door: The Immigration Reform and Control Act of 1986 in Historical Perspective." In Immigration Reconsidered: History, Sociology, and Politics. New York: Oxford University Press, 1990.
Alison Stankrauff
See also Business and the economy in the United States; Conservatism in U.S. politics; Demographics of the United States; Immigration to the United States; Latin America; Latinos; Mexico and the United States; Reagan, Ronald; Unemployment in the United States.
■ Immigration to Canada
Definition Arrival and settlement of people from other countries in Canada
Following the adoption of a new immigration law, Canadian immigrants became more diverse in ethnic and racial origins. Asians, in particular, made up a larger portion of the immigrants admitted to Canada in the 1980's.

At the end of the 1970's, Canada liberalized its immigration law with the Immigration Act of 1976. This law prohibited discrimination against immigrants on the basis of race, national or ethnic origin, religion, or sex. It described immigration as a positive means of achieving national goals, and it established a separate class for refugees.

The number of Canadian immigrants did not begin to increase immediately following the adoption of the new law. Only in the late 1980's did such an increase occur. Between 1981 and 1986, about 678,000 people immigrated to the country; between 1986 and 1991, about 1,164,000 people immigrated. However, the abolition of discrimination contributed to the beginning of an increase in the national and ethnic diversity of immigrants, and it led to an especially rapid rise in immigrants from Asia. Between 1981 and 1990, total Asian migration to Canada rose to 443,545—up from 311,270 between 1971 and 1980 and from 90,065 between 1961 and 1970. These Asian immigrants of the 1980's came from several different countries, but the top three were Hong Kong, India, and Vietnam.

Immigration from Hong Kong and India
Long a center of trade and manufacturing, Hong Kong emerged during the 1970's as a major world center of finance and banking. At the same time, though, many people in Hong Kong and elsewhere in the world began to question how long the island would remain under British control and when it would be returned to China. During the early 1980's, as the end of a ninety-nine-year lease from China to the United Kingdom on territories neighboring Hong Kong approached, the British began to believe that they could not maintain Hong Kong and that they should negotiate its return to China.
In December, 1984, the United Kingdom and China signed an agreement under which Hong Kong would become a special administrative region of the People's Republic of China in 1997. Although the agreement between the two nations stipulated that Hong Kong would retain its capitalist system for at least fifty years after the turnover, many in Hong Kong were worried about the prospect of communist rule. Canada was an especially appealing location for resettlement, especially for Hong Kong's professional and business class, because it offered a high standard of living and favorable immigration policies to immigrants who could contribute to the Canadian economy. Although there had been very little immigration from Hong Kong to Canada before the 1960's, during the 1970's Hong Kong became the eighth largest source of Canadian immigrants. In the 1980's, with the agreement between the United Kingdom and China, the small island of Hong Kong became the number-one source of migration to the North American nation. Immigrants to Canada from Hong Kong rose from 12,580 during the 1960's, to 41,270 during the 1970's, to 76,980 during the 1980's.

The people of India, meanwhile, did not face the political concerns faced by the people of Hong Kong. Nevertheless, India had a large class of highly educated professionals and businesspeople. In India, a former British colony, fluency in English was also widespread. While immigration from India to Canada did not attract as much media attention as did immigration from Hong Kong, India was not far behind Hong Kong in the movement of its natives to the North American nation. The 25,080 immigrants from India in 1961-1970 rose to 67,375 in 1971-1980 and remained at approximately the same level (68,080) in 1981-1990.

Southeast Asian Refugees
The special category for refugees that was created by the Immigration Act of 1976 made Canada a place of welcome for refugees fleeing Southeast Asia following the end of the Vietnam War in 1975. Southeast Asian refugees did not settle in Canada in the same numbers that they settled in the United States, but the growth in their rate of immigration was notable nonetheless. In 1975, Canada admitted 2,269 people from Vietnam. As in the United States, it was believed that this influx of refugees would be a one-time event. However, as refugees continued to flee from the Southeast Asian country, Canada began a government-sponsored program of refugee resettlement in 1979. The first Vietnamese refugees to be welcomed under this program arrived by air at Toronto's Pearson International Airport in July, 1979, from refugee camps in Hong Kong. During 1979, just under 20,000 Vietnamese resettled in Canada. Government programs attempted to use private sponsors to resettle these new arrivals around the country. The number of Vietnamese arriving in Canada went up to 25,541 in 1980. From 1981 to 1990, a total of 65,490 people from Vietnam immigrated to Canada.

As the Vietnamese Canadian population grew, its members increasingly left the scattered locations where they had initially been settled and moved to large cities, where they formed ethnic communities. Among other places, large Vietnamese communities emerged in Ottawa and Vancouver.

Impact
The trends in immigration seen during the 1980's made Canada a more ethnically diverse society. The nation was extremely concerned with diversity and cultural rights in the early 1980's as it enacted a new constitution. However, Canadians' focus at that time was almost exclusively on the interrelations of British Canadians, French Canadians, and First Nations peoples. By the end of the decade, many other groups had made their presence known within Canada's diversifying population. By the end of the twentieth century, for example, one out of every thirty Canadians was of Chinese descent. New ethnic communities formed in many of Canada's large southern cities, redefining the nation's urban culture.

Further Reading
Cameron, Elspeth. Multiculturalism and Immigration in Canada: An Introductory Reader. Toronto: Canadian Scholars' Press, 2004. Collection of readings that provides an introduction to the history of multicultural ideology in Canada and shows the connection of immigration issues to issues of multiculturalism.
Folson, Rose Baaba. Calculated Kindness: Global Restructuring, Immigration, and Settlement in Canada. Halifax, N.S.: Fernwood, 2004. Set of case studies of immigrants that disapprovingly argues immigrants are admitted to serve Canada's economic interests.
Li, Peter. Destination Canada: Debates and Issues. New York: Oxford University Press, 2003. Examination of Canadian immigration policies that argues against efforts to limit immigration to Canada.
Carl L. Bankston III
See also Asian Americans; Immigration to the United States; Minorities in Canada.
■ Immigration to the United States
Definition Arrival and settlement of people from other countries in the United States
During the 1980’s, the increased immigration that had begun after 1965 intensified. Hispanics and Asians, who had started to make up a growing number of American immigrants during the 1970’s, immigrated in even greater numbers in the 1980’s, so these two demographics come to account for more than three-quarters of all the foreign-born people in the United States. The 1980’s saw a number of refugee movements, and undocumented immigration, especially from Mexico, became an issue that drew widespread public and official attention. The 1980’s were a decade of heavy immigration to the United States. The nation had seen a great wave of immigration between 1880 and the start of World War I. After the war, restrictive immigration legislation passed in the 1920’s, the decline of economic opportunities during the Great Depression of the 1930’s, and World War II during the 1940’s brought immigration to a low point in American history. This situation began to change after the United States liberalized its immigration policies in 1965. The number of U.S. immigrants increased rapidly during the 1970’s, and the increase grew even greater during the 1980’s. Of the estimated 21,596,000 foreignborn people living in the United States in 1990, about 43 percent had arrived during the 1980’s. New Immigrant Demographics
The United States had begun to draw more Asian immigrants during the 1970’s, and this trend intensified in the 1980’s. Of the foreign-born U.S. residents in 1990, less than 8 percent of those who had arrived in the country before 1970 were Asian; a little over one-fourth (26
The 1980's also continued a trend of growing Hispanic immigration. Again, among foreign-born residents in 1990, a little under one-third (31 percent) of those who had immigrated before 1970 were Hispanic; 43 percent of those who had arrived in the 1970's were Hispanic; and 46 percent of those who had arrived in the 1980's were Hispanic.

As a consequence of the growth in Asian and Hispanic immigrants, these two groups played a much larger role in the demographics of the nation by the end of the decade. In 1980, Hispanics had made up 6 percent of the total U.S. population and 30 percent of its foreign-born population. Ten years later, Hispanics constituted 9 percent of the population and 40 percent of the foreign-born population. Asians made up slightly over 1.5 percent of the U.S. population and 12 percent of the immigrant population in 1980, and those numbers rose to 3 percent of the total population and 23 percent of the immigrant population in 1990.

Mexico sent more immigrants to the United States than did any other country during this decade, with 1,655,843 legally admitted immigrants moving from Mexico to its northern neighbor between 1981 and 1990. This number represented nearly one-fourth of all legal immigrants. The Philippines was a distant second, with 548,764 legal immigrants, or a little over 7 percent of all those admitted to the United States between 1981 and 1990. Other countries sending large numbers of legal immigrants to the United States included China (346,747 immigrants), South Korea (333,746), Vietnam (280,782), the Dominican Republic (252,035), India (250,786), El Salvador (213,539), and Jamaica (208,148).

Refugees
During the 1980's, more refugees entered the United States than in any other decade in American history. Approximately one million refugees arrived in the country from 1980 through 1989. Following the end of the Vietnam War in 1975, Southeast Asians began to resettle in the United States. Largely in response to the movement of Southeast Asian refugees, the U.S. Congress passed the Refugee Act of 1980, the most comprehensive piece of refugee legislation in U.S. history. As a result, hundreds of thousands of refugees from Vietnam, Cambodia, and Laos were resettled in North America in the early 1980's.
In 1980 alone, over 170,000 people from these three countries entered the United States. By 1990, the United States had received 149,700 arrivals from Cambodia, 214,100 arrivals from Laos, and 687,800 arrivals from Vietnam.

Cuban refugees had been coming to the United States since the 1960's. A first wave from Cuba had left the island nation between 1959 and 1962, following the revolution led by Fidel Castro. A second wave followed from 1965 to 1974, when the Cuban and American governments agreed to arrange flights between the two countries for Cubans who wished to leave. The Cuban refugee flow slowed substantially after the halting of the flights. In 1980, though, the Cuban government faced internal unrest. This unrest led to a third wave of Cuban refugees. Hoping to ease public unrest on the island, the Cuban government decided to open the port city of Mariel for unrestricted emigration. Vessels from Mariel brought more than 125,000 refugees from Cuba to the United States over a six-month period.

The Mariel boatlift became highly controversial, because the Cuban government placed some convicted felons and inmates of mental institutions on the boats. Some of the Marielitos, as members of this third wave came to be known, became involved in criminal activities in their new country.
Although U.S. immigration officials determined that only about twenty-five hundred individuals were legally excludable because of criminal activities or other undesirable characteristics, many Americans believed that the Cuban government had used the boatlift to dump its socially unacceptable citizens on U.S. shores.

Refugees from the Soviet Union also made up a part of the refugee movement to the United States beginning in the 1970's and continuing into the 1980's. Over 100,000 refugees came to the United States from the Soviet Union between 1970 and 1988. Most of these refugees were Soviet Jews, but they also included members of Christian religious groups and ethnic minorities.
Undocumented Immigration
The 1980's saw increasing concern over undocumented immigration, also called illegal immigration. Numbers of immigrants entering the United States illegally showed an apparent sharp increase, rising from an estimated 130,000 undocumented immigrants each year during the 1970's to an estimated 300,000 per year in the 1980's. Mexico, the largest source of legal migration to the United States, was also the largest source of undocumented migration. This was due both to economic problems in Mexico and to the demand for Mexican workers in the United States. Over 70 percent of Mexico's export revenues came from oil at the beginning of the 1980's. As the price of oil declined beginning around 1982, Mexico had less revenue coming in, and the poverty of its citizens increased. At the same time, American companies began hiring Mexican immigrants for relatively low wages to work in the poultry industry, for carpet manufacturers, and at other labor-intensive jobs.

Concern over undocumented immigration led to the Immigration Reform and Control Act (IRCA) of 1986. This act created fines for employers who hired illegal immigrants, and it offered legal status to immigrants who had entered the country illegally before January 1, 1982.

During the Mariel boatlift, a tugboat laden with Cuban refugees sails toward Key West, Florida, on May 6, 1980. (AP/Wide World Photos)
The goal of the legislation was to stop encouraging new undocumented immigrants to enter the country by removing the jobs that were drawing them to the United States. It also sought to bring the large underground population of undocumented workers into the mainstream of economic and social life by enabling them to obtain legal status. However, undocumented immigrants continued to cross the U.S. border in rising numbers through the rest of the twentieth century.

Patterns of Settlement
The Northeast had been the region receiving the greatest number of immigrants during the great immigration wave of the late nineteenth and early twentieth centuries. Post-1965 immigrants, though, frequently gravitated toward the West Coast. This trend increased during the 1980's. The U.S. Census of 1990 indicated that over 38 percent of all immigrants who had arrived in the United States during the previous ten years had settled in the Pacific Division, comprising California, Oregon, Washington, and Alaska. Among immigrants who had arrived during the 1970's, under 37 percent were living in the Pacific Division, compared to one-fourth of immigrants who had arrived before the 1970's and only 14 percent of non-immigrants.

California had become the nation's immigrant magnet. It was home to 35 percent of the foreign-born people who had immigrated during the 1980's and 31 percent of all immigrants. Long a state with a large Hispanic population, California had become an especially favored destination for Asian immigrants by the 1980's. By 1990, California was home to 40 percent of Asian immigrants and just under 40 percent of all Asians in the United States. Asian communities, such as Orange County's Vietnamese American Little Saigon, became prominent parts of the landscape, particularly in Southern California.

New York, a traditional immigrant destination since the nineteenth century, did continue to attract newcomers to the United States in the next-to-last decade of the twentieth century. Second only to California as a place of settlement, New York held 14 percent of American immigrants at the end of the 1980's, with most of them concentrated in New York City. However, New York saw a relative decline as an immigrant location, and in 1990 it held only 18 percent of immigrants who had reached the United States before 1970.
Outside of California and New York, Texas and Florida were home to large proportions of immigrants who arrived during the 1980's. In the case of Florida, this was a situation that had existed since the 1960's. Texas had begun to receive a disproportionate share of immigrants during the 1970's, and this situation continued for the rest of the century.

Cubans made up the single largest immigrant group in Florida. However, the largest waves of movement from Cuba to Florida had occurred in the 1960's and 1970's, with the roughly 125,000 brought by the Mariel boatlift making up most Cuban migration during the 1980's. As a result, the Cuban share of the foreign-born population had decreased slightly by 1990. In that year, only 18 percent of the Floridian foreign-born residents who had immigrated during the 1980's were Cuban, compared with 35 percent of those who had immigrated before 1970.

Most immigrants to Texas came from just south of the border, from Mexico. About 58 percent of all immigrants living in Texas and 55 percent of immigrants arriving during the 1980's were Mexicans. The slight decline in the Mexican share of Texas immigration was not a result of declining Mexican migration, but rather of increasing migration from other places of origin, notably Central America. Nearly one out of ten immigrants in Texas who arrived during the 1980's came from one of the Central American nations, compared to only about 2-3 percent in earlier decades.

Following the new immigration of the 1980's, California, Florida, and Texas showed demographic changes that were even more marked than those that occurred in the rest of the nation. In 1980, one of twenty Californians was Asian. Ten years later, Asians accounted for one out of every ten Californians. California's Hispanic population grew from 19 percent in 1980 to 26 percent in 1990, so by the end of the 1980's, well over one-third of the people in California were either Hispanic or Asian. In Florida, Hispanics represented 9 percent of the population in 1980 and 12 percent in 1990. The Hispanic population of Texas increased from 21 percent to 26 percent in that ten-year period.

Impact
Immigration in the 1980's produced a much larger foreign-born population in the United States, and it contributed to a much more ethnically diverse population.
Immigration, especially undocumented immigration from Mexico, became a major political issue during this decade, which was simultaneously characterized by increasing wealth and decreasing real wages. The notion of immigrants "stealing" American jobs gained traction, as working-class Americans found themselves struggling in a growing economy, and middle-class Americans found that they needed to work more hours than in previous decades. At the same time, American businesses began to benefit in a systematic way from employing immigrant laborers who were willing to work for less money than were native-born citizens.

Further Reading
Brimelow, Peter. Alien Nation: Common Sense About America's Immigration Disaster. New York: Random House, 1995. Controversial book that criticizes America's post-1965 immigration policy for changing the ethnic makeup of the United States.
Portes, Alejandro, and Ruben B. Rumbaut. Immigrant America: A Portrait. Rev. 2d ed. Berkeley: University of California Press, 1997. Describes America's immigration and immigrants during the last third of the twentieth century.
Waters, Mary, and Reed Ueda, eds. The New Americans: A Guide to Immigration Since 1965. Cambridge, Mass.: Harvard University Press, 2007. Comprehensive work, covering major issues in immigration and immigrant groups.
Zhou, Min, and Carl L. Bankston III. Growing Up American: How Vietnamese Children Adapt to Life in the United States. New York: Russell Sage Foundation, 1998. Examines the lives of young people in the largest of the Southeast Asian refugee groups. Contains a brief history of Southeast Asian resettlement in the United States from 1975 to the middle of the 1990's.
Carl L. Bankston III
See also Asian Americans; Demographics of the United States; Immigration Reform and Control Act of 1986; Immigration to Canada; Latinos; Mariel boatlift; Mexico and the United States; Soviet Union and North America.

■ Income and wages in Canada
Definition Earning and payment of money, deriving from capital or labor, in Canada

In developed economies like that of Canada, the overwhelming majority of the population supports itself from money wages and income.

By 1987, Canada's labor force comprised slightly more than 13 million people, of whom a bit more than 1 million were unemployed. The labor force was equal to about one-half of the total population. During the 1980's, Canada enjoyed one of the highest rates of job creation in the developed world, but many of the new workers were married women who often held part-time or temporary jobs. In an effort to counteract the effects of the recession of the early 1980's, the Canadian government had encouraged job sharing, which eased the burden on Canada's unemployment insurance system but left many working only part-time. Despite the creation of some 1.3 million new jobs, Canada's unemployment rate remained stubbornly high. Those who were laid off during this period remained unemployed, on average, for 17.3 weeks, while some 20 percent remained unemployed for more than six months.
Average Income of Families and Unattached Individuals in Canada*

Year | Average Family Income (in constant 1995 dollars) | Average Income of Unattached Individuals (in constant 1995 dollars)
1980 | 55,061 | 23,018
1981 | 54,214 | 24,246
1982 | 52,869 | 24,056
1983 | 52,304 | 22,972
1984 | 52,135 | 23,077
1985 | 53,472 | 23,647
1986 | 54,462 | 22,673
1987 | 55,156 | 24,120
1988 | 56,366 | 24,425
1989 | 58,024 | 25,192

*Figures exclude self-employment farm income.
Source: Centre for International Statistics, using Statistics Canada, Income Distributions by Size in Canada, 1995.
Unemployment was most likely to affect those with less than eight years of schooling. The problem was most severe in the Atlantic provinces (New Brunswick, Nova Scotia, Newfoundland, and Prince Edward Island), where unemployment persisted at double-digit levels throughout the 1980's.

While Canada was relatively successful at creating new jobs, the pay received for those jobs remained stagnant throughout the 1980's, which, coupled with the relatively high rate of inflation, meant that the incomes of many Canadians actually fell, in contrast to earlier decades. In 1988, the annual average earnings of a Canadian worker were 29,969 Canadian dollars (C$), a sum that was in fact less than that worker had earned ten years earlier. The highest wages were paid by the mining and oil industries, but they represented a relatively small part of the workforce. The lowest wages were paid by the service industries, which had the greatest number of employees.

One important factor in the earnings of Canadian workers was the rate of unionization. In 1983, when the economy was just coming out of recession, 40 percent of Canada's workforce was unionized, though most of those workers were employed in the public sector; 28 percent of private-sector employees were unionized.

Impact
Wages and salaries in Canada are traditionally slightly below those of their U.S. counterparts. In the 1980's, this circumstance was made worse by the fact that the productivity of Canadian workers grew at a dismal 0.2 percent per annum, even more slowly than that of workers south of the border. Canadian income and wages would continue to lag behind those in the United States until the 2005 oil boom in Alberta changed the picture.

Further Reading
Bothwell, Robert, Ian Drummond, and John English. Canada Since 1945. Rev. ed. Toronto: University of Toronto Press, 1989.
Crane, David. The Next Canadian Century: Building a Competitive Economy. Toronto: Stoddart, 1992.
Organization for Economic Cooperation and Development. Economic Surveys: Canada. Paris: Author, annually.
Nancy M. Gordon
See also Business and the economy in Canada; Canada and the United States; Demographics of Canada; Inflation in Canada; Unemployment in Canada; Unions.
■ Income and wages in the United States
Definition Earning and payment of money, mostly deriving from capital or labor, in the United States
While hourly wage rates and weekly earnings rose substantially during the 1980's, they did not keep pace with rising prices, so real wages actually declined. At the same time, however, per capita disposable income, adjusted for inflation, increased. Many critics argued that the U.S. industrial economy was being undermined by international competition, reducing the number of good jobs and widening the inequality of wealth between the nation's richest and poorest workers.

Between 1980 and 1989, the average weekly earnings of employees in private U.S. nonagricultural industries increased from $235 to $334. This increase was not quite enough to keep up with rising prices, however, so real wages—that is, the purchasing power of wage income—declined by about 4 percent. The figures on average yearly wage rates tell a similar story. Over the decade, labor productivity—that is, the value of output produced per hour of labor—increased by more than 10 percent. As productivity increased, however, real wages decreased. To a certain extent, these statistics reflect a lack of reinvestment of profits in the workers producing them. However, they can be misleading, because they do not take into account fringe benefits—including medical insurance and pension contributions—which increased substantially during the decade. Moreover, the measured price index is not always an accurate measure of the real consumer experience of inflation. It may overestimate inflation by disregarding new and improved products and services, and it may either over- or underestimate it by disregarding volatile but essential commodities such as oil and food.
Annual Income of U.S. Households, 1980 and 1989
(Columns two through nine give the percentage of households in each income range.)

Group, Year | Number of Households (in 1,000s) | % Under $10,000 | % $10,000-$14,999 | % $15,000-$24,999 | % $25,000-$34,999 | % $35,000-$49,999 | % $50,000-$74,999 | % $75,000 and Over | Median Income in Dollars*
All Households, 1980 | 82,368 | 17.4 | 10.1 | 19.8 | 17.0 | 18.2 | 12.2 | 5.4 | 26,651
All Households, 1989 | 93,347 | 15.6 | 9.7 | 17.9 | 15.9 | 17.3 | 14.5 | 9.0 | 28,906
White, 1980 | 71,872 | 15.4 | 9.7 | 19.7 | 17.4 | 19.1 | 12.9 | 5.9 | 28,117
White, 1989 | 80,163 | 13.6 | 9.4 | 17.8 | 16.2 | 18.1 | 15.2 | 9.6 | 30,406
Black, 1980 | 8,847 | 33.3 | 13.8 | 20.8 | 13.9 | 11.1 | 5.8 | 1.3 | 16,198
Black, 1989 | 10,486 | 31.3 | 11.9 | 19.6 | 13.8 | 12.0 | 8.5 | 3.0 | 18,083
Hispanic, 1980 | 3,906 | 22.2 | 13.8 | 23.8 | 16.3 | 14.7 | 6.9 | 2.3 | 20,543
Hispanic, 1989 | 5,933 | 21.5 | 12.3 | 21.9 | 15.8 | 14.7 | 9.6 | 4.2 | 21,921

*Figures are in constant 1989 dollars.
Source: Statistical Abstract of the United States, 1991.
Household Real Incomes
A very different story from that of U.S. wages in the 1980's is told by household disposable incomes of the same period. Adjusted for inflation, per capita disposable income rose by 18 percent and consumption rose by 20 percent during the decade. Part of this increase occurred because incomes from property and self-employment rose more than did labor incomes. These forms of income went mainly to wealthier families. Even so, inflation-adjusted median family disposable income (a better measure of the experience of the overall population than is average, or mean, income) rose by about 8 percent. This increase occurred mainly because a higher proportion of the population was working. The unemployment rate declined substantially, from 7 percent in 1980 to 5 percent in 1989. Labor-force participation (as a percentage of the population aged sixteen and over) rose from 64 percent in 1980 to 67 percent in 1989.

Government data indicated that 13 percent of the population had incomes below the poverty level in 1980. This increased to 15 percent in 1982-1983 as a result of the recession, then returned to about the initial level by 1989. Poverty was higher among female-headed households and in African American families: As a result, about one-half of African American, female-headed households were in poverty during the decade.

The federal statutory minimum wage had been raised several times during the inflationary 1970's. These increases had resulted in its reaching $3.35 per hour in 1981, where it remained until April, 1990, when it increased to $3.80 per hour.

An important source of rising property income during the 1980's was high interest rates. Interest and dividend income rose from 15 percent of personal income in 1980 to 18 percent in 1989. Much of this wealth accumulated in the pension-fund assets of people still working.

The U.S. labor market passed a watershed at the end of 1979: Employment in manufacturing reached a peak and began a slow decline. Manufacturing output continued to rise, but productivity was rising even faster, meaning that fewer workers were required to do the same amount of work. The result was a much-lamented erosion in the availability of production-line jobs with good pay. Such jobs were desirable, because they were accessible to persons with only a high school education. A large proportion of the 17 million additional jobs created in the 1980's were in the trade and services sector.

Unions
The shift in employment also meant a decline in the relative importance of union membership.
Membership in unions affiliated with the American Federation of Labor-Congress of Industrial Organizations (AFL-CIO) was relatively constant over the decade, but such other traditional stalwarts as mine workers, auto workers, steel workers, and garment and textile workers all showed significant declines in union membership. In the automobile industry, several Japanese manufacturers opened facilities in the South. These were generally non-union; several of the host states had "right-to-work" laws, which prohibited making union membership a condition of employment. The new plants were thus able to operate with far lower labor costs than were the Detroit-based Big Three automakers.

Union membership took a major hit in 1981. In August of that year, thirteen thousand members of the Professional Air Traffic Controllers Organization went on strike after rejecting a government contract offer. President Ronald Reagan dismissed the strikers, and many of their jobs were filled by newcomers. The publicity generated by the strike and its eventual defeat significantly harmed the public perception of unions.

Union leaders became convinced that good jobs were being lost in the United States because of competition from imports and that wages were being held down by immigration. They lobbied hard for restrictions on both imports and immigration, but without much success. The upstart Service Employees International Union, led at the time by John Sweeney, was able to recruit aggressively among low-income workers such as janitors and custodians. Union membership expanded among people working for government, particularly schoolteachers and employees of state and local governments. Nevertheless, the favorable macroeconomic conditions of the 1980's led to far fewer work stoppages than had occurred in the past. In the 1970's, there had been more than two hundred stoppages per year. After 1982, the number fell well below one hundred.

Impact
Many Americans benefited from the decline of inflation and the strong labor market that prevailed after 1982. However, real wages either declined or stayed relatively constant, depending on how one interprets the price indexes. The average household did enjoy a modest increase in real income, but it had to do more work to get it. The most significant income benefits of the decade went to families that were already earning more money, and the extent of economic inequality in the United States increased.
Further Reading
Cox, W. Michael, and Richard Alm. Myths of Rich and Poor: Why We're Better Off than We Think. New York: Basic Books, 1999. Examines data on consumption, asset ownership, and social mobility to argue that most Americans enjoyed substantial improvement in their economic lives between the 1970's and the 1990's.
Mishel, Lawrence, and David M. Frankel. The State of Working America. Armonk, N.Y.: M. E. Sharpe, 1990-1991. Paints a bleak picture of declining real wages and increasing poverty and inequality.
Phillips, Kevin. The Politics of Rich and Poor: Wealth and the American Electorate in the Reagan Aftermath. New York: Harper & Row, 1990. Argues that public policies of the 1980's substantially shifted income from the poor to the rich.
Paul B. Trescott
See also Air traffic controllers' strike; Business and the economy in the United States; Immigration Reform and Control Act of 1986; Immigration to the United States; Inflation in the United States; Reaganomics; Unemployment in the United States; Unions.
■ Indian Gaming Regulatory Act of 1988
Identification Federal legislation
Date Went into effect on October 17, 1988
The Indian Gaming Regulatory Act authorized Native American tribes to operate gambling facilities, using all games tolerated elsewhere in a state. It required states to show good faith in negotiating compacts that specified the conditions for establishing and operating such casinos.

Since a Supreme Court ruling in the early nineteenth century, the United States has recognized that state laws do not apply to Native American tribes on reservations unless those laws are specifically authorized by an act of Congress. When some tribes built casinos and bingo parlors in the 1980's, several states claimed the right to regulate those facilities under Public Law 83-280, which had given those states the authority to enforce criminal laws on the reservations.
The Supreme Court, however, ruled in favor of the tribes in California v. Cabazon Band of Mission Indians (1987), holding that the states had been given no authority to regulate tribal activities that were not proscribed by their criminal codes. The tribes enthusiastically welcomed the ruling, whereas many state officials called on Congress to expand the states' powers to regulate gambling. Attempting to arrive at a compromise, Congress enacted the Indian Gaming Regulatory Act (IGRA) of 1988, which pleased neither the tribes nor the states.

Among its provisions, the statute separated gaming into three categories. For Class 1 games, consisting of traditional and social games, the tribes were given exclusive control. Class 2 encompassed games like bingo and lotto, which were to be regulated by the tribes with the oversight of the National Indian Gaming Commission. Class 3 comprised primarily casino-style games of chance. In order to operate a Class 3 facility, a tribe was required to negotiate a contract with the relevant state, clarifying such matters as permissible locations of casinos, the kinds of gaming permitted, and provisions for tribes to make donations in lieu of state and local taxes.

The IGRA was quite comprehensive in scope. It required tribal ownership of all casinos and bingo parlors operating under the act, and it further required all resulting revenues to be used for specific tribal activities. It provided for the three-member National Indian Gaming Commission to be established to make rules and supervise Class 2 and Class 3 gaming. The statute referred to all Native American lands, with Section 2719 specifying that gaming would usually not be allowed on land acquired by a tribe after 1988. There were two major exceptions to this rule: land that was acquired within or contiguous to existing Indian lands, and situations in which the secretary of the interior determined that a gaming facility would not be detrimental to the surrounding community or to the state.

Impact
The 1980's saw the beginnings of American Indian gaming as an important stream of income for the Native American tribes. The IGRA, combined with the Supreme Court's 1987 decision, appeared to ensure that the enterprise would continue into the future. The rapid development of Indian gaming sparked great controversy. In less than a decade, some tribes with large casinos were moving toward self-sufficiency, and a few of the smaller tribes even became fabulously wealthy.
The most profitable facilities were those located near large urban areas, such as the Mashantucket Pequot's Foxwoods Casino in Mashantucket, Connecticut. Tribes in more isolated regions, on the other hand, received few benefits from gaming, and their members continued to experience poverty and unemployment rates that were among the highest in the United States.

Further Reading
Clinton, Robert, Nell Newton, and Monroe Price. American Indian Law: Cases and Materials. Charlottesville, Va.: Bobbs-Merrill, 1991.
Kallen, Stuart, ed. Indian Gaming. New York: Thomson Gale, 2005.
Mason, W. Dale. Indian Gaming, Tribal Sovereignty, and American Politics. Norman: University of Oklahoma Press, 2000.
Thomas Tandy Lewis
See also Native Americans; Racial discrimination; Supreme Court decisions.
■ Inflation in Canada
Definition Rate of increase in prices in Canada

Price levels in Canada continued to track those of the country's much larger neighbor to the south, and the continued inflation of the 1980's posed a significant problem for the Canadian economy.

Canada's consumer price index, like that of the United States, rose substantially during the 1980's. The sharpest rise occurred between 1980 and 1982, when the rate of inflation was in the double digits for three consecutive years. Prices rose 10.2 percent over the preceding year in 1980, 12.5 percent in 1981, and 10.5 percent in 1982. In 1983, the annual rate of increase fell to 5.8 percent, but it remained at least 4 percent for most of the rest of the decade. In 1989, it rose to 5 percent.

Inflation in Canada reflected, in part, shifts in the value of the Canadian dollar versus that of the U.S. dollar. Although the Canadian dollar depreciated against the U.S. dollar, it appreciated significantly against several other currencies. Canadian governments, both the federal government and the provincial governments, attempted to shield their citizens from the effects of inflation, leading to a rise in the deficits of the various governments. Many of the governments attempted to hold down public employees' salaries for the benefit of those in the private sector. The disparity between labor productivity and labor costs largely explains the rise in inflation during the 1980's.

Consumer Price Index Figures for Canada, 1980-1989
The Consumer Price Index (CPI) measures the cost of living in Canada and is an important indicator of inflation. Statistics Canada tracks the retail price of about six hundred goods and services; this "market basket" of items includes an average household's expenditures for food, housing, transportation, furniture, clothing, and recreation. Statistics Canada measures the prices against the base year of 1992; the average index level for that year is equal to 100. In 1980, the CPI was 52.4, which means that what consumers could buy for $100 in 1992 would have cost only $52.40 in 1980. The rate of increase of the CPI is typically reported as the percentage increase in the index over the past twelve months. The following CPI figures measure the costs of all items in the index during the 1980's and the percentage of change from the previous year:
Year | Consumer Price Index for All Items | Percentage Change from Previous Year
1980 | 52.4 | 10.1
1981 | 58.9 | 12.4
1982 | 65.3 | 10.9
1983 | 69.1 | 5.8
1984 | 72.1 | 4.3
1985 | 75.0 | 4.0
1986 | 78.1 | 4.1
1987 | 81.5 | 4.4
1988 | 84.8 | 4.0
1989 | 89.0 | 5.0

Sources: Statistics Canada; Bank of Canada.
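As an illustrative check (the calculation below, shown in LaTeX notation, is a worked example rather than part of the original table), the percentage-change column follows directly from the index values; for 1989:

\[
\frac{\mathrm{CPI}_{1989} - \mathrm{CPI}_{1988}}{\mathrm{CPI}_{1988}} \times 100 = \frac{89.0 - 84.8}{84.8} \times 100 \approx 5.0\%
\]

matching the 5.0 percent reported for 1989.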
Impact
The rapid rise of inflation all over the developed world posed a major problem for the government. The pressure from voters to stem the deterioration in their living standard made it exceptionally difficult for governments at all levels to protect their own budgets from deficits. Since a high proportion of employees in Canada were government employees, they and their compensation tended to have a significant effect on prices in the country. Further, the deterioration in the prices of commodities, a major component of Canadian exports, combined with rising prices of imports, made it difficult to preserve a balance in the Canadian economy.

Further Reading
Bothwell, Robert, Ian Drummond, and John English. Canada Since 1945. Rev. ed. Toronto: University of Toronto Press, 1989.
Crane, David. The Next Canadian Century: Building a Competitive Economy. Toronto: Stoddart, 1992.
Organization for Economic Cooperation and Development. Economic Surveys: Canada. Paris: OECD, 1980ff.
Nancy M. Gordon
See also Agriculture in Canada; Business and the economy in Canada; Canada and the United States; Canada Health Act of 1984; Health care in Canada; Income and wages in Canada; Mulroney, Brian; Unemployment in Canada.
■ Inflation in the United States
Definition Rate of increase in prices in the United States
The 1980’s began with consumer prices rising by more than 10 percent per year, causing serious public discontent that contributed to Ronald Reagan’s election to the presidency. Slowing the rate of growth of the money supply and a decline in world petroleum prices brought the inflation rate to lower levels beginning in 1982. Inflation had been a major problem in the United States in the 1970’s, reflecting the great increase in world petroleum prices generated by the Organization of Petroleum-Exporting Countries (OPEC) cartel and the misguided efforts of the U.S. Federal Reserve to reduce unemployment by rapidly ex-
The Eighties in America
panding the nation’s money supply. In 1979 and 1980, consumer prices rose about 13 percent per year. Causes of Inflation Higher oil prices made the American public worse off, transferring real income to the oil-exporting countries, chiefly those in the Middle East, and lowering output and employment in the years of major oil-price increases. Most of the discomfort caused by this inflation, however, resulted from unanticipated and poorly understood shifts of income and wealth within the United States. After all, a higher price paid by some meant a higher price received by others, but inflation anxiety afflicted even people who benefited from the process. Few people understood the underlying monetary causes, and paying higher prices made people feel victimized. Since monetary inflation occurs in response to rising demand for goods and services, the process tends to raise the money value of wages, as well as of products. In the 1980’s, however, wages did not keep pace with prices. Average weekly earnings, when valued in 1982 dollars, were more than 10 percent lower during the 1980’s than they had been in the previous decade. These earnings data do not include fringe benefits, however, which were also rising. Some economists suggested that price indexes overstated inflation, because they did not allow sufficiently for improvements in the quality and variety of products. Rising prices meant that the value of each dollar of actual money holdings declined. However, persistent high interest rates compensated wealth holders for the declining value of the dollar. Because Social Security benefits had been indexed, moreover, beneficiaries received higher payments to offset higher living costs. The program’s benefit payments doubled over the course of the decade, partly because the number of beneficiaries rose from 22 million to 28 million. Indexation was extended to the personal income tax in the early 1980’s, reducing the tendency for inflation to push people into a higher tax bracket. Encouraged by the 1980 election results, Federal Reserve chief Paul Volcker took steps to reduce the growth rate of the money supply. In the short run, this put the economy through a painful economic recession. The inflation rate quickly receded, falling below 4 percent in each year from 1982 to 1986. This
Consumer Price Index Figures for the United States, 1980-1989
The Consumer Price Index (CPI) is an important indicator of the rate of inflation. The index measures the average change over time in the prices that consumers in urban areas of the United States pay for various goods and services. The Bureau of Labor Statistics (BLS), which calculates the index, has set the average index level over the thirty-six-month period covering the years 1982, 1983, and 1984 equal to 100. BLS then measures changes in prices in relation to that figure. An index of 110, for example, means there has been a 10-percent increase in prices since the reference period; similarly, an index of 90 means there has been a 10-percent decrease. (A worked illustration of this arithmetic follows the table.) During the 1980’s, the CPI rose every year, as the costs of consumer goods and services climbed. Increases in some items were relatively small; the costs of apparel and upkeep, for example, rose by 24.6 percent during the decade, and transportation costs increased by 27.5 percent. However, medical care costs skyrocketed, increasing by more than 66 percent from 1980 through 1989. The following CPI figures measure the costs of all items in the index, as well as some selected items, for consumers residing in all urban areas of the United States during the 1980’s:
Year                All Items   Food and Beverages   Apparel and Upkeep   Transportation   Medical Care
1980                82.4        86.7                 90.9                 83.1             74.9
1981                90.9        93.5                 95.3                 93.2             82.9
1982                96.5        97.3                 97.8                 97.0             92.5
1983                99.6        99.5                 100.2                99.3             100.6
1984                103.9       103.2                102.1                103.7            106.8
1985                107.6       105.6                105.0                106.4            113.5
1986                109.6       109.1                105.9                102.3            122.0
1987                113.6       113.5                110.6                105.4            130.1
1988                118.3       118.2                115.4                108.7            138.6
1989                124.0       124.9                118.6                114.1            149.3
% change 1980-1989  +36.9       +33.9                +24.6                +27.5            +66.1
Source: U.S. Department of Labor, Bureau of Labor Statistics.
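To make the arithmetic behind these figures explicit, the following worked illustration applies the standard index-number formulas to values from the table above; it is an editorial gloss, not part of the BLS table.

\[ \text{percent change since the 1982-1984 base} = \frac{I - 100}{100} \times 100 \]

Thus the 1980 All Items figure of 82.4 implies prices about 17.6 percent below the base-period average, while the 1989 figure of 124.0 implies prices about 24 percent above it. The same index deflates nominal amounts into base-period dollars, as in the article’s comparison of weekly earnings “valued in 1982 dollars”:

\[ \text{real value} = \text{nominal value} \times \frac{100}{I}, \qquad \$100 \times \frac{100}{124.0} \approx \$80.65 \]

so $100 of 1989 income corresponds to roughly $80.65 in base-period dollars.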
This slowing of inflation was aided by declining world oil prices, which helped household energy prices (which shot up by 30 percent in 1980-1981) to stabilize and ultimately to decline by 20 percent in 1986. Energy prices were lower in 1986-1989 than they had been in 1981.
Interest Rates
An important lesson of the 1970’s was that interest rates tended to rise in proportion to people’s expectations of inflation. A rise in the expected inflation rate made borrowers more eager to borrow, since they could repay their loans with cheaper dollars. Higher expected inflation made lenders less willing to lend, as they would be repaid in those same cheaper dollars. As actual inflation declined in the early 1980’s, the waning of inflation expectations moved interest rates gradually downward. High-grade corporate bonds, which yielded about 14 percent in 1981-1982, yielded about 9 percent in 1986-1989. The fall in yields meant marketable bonds fetched higher prices, helping compensate bondholders for their loss of purchasing power through inflation.
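The relationship sketched in this paragraph is conventionally formalized as the Fisher equation; the lines below are an editorial illustration using the yields quoted above (the equation’s name and the sample numbers are not from the original article):

\[ i \approx r + \pi^{e} \]

where \(i\) is the nominal interest rate, \(r\) the real interest rate, and \(\pi^{e}\) the expected inflation rate. If the real rate stays near 4 percent while expected inflation falls from about 10 percent to about 5 percent, the nominal yield falls from roughly 14 percent to roughly 9 percent, in line with the corporate bond yields cited above. The accompanying rise in bond prices follows from the inverse price-yield relation; for a perpetual bond paying a fixed annual coupon \(C\),

\[ P = \frac{C}{i}, \qquad \frac{14}{0.14} = 100 \quad\text{versus}\quad \frac{14}{0.09} \approx 156. \]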
Even in 1989, however, interest rates were still high by historical standards. In combination with the annual increases in stock prices (in every year of the decade except 1987), the high interest rates made American assets attractive to international investors and helped stimulate an inflow of international capital.
Inflation in the United States lowered not only the internal value of the dollar but also its external value—its foreign-exchange value. The inflation of the 1970’s, for instance, caused the international value of the dollar to fall by one-eighth between 1973 and 1980. The slowing of inflation and the surging inflow of international capital helped raise the dollar’s value internationally by 40 percent by 1985, but it then receded to more normal levels. The higher value of the dollar made imported goods cheaper, helping bring down the inflation rate but contributing to a large deficit in the U.S. balance of international payments.
Impact Anti-inflation sentiment helped strengthen support for deregulation of significant industries, begun by President Jimmy Carter and continued by President Reagan. The successful reduction of inflation in the 1980’s helped create goodwill for the Republican Party, assisting its victories in the presidential elections of 1984 and 1988. The inflation-reduction experience validated the “monetarist” perspective, according to which inflation was viewed as a response to excessive money growth and high interest rates were perceived as a response to expectations of high inflation. In this view, monetary policy exercised by the Federal Reserve was seen as the principal instrument of government management of aggregate demand. These lessons became part of the intellectual capital of Alan Greenspan, who took over as head of the Federal Reserve in August, 1987. Under his leadership, the U.S. macroeconomy enjoyed still lower inflation and interest rates, combined with high employment, during the 1990’s.
Further Reading
Mishkin, Frederic S. The Economics of Money, Banking, and Financial Markets. 7th ed. New York: Pearson/Addison Wesley, 2004. This widely used college text analyzes the relative roles of OPEC and the Fed in the inflation of 1973 to 1983; also provides analysis of interest rates and foreign-exchange rates.
Paul B. Trescott
See also Business and the economy in the United States; Income and wages in the United States; Reaganomics; Recessions.
■ Infomercials Definition
Television commercials that mimic the length and format of conventional programming
The emergence of infomercials in the 1980’s changed the face of television advertising and muddled distinctions between commercial and noncommercial programming.
The origins of the infomercial can be traced to the two-to-four-minute commercials of entrepreneur Ron Popeil that first aired in the 1950’s and similar “long-form” advertisements for the Ginsu Knife in the 1970’s. Federal restrictions upon the time that television stations could devote to advertising prohibited more extensive, program-length advertisements until 1984, when the administration of U.S. president Ronald Reagan relaxed these restrictions as part of a sweeping campaign to deregulate the broadcast industry. Later that year, the first full-length infomercial aired in the form of a sixty-minute advertisement for dietary supplements produced by the Herbalife company. Infomercials rapidly became popular as an alternative to conventional programming for broadcast and cable television stations seeking to boost advertising revenue in response to budget constraints, increased competition, and a growth in demand for twenty-four-hour programming.
Infomercials came over time to follow a standard format. They were typically thirty-minute broadcasts designed to mimic news programs, talk shows, or public service announcements. These programs, which usually aired late at night and at other times when airtime was inexpensive, showcased a specific product or group of products, often demonstrated in a dramatic manner during the broadcast. Members of a paid “audience” frequently participated in these demonstrations or in staged question-and-answer sessions about the product. Products featured on infomercials included tools or appliances, cleaning supplies, and sources of information about personal fitness, wealth creation, or dieting. These items were often touted as “miracle” products, capable of dramatically improving the lives of consumers. Some products advertised on infomercials were available for purchase at conventional retail stores, but many could be purchased only through the contact information (usually a toll-free telephone number) provided during the broadcast. Because the formats of infomercials often mimicked those of news and public affairs programming, and because the shows often aired at times previously
reserved for such programming, many viewers mistook the claims of the advertisers for objective, factual information from authoritative sources. By the end of the 1980’s, however, infomercials had become a fixture of late-night television, as their profitability to both broadcasters and marketers increased.
Impact The proliferation of infomercials in the 1980’s altered both the manner in which products were advertised and public perceptions of the relationships between advertising and other sources of information. The emergence of advertising presented as news coincided with an increased emphasis upon entertainment and sensationalism in television news broadcasts, rendering distinctions between fact and fiction difficult for many viewers to understand or maintain. Infomercials both influenced and were influenced by the national obsession with materialism and consumerism that characterized the 1980’s.
Further Reading
Harry, Lou, and Sam Stall. As Seen on TV: Fifty Amazing Products and the Commercials That Made Them Famous. Philadelphia: Quirk Books, 2002.
Head, Sidney W., et al. Broadcasting in America: A Survey of Electronic Media. 9th ed. Boston: Houghton Mifflin, 2001.
Michael H. Burchett
See also Advertising; Business and the economy in the United States; Cable television; Television.
■ Information age Definition
Loosely delineated historical period defined by an exponential growth in information storage, duplication, and transmission
During the 1980’s, there was an explosion in the use and number of computers and other electronic information storage and retrieval devices. In addition, the number of options for consumers of electronic media vastly increased. As a result, industrial and postindustrial societies began to be defined by the rapid proliferation of information throughout every facet of contemporary culture.
Ideas, words, and other data make up “information.” Information, the stuff of thought and creativity, becomes tangible when collected, analyzed, shaped, stored, duplicated, and transmitted. It becomes “physical” when communicated through words and numbers on paper (a book or street sign), through sounds (a musical composition such as a pop song), and through common visuals (a photograph, television commercial, billboard, Web page). Information, then, is human creativity that is organized, made tangible, and shared. Information is made into a “thing” through duplication, and these “things”—databases, e-mail, books, videos, pictures, newspapers, songs, and such—can be transmitted and even sold, thus marking an economy of information and an “information age” or “information society.”
Influential Technologies
A 1982 report by the U.S. Office of Technology Assessment noted the innovative technologies from the mid- to late-twentieth century that were instrumental in shaping the information age, which began to flourish in the early 1980’s. These innovations include cable, satellite, digital television transmission, broadcast technology, computers, electronic storage, video technology, information sciences, and telecommunications. Of these nine, the widespread advancement and use of computers coupled with telecommunications is often considered the major stage in the move to a fully formed information age.
Early Roots
Some historians argue that the modern roots of an information age began with engineer Vannevar Bush and his 1945 idea for a “memex” machine, an early information-linking computer system. Others mark the work of mathematician Norbert Wiener on cybernetics (1948) as the point of origin. Still others have pinpointed the awareness of an emerging “information economy” or “knowledge society” to a 1962 book by economist Fritz Machlup, who documented increasing numbers of “knowledge workers” (workers in education, health care, government, legal services, banking, entertainment, tourism, repair services, and sales) since World War II. Media theorist Marshall McLuhan detailed the cultural and social effects of a growing media and electronic communications industry, coining the phrase “age of information” in 1964. Sociologist Daniel Bell, in 1964, argued that society was becoming “postindustrial,” whereby the collection, management, manipulation, and distribution of information—and not the production of material goods— would be most valued in the U.S. economy. In 1969, management theorist Peter Drucker wrote of a “knowledge economy” and “knowledge industries.”
The Information Age Goes Public At the beginning of the 1980’s, the public became increasingly aware of how computers were to be a growing part of daily life. Microprocessors, which made computers smaller, faster, and less expensive, had been installed in computers in the mid-1970’s, allowing nonprofessionals to use equipment once reserved for a computer elite. Microcomputers were first introduced to mainstream consumers in 1976 with the Apple I. In 1981, International Business Machines (IBM) came out with the IBM PC, which it sold through Sears stores. In 1983 and 1984, Apple made computing even more user-friendly by introducing the handheld “mouse.” Information could now be more easily gathered, stored, duplicated, and shared, and once the Internet was made available to academics and other researchers in the mid-1980’s, that information—in the form of words, numbers, graphics, and such—proliferated. In the 1980’s, computers became more commonplace not only in banks, libraries, offices, and retail establishments but also in factories and auto repair shops and at gas stations and construction sites. “Home information systems” became popular, as American Telephone and Telegraph (AT&T) and Bell Telephone began offering consumers an early form of e-mail in 1980. Libraries offered coin-operated microcomputers to their patrons, and automated teller machines (ATMs) were installed in banks. U.S. government information gathering, as well as privacy rights, became major concerns in 1982, when about 500,000 men who did not register for the draft were traced through Social Security files scoured by government computers. Best-selling authors and futurists Alvin Toffler (The Third Wave, 1980) and John Naisbitt (Megatrends, 1982) brought to a general reading audience a discussion of the emerging information age, arguing that the 1980’s would usher in a new era defined by an increased flow of information and knowledge. In 1986, AT&T announced, “Like it or not, information has finally surpassed material goods as our basic resource.” The information age, cultural critic Theodore Roszak wrote in 1986, was marked by an economy that was “mass manufacturing information.” Critics who did not believe the Western world was in the throes of an information age countered that industry and manufacturing were not diminishing in light of an information revolution. Instead, they argued, industrial capitalism was simply changing form,
and that most knowledge and information workers comprised a routine, and hardly new, mass-producing workforce of clerks, typists, salespersons, and technical support staff. This so-called knowledge workforce, critics maintained, was a newly labeled category of workers rather than a new kind of labor.
Impact The 1980’s was a significant decade for the information age. By this time, the information age was considered unstoppable and pervasive. It would not be long before information technology—and the glut of information itself—became an accepted part of everyday life. Also, pundits argued that information was the key to global economic survival for the United States, with some naming the age an “information revolution.”
Subsequent Events
In 1989, the information age began to evolve radically, as computer scientist Tim Berners-Lee proposed, and soon developed, a “browser” to retrieve information easily from computers. A browser contained “links” to information behind text and ushered in the earliest stage of the World Wide Web, which introduced the public to information on an even grander scale. Within a few years, the Internet—and the Web—were accessed by a public even hungrier for information, and the information age was in full swing. In the 1990’s and the first years of the twenty-first century, information was everywhere, and everyone had to have it.
Further Reading
Capurro, Rafael, and Birger Hjorland. “The Concept of Information.” Annual Review of Information Science and Technology 37 (2003): 343-411. Details the histories of the terms “information” and “technology.”
Castells, Manuel. The Rise of the Network Society. 2d ed. Malden, Mass.: Blackwell, 2000. Considered a classic in its discussion of the information age. Highly recommended, especially for advanced readers.
Mattelart, Armand. The Information Society: An Introduction. Thousand Oaks, Calif.: Sage, 2003. A brief but recommended introduction to the information age.
Webster, Frank. Theories of the Information Society. 3d ed. New York: Routledge, 2006. An excellent examination of ideas about the information society.
Desiree Dreeuws
See also Apple Computer; AT&T breakup; Book publishing; Business and the economy in Canada; Business and the economy in the United States; Cable television; CAD/CAM technology; Cell phones; Compact discs (CDs); Computers; Fax machines; Home shopping channels; Home video rentals; Infomercials; Microsoft; Music videos; Science and technology; Superconductors; Third Wave, The; Virtual reality; Voicemail.
■ Intermediate-Range Nuclear Forces (INF) Treaty
Identification Arms control treaty between the United States and the Soviet Union
Date Signed on December 8, 1987; entered into force on June 1, 1988
The deployment of a number of intermediate-range missiles during the 1970’s had caused increasing anxiety for people on both sides of the Cold War. The Intermediate-Range Nuclear Forces (INF) Treaty led to the elimination of all missiles in this classification. The agreement represented a level of trust between the two nations’ governments that had seemed unthinkable at the beginning of the decade.
Throughout the Cold War, both the United States and the Soviet Union sought to gain an advantage through a wide range of foreign policy initiatives and military operations. While this jockeying occurred in many locations around the globe, Europe was the focus of the superpowers’ rivalry. The tension between them and its European focus resulted from the post-World War II division of the continent into the communist-controlled East and the democratic West, which was allied with the United States. The Eastern European nations were allied through a treaty known as the Warsaw Pact, while the Western allies were members of the North Atlantic Treaty Organization (NATO). While the number of deployed troops had been scaled back since the immediate postwar period, American and Soviet forces still faced each other across the lines drawn when the triumphant Allies partitioned Germany. For most of this period, both sides were content with conventional weapons. However, in 1975, the Soviet Union deployed its portable SS-20 intermediate-range missile, equipped with nuclear warheads. An intermediate-range missile is one with a shorter range than that of long-range intercontinental ballistic missiles (ICBMs). As a result, they are positioned closer to their targets than are ICBMs, and they can reach those targets in a shorter period of time, giving an opponent less warning and less
Preamble and Article I of the Intermediate-Range Nuclear Forces Treaty
Treaty Between the United States of America and the Union of Soviet Socialist Republics on the Elimination of Their Intermediate-Range and Shorter-Range Missiles
Signed at Washington December 8, 1987
Ratification advised by U.S. Senate May 27, 1988
Instruments of ratification exchanged June 1, 1988
Entered into force June 1, 1988
Proclaimed by U.S. President December 27, 1988
The United States of America and the Union of Soviet Socialist Republics, hereinafter referred to as the Parties,
Conscious that nuclear war would have devastating consequences for all mankind,
Guided by the objective of strengthening strategic stability,
Convinced that the measures set forth in this Treaty will help to reduce the risk of outbreak of war and strengthen international peace and security, and
Mindful of their obligations under Article VI of the Treaty on the Non-Proliferation of Nuclear Weapons,
Have agreed as follows:
Article I
In accordance with the provisions of this Treaty which includes the Memorandum of Understanding and Protocols which form an integral part thereof, each Party shall eliminate its intermediate-range and shorter-range missiles, not have such systems thereafter, and carry out the other obligations set forth in this Treaty.
U.S. president Ronald Reagan and Soviet leader Mikhail Gorbachev sign the Intermediate-Range Nuclear Forces Treaty on December 8, 1987. (Courtesy, Ronald Reagan Library/NARA)
opportunity to respond before the missiles reach their targets. The Soviet deployment of such missiles in Europe upset the balance of power in the region.
U.S. Response
The United States and its NATO allies responded to this new threat in two ways: The Americans developed a missile with capabilities similar to those of the SS-20, and Western powers entered into negotiations with the Soviets. Informal negotiations took place in the fall of 1980. Formal negotiations began the following year. The United States sought a bilateral treaty that would result in equal limits being placed upon both U.S. and Soviet nuclear capabilities without affecting their conventional weapons. Meanwhile, the United States hastened to develop a ground-launched cruise missile, as well as the Pershing II missile. The latter was an intermediate-range ballistic missile accurate enough, and with a warhead small enough, to permit relatively precise attacks. When negotiations did not result in a quick agreement, the United States deployed its own intermediate-range missiles. The Pershing II was deployed in West Germany in 1984, and its European deployment was completed by the end of 1985.
Many people saw the existence of intermediate-range nuclear missiles and their deployment within range of strategic targets as the equivalent of a hair trigger. It meant that if either superpower detected an apparent nuclear attack, its government would have no more than a few minutes to decide how to respond, increasing the chances of full-scale nuclear war. Early in the negotiations, President Ronald Reagan proposed what was called the “zero option,” which would entirely eliminate intermediate-range weapons from both arsenals. This proposal was rejected by the Soviets. Negotiations were complicated by the fact that multiple nuclear treaties and issues were under discussion simultaneously. For example, the Reagan administration sought to develop an antiballistic missile (ABM) defense system, to which
the Soviet Union strongly objected, and the superpowers were in the midst of negotiating a separate ballistic missile treaty in addition to the INF Treaty. Sometimes, the INF negotiations were held hostage to perceived problems in one of these other areas. The talks were sporadic until 1985, by which time both sides had intermediate-range missiles in the field, pointed at each other. Also in 1985, Mikhail Gorbachev became the leader of the Soviet Union. During the first year of his leadership, Gorbachev met with President Reagan, and the two leaders issued a joint statement declaring that both would seek to reach an agreement on intermediate-range missiles. Final Agreement
In late 1985, negotiations began in earnest, as the issues to be covered in the INF Treaty were fully separated from the issues being negotiated elsewhere. The United States retreated from the “zero option,” instead proposing that each country agree to maintain equal numbers of weapon systems, putting the brakes on the intermediate-range arms race. In early 1986, the Soviet Union proposed doing away with all U.S. and Soviet intermediate-range nuclear forces. The United States responded by proposing a limited number for each country, not only in Europe but also in Asia. The United States also proposed cutting back the number of short-range forces. At a 1986 summit, the nations’ two leaders agreed to eliminate INF missiles in Europe and to limit the number elsewhere.
In 1987, things moved swiftly on both sides. Based on the talks at the 1986 summit, the United States proposed provisions to allow on-site verification. In July, the Soviet Union accepted these provisions and proposed that the treaty do away with all intermediate- and short-range forces. The Soviets also wanted West German missiles to be included in the treaty, but the United States rejected this proposal, saying it was to be a bilateral treaty between the two superpowers alone. However, in August, West Germany announced that it would dismantle its intermediate-range missiles. In September, a basic agreement was reached. On December 8, 1987, the treaty doing away with short-range and intermediate-range weapons everywhere was signed by the leaders of both countries. During the ratification process by the U.S. Senate, three related agreements were made that clarified various provisions of the treaty. It was ratified in May, 1988, and entered into force on June 1, 1988.
Impact Unlike most arms control treaties during the Cold War, the INF Treaty did not just limit the number of missiles in a category; it eliminated them altogether. It was the first major treaty allowing on-site inspections for compliance, a provision made famous by President Reagan’s slogan, “Trust, but verify.” Most previous treaties had mandated either approximately equal cuts by both sides or more cuts by the United States. In this treaty, the Soviet Union dismantled about twice as many weapon systems as did the United States. By May, 1991, all INF weapons— totaling around twenty-seven hundred—had been dismantled or destroyed, and about 850 on-site inspections had been conducted. The treaty’s provisions were thus enforced in conjunction with the end of the Cold War, signaled by the fall of the Berlin Wall in 1989 and the collapse of the Soviet Union in 1991. The treaty also brought to an end the strong antinuclear movement that had developed in the United States and Western Europe, as nuclear annihilation no longer seemed a realistic threat. Further Reading
Glitman, Maynard W., and William F. Burns. The Last Battle of the Cold War: An Inside Account of Negotiating the Intermediate Range Nuclear Forces Treaty. New York: Palgrave Macmillan, 2006. Co-written by the chief American negotiator of the treaty, this work contains rare insights into the negotiation process.
Reagan, Ronald. Speaking My Mind: Selected Speeches. New York: Simon & Schuster, 2004. Contains the speech Reagan made during the signing ceremony, as well as other speeches indicative of his rhetorical tone with regard to the Cold War.
U.S. Department of State. Treaty Between the United States of America and the Union of Soviet Socialist Republics on the Elimination of Their Intermediate-Range and Shorter-Range Missiles. http://www.state.gov/www/global/arms/treaties/inf1.html. Part of the Department of State’s permanent electronic archive, this introductory page contains links to the text of the treaty and agreements implementing the treaty.
Donald A. Watt
See also Cold War; Day After, The; Foreign policy of the United States; Nuclear winter scenario; Reagan, Ronald; Reagan Doctrine; Reagan’s “Evil Empire” speech; Soviet Union and North America.
■ Inventions Definition
Newly created or improved devices, objects, substances, techniques, or processes
Inventions in the 1980’s established emerging fields in materials and genetic engineering, computing, and other specialties, enabling practical applications, enhancing life quality, and invigorating the economy. Issues regarding the treatment of inventions as intellectual property resulted in new federal legislation and judicial forums that strengthened the U.S. patent system. During the 1980’s, North American inventors sought to create new objects and processes, as well as to improve the design of existing inventions. In addition to entirely new inventions, unique and substantial improvements to existing devices would qualify for patents, enabling inventors to benefit financially from their inventions. Many inventions resulted from people’s endeavors to solve particular problems or to provide utilitarian products to address more general concerns. They included devices or pharmaceuticals that saved lives and combated diseases. Often, inventors devised tools and other items designed to reduce the time it took to perform work, whether industrial, business, or domestic. U.S. Patent and Trademark Office
Starting in 1790, the U.S. Patent and Trademark Office (USPTO) issued patents to inventors who filed applications to patent innovative objects, designs, or processes. A similar Canadian patent office was not founded until the early twentieth century, and Canadian inventors often secured U.S. patents to protect their intellectual property rights. The USPTO commissioners of patents and trademarks, serving in the Department of Commerce, were usually patent attorneys who had engineering or scientific qualifications and passed an examination evaluating their knowledge of patent law. In 1980, President Jimmy Carter appointed Sidney A. Diamond as USPTO commissioner. Although Diamond was not a patent attorney, he was an authority regarding trademark law and sought to improve the patenting process. The USPTO’s primary problems during the 1980’s included the time required to review patent applications, especially those involving biotechnology. Such applications could take two years or more. Moreover, the low fee for a patent application resulted in the office being underfunded. In December, 1980,
President Carter signed the Patent Reform Act, implementing higher patent fees, with the moneys collected designated to computerize the USPTO. Inventors secured 56,650 patents during 1980. Women inventors filed around 2 percent of U.S. patent applications at the decade’s beginning and, by 1988, received 5.6 percent of patents.
Patent law in the 1980’s recognized the intellectual property rights of the first person who had an idea for a given invention, rather than the first person actually to create the invention. Thus, someone describing a concept in a patent application but going no further retained control of the invention, even if someone else independently developed a working prototype. The USPTO recognized that inventors often required significant time between developing a concept and successfully inventing something tangible based on that concept. The office therefore ensured that inventors would retain exclusive rights to their inventions for seventeen years after a patent was issued.
Leadership After Diamond resigned, President Ronald Reagan nominated as his replacement Gerald J. Mossinghoff, an electrical engineer and attorney who had worked as a USPTO examiner and National Aeronautics and Space Administration (NASA) lawyer. Beginning his duties in July, 1981, Mossinghoff stated that the USPTO needed modern resources to function effectively and noted that the Patent Reform Act did not provide enough funds for the office’s computer needs. In 1981, approximately twenty-eight hundred people worked at the USPTO, which had an average yearly budget of $116 million. By August of that year, inventors had filed more than 100,000 applications with the USPTO, of which 65 percent were approved. Mossinghoff hired more examiners. The number of USPTO examiners trained to evaluate applications rose from 780 in 1981 to 1,600 in 1989. Mossinghoff supported passage of legislation that would extend patents for drugs that the federal government had not yet approved for sale. Speaking in August, 1982, he emphasized the usefulness of computers for USPTO examiners searching patent databases, stressing that the office needed to begin automation.
When Mossinghoff resigned in December, 1984, Donald J. Quigg, who had been USPTO deputy commissioner, became President Reagan’s appointee
as commissioner. Quigg’s invention experience included overseeing patents at Phillips Petroleum beginning in 1945. Starting as commissioner in January, 1985, Quigg continued Mossinghoff’s efforts to expedite the application review process. In the mid-1980’s, the USPTO typically processed applications in eighteen months. After Quigg resigned in October, 1989, President George H. W. Bush nominated Harry F. Manbeck, Jr., the General Electric company’s chief patent counsel, as USPTO commissioner. Innovations The numerous inventions produced in the 1980’s ranged from simple Post-it™ notes to complex bioengineered pharmaceuticals. Medical
inventions during the 1980’s included such drugs as Prozac, genetically engineered insulin, and alpha interferon; devices including the Jarvik-7 artificial heart, improved pacemakers, and polymer wafers that gradually released drugs into the recipient’s system; processes such as the use of lasers to clear arteries; and heavy equipment such as magnetic resonance imaging (MRI) machines.
Fiber optics inventions increased telecommunication capabilities and enhanced long-distance communications when a glass fiber transoceanic cable was deployed. Inventors introduced machines to transmit facsimiles of digitized documents and images domestically and overseas within seconds. Others patented ceramic and metal materials that had
Time Line of Select 1980’s Inventions
1980: Genetically engineered alpha interferon; Hepatitis B vaccine; Sony Walkman
1981: IBM personal computer; MS-DOS; Scanning tunneling microscope; Synthetic human insulin
1982: ACE inhibitor; Heathkit Educational Robot (HERO); Human growth hormone; Insulated gate bipolar transistor; Jarvik-7 artificial heart; Video keyhole surgery
1983: Apple’s Lisa computer; Cabbage Patch Kids dolls; TCP/IP computer networking protocol
1984: Apple’s Macintosh computer and operating system; CD-ROM; Clumping cat litter
1985: DNA fingerprinting; Gemini robot; Microsoft Windows operating system
1986: Breadmaker; Disposable camera; High-temperature superconductor; IBM PC Convertible (first laptop computer); Intel 386 microprocessor; Nicotine patch
1987: Digital light processing; Disposable contact lenses; Statin drugs; 3-D video game
1988: Digital cellular phones; Indiglo nightlight; Laser eye surgery; Prozac; WSR-88D (improved version of Doppler radar)
1989: High-definition television
been subjected to chemical processes to reinforce their structures or increase their ductile characteristics. Such materials had a diverse range of applications, including strengthening engines used for transportation and facilitating the creation of faster, more efficient, more resilient computer processors and microchips. The creation of thin films, especially diamond, also advanced microelectronics.
In the early 1980’s, inventors created personal computers sold by International Business Machines (IBM) and Apple Computer, as well as the operating systems, software, and peripherals required to operate them. These computers became popular for home and business applications, including desktop publishing. Computerized security systems were used to guard buildings and vehicles.
Aerospace inventors improved airplanes and spacecraft in a host of ways. They created fan designs to decrease engine fuel consumption, space bumpers to deflect orbiting debris, and satellites for commercial and scientific purposes. The space shuttle program produced many inventions necessary to maintain and exploit the capabilities of NASA’s shuttles. Astronauts aboard the shuttles, moreover, tested earthbound scientists’ zero-gravity inventions, particularly pharmaceuticals. Robotics innovations resulted in five thousand robots being used in U.S. industries in 1980, a number that increased throughout the decade. Inventors explored solar applications to provide energy for transportation and buildings. Agricultural inventors developed polymers to improve soil quality and designed computer- and satellite-guided machinery.
Consumers enjoyed many 1980’s inventions, including camcorders, videocassette recorders, compact discs, and Rubik’s Cube. In the early 1980’s, satellite television became available in some areas of the United States. Consumers also bought a variety of exercise and fitness inventions, such as the Stairmaster.
Patent Law The U.S. patent system improved legally in the 1980’s, as legislative and judicial changes strengthened patent law. In previous years, judges concerned about preventing monopolies had applied an antitrust mindset to decisions regarding intellectual property. Lacking experience with invention, many judges viewed patents as anticompetitive, and they had tended to deny inventors’ efforts to protect their patents for that reason. Over
the course of the decade, judges came more often to think of patents as promoting competition, so they were less likely to apply antitrust principles in their decisions regarding intellectual property.
Journalists reported on the effects of legislation and court rulings, portraying them as detrimental to inventiveness. On television, an episode of NBC Magazine that aired on December 5, 1980, profiled three inventors’ frustrations filing lawsuits for patent infringement. Weed Eater inventor George Ballas explained that when forty competitors sold similar weed-trimming products, a lawyer said Ballas’s patent had to be judged valid in court before he could seek infringement proceedings. A judge had then invalidated Ballas’s patent: He had ruled that Ballas’s invention was obvious enough that anyone could have had the same idea, and it was therefore not patentable.
In June, 1980, the U.S. Supreme Court handed down a 5-4 decision in Diamond v. Chakrabarty, a case involving the question of whether it was possible to patent manufactured life forms. General Electric microbiologist Ananda Mohan Chakrabarty had filed in 1972 for a patent for a bacterium that could consume octane and camphor in crude oil. The USPTO denied his application, asserting that Section 101 of the Patent Act did not cover living organisms. The U.S. Court of Customs and Patent Appeals decided for Chakrabarty, resulting in the case going to the Supreme Court. On June 16, 1980, Chief Justice Warren Burger delivered the majority opinion, which favored Chakrabarty’s patent and established that it was possible to patent living things. Burger emphasized that the decision was not a judgment as to the desirability of genetic engineering; it merely indicated that the products of such engineering could qualify for a patent. The ruling resulted in the approval of approximately one hundred previously filed applications to patent organisms.
In the early 1980’s, federal legislators and judges considered taking legal measures to protect computer programs. USPTO officials again cited Section 101 of the Patent Act of 1952 when they denied inventors James Diehr and Theodore Lutton a patent for a curing process that used computers to monitor temperatures while molding rubber. The initial ruling of Diamond, Commissioner of Patents and Trademarks v. Diehr and Lutton stated that computer programs could not qualify for patents. The U.S. Court of Customs and Patent Appeals again sided with the
inventors. On March 3, 1981, the Supreme Court voted five to four that computer programs qualified as a type of process that could be patented. This ruling set a precedent for software protection that bolstered the emerging computing field. As the number of computer-related patents increased, attorneys began to specialize in legal issues concerning software copyright and other means of protecting software and hardware inventions. On April 2, 1982, President Reagan approved a law assigning a single federal appeals court to review all patent-related appeals. Starting in October, 1982, the U.S. Court of Appeals for the Federal Circuit, located in Washington, D.C., had nationwide appellate jurisdiction over U.S. patent cases. Patent attorneys experienced with invention legalities presided as judges on the court.
International Competition
During the 1980’s, North American inventors faced increased competition from foreign inventors who were interested in many of the same rapidly developing and lucrative fields of technology, particularly microelectronics, materials engineering, biotechnology, and telecommunications. In 1985, U.S. and Japanese inventors received the most patents worldwide, with a global application rate of fifteen hundred per day. President Reagan stressed in his January, 1987, State of the Union speech that the United States had to compete more effectively to retain its global position in technology and science. Both citizens and noncitizens applied for and received U.S. patents. In 1987, the USPTO issued 17,288 U.S. patents to Japanese inventors, an increase of 25 percent from the previous year. That same year, West German inventors acquired 8,030 U.S. patents, an increase of 15 percent from 1986, and French inventors received 2,990 U.S. patents, a 19 percent increase. In 1988, foreign inventors received 48 percent of U.S. patents. The USPTO issued the majority of its patents to corporations rather than individuals. Only two U.S. companies, IBM and General Electric, were included among the top ten U.S. patent recipients. In 1975, seven of the top ten recipients had been American. General Electric had topped the list for a quarter century through 1986, but in 1987 it slipped to fourth place, behind three Japanese leaders: Canon, Hitachi, and Toshiba. Inventors worldwide employed contrasting patenting strategies. Inventors in Japan
often secured patents for every individual improvement and modification, boosting the patent statistics for that country, while U.S. inventors waited until all components of their inventions were ready and filed for an inclusive patent, resulting in fewer patents being filed by Americans. As technological competition increased internationally in the 1980’s, U.S. industrial research shifted to the improvement of existing inventions and the production of salable commodities. This shift was designed to emphasize profits and appease stockholders who were unwilling to wait for long-term research to pay off. U.S. management sometimes impeded inventiveness, while foreign executives tended to encourage researchers to pursue innovation. Secrets and Rivals
Many American inventors were secretive regarding their intellectual property, and their secrecy was aided by the fact that the USPTO did not divulge specific information contained in U.S. patent applications. In contrast, patent offices in Japan and Europe published complete applications eighteen months after they were filed, enabling competitors to gain access to them. The USPTO classified invention information deemed crucial to national security, and such information was regulated by the Department of Defense until it was declassified. For example, the Sidewinder missile was invented in 1947 and used by the armed forces, but it was not publicly patented until 1980.
In the late 1980’s, many inventors focused on creating and patenting high-temperature superconductors, resulting in numerous conflicts over the rights and profits to those inventions. Countries competed to secure power by dominating the superconductor industry. The Cambridge Report on Superconductivity noted that Americans received ten out of seventy-six early superconductor patents granted in Europe, while Japanese inventors received fifty-nine. During the 1980’s, Sumitomo Electric Industries filed seven hundred superconductor patent applications worldwide. President Reagan instructed the USPTO to expedite patents for U.S. inventors’ superconductors as of 1988, so disputes regarding patent rights would not impede commercial benefits. He stressed that patents blocked rivals or required them to pay license costs to use specific patented items and processes. Patents also lured investors to support new companies in the hope of generating wealth. In November, 1988, a USPTO spokesperson revealed that inventors had filed for 650 patents for high-temperature superconductivity inventions. Foreign inventors had applied for 225 of those 650 patents, including 150 Japanese applications.
Institutional Inventions
Profits stimulated new invention strategies in the 1980’s, as entrepreneurs recognized the commercial value of patenting and licensing inventions. University, industrial, and government employers urged their researchers to pursue invention, so both researchers and their institutions could earn royalty income by licensing products to companies. Although researchers usually received credit for their inventions, their institutions often retained ownership of the resulting patents. Those institutions hired legal counsel to protect intellectual property rights and investment bankers to accrue more profits.
In the 1980’s, the Massachusetts Institute of Technology (MIT) opened a technology licensing office to commercialize patents for inventions developed by university researchers. By 1986, MIT patent royalties averaged $2 million annually, primarily resulting from computer magnetic-core memory and synthetic penicillin. Because the university had rarely licensed inventions before, John Preston, the licensing office director, focused on licensing and marketing inventions in addition to filing patents. In 1987, MIT earned $3.1 million from over one hundred licensed inventions, including superconductors and software designed for artificial intelligence. MIT’s annual licensing revenues were fourth in the nation behind Stanford University’s $6.1 million, the University of California’s $5.4 million, and the University of Wisconsin’s $5 million.
Collegiate research and development intensified the competition between rival inventors. Several researchers and industries vied for the rights to the superconducting material yttrium barium copper oxide. Former University of Houston colleagues Paul Chu and M. K. Wu each claimed to have been the first to describe that material in a paper they authored. Researchers at the Naval Research Laboratory, American Telephone and Telegraph Company, and IBM also claimed to have been the first to identify the material. USPTO representatives alerted MIT researchers who had formed the American Superconductor Corporation of the material’s
patent status, realizing that their forthcoming patent for a technique to make wiring using yttrium barium copper oxide might be challenged. Professors realized that if their work generated substantial income, they might receive more funds for future research, and many researchers diverted their focus from pure science to developing applications of existing inventions. Industries gave several hundred million dollars to universities for research, including $37 million to MIT in 1987. Some academic researchers established companies to manufacture and sell outstanding inventions considered commercially promising, with the MIT technology licensing office owning percentages of those companies. A 1980 federal law stated that universities owned outright intellectual property they developed with federal funds. Researchers recognized that a variety of fields, including biology and electronics, influenced the invention process. Schools encouraged researchers to pursue electronic, biomedical, and genetic inventions, as exemplified by the Biolistics gene gun, created by Cornell University researchers in 1983 to insert genetic information into cells quickly, which was leased to bioengineering companies.
Inventive Culture
Inventors secured grants from states and other sources. During the 1980’s, the Hawaii Invention Development Loan Program, recognizing the potential benefits to the state economy, donated to public libraries books and other sources of information explaining the theory and practice of patenting and marketing inventions. A state committee reviewed inventors’ applications to approve loans of up to $50,000. Other states implemented similar incentives.
During the 1980’s, many inventors sought camaraderie in such professional organizations as the American Society of Inventors, the National Society of Inventors, the Society of American Inventors, Inventor’s Workshop International, and regional, state, and Canadian inventing groups. They read journals, such as American Inventor and The Lightbulb, distributed by those groups and other publishers. Expositions displayed new inventions. Starting in 1982, Valérie-Anne Giscard d’Estaing compiled yearly editions of invention almanacs in France, releasing the first U.S. volume of The World Almanac Book of Inventions in 1985 and holding an invention contest for readers. Schools and companies sponsored invention competitions and fairs for children. The National Inventors Hall of Fame inducted such notable 1980’s U.S. inventors as Andrew J. Moyer, whose two patents facilitated penicillin production. In 1980, the USPTO commemorated the first fifty years of plant patents.
Impact U.S. inventions during the 1980’s advanced previous standards of transportation, communications, and biotechnology and permeated American culture. People enjoyed playing portable video games, conducting business with mobile cellular phones, and using personal computers. Science fiction depicted new inventions and suggested future innovations. Inventions such as lasers and genetically engineered microorganisms surrounded people in their daily lives, even when they were unaware of them. Many 1980’s inventions contributed to U.S. economic growth by enabling new kinds of employment, such as telecommuting, and by providing electronic goods for which market demand was consistently strong. Consumers, however, rejected some inventions, because they did not consider them necessary, affordable, or appealing. Although some inventions created new types of jobs, such as programming computers, others displaced laborers by performing their tasks more efficiently. Occasionally, inventions provoked a backlash, as when genetic engineering upset people who believed that it was unethical. Some inventions enabled malicious behavior, as when computer hackers broke into online systems or created viruses to destroy data or disrupt networks.
Subsequent Events Many 1980’s inventions provided the foundation for future developments and innovations in the next decade and early twenty-first century. For example, Kodak’s 1986 invention of the initial megapixel sensor preceded the development of digital cameras and quality photography not requiring film and processing. Legal victories protecting patents facilitated the emergence of new scientific and technological fields, such as genetic engineering, which thrived and set precedents for further investigations.
Further Reading
Brown, David E. Inventing Modern America: From the Microwave to the Mouse. Foreword by Lester C. Thurow. Introduction by James Burke. Cambridge, Mass.: MIT Press, 2002. This Lemelson-MIT Program for Invention and Innovation publication describes the work and impact of several 1980’s inventors.
Brown, Kenneth A. Inventors at Work: Interviews with Sixteen Notable American Inventors. Foreword by James Burke. Redmond, Wash.: Tempus Books of Microsoft Press, 1988. Profiles mostly male U.S. inventors, including those whose inventions were significant during the 1980’s, providing insights about their ideas and development.
Evans, Harold, with Gail Buckland and David Lefer. They Made America: From the Steam Engine to the Search Engine—Two Centuries of Innovators. New York: Little, Brown, 2004. This detailed text accompanied a PBS documentary featuring invention. The section on the digital age examines computational, genetic, medical, and telecommunications inventions.
Gausewitz, Richard L. Patent Pending: Today’s Inventors and Their Inventions. Old Greenwich, Conn.: Devin-Adair, 1983. The author is an engineer, inventor, and patent attorney, who explains why the legal system for patents in the early 1980’s needed reform, providing case histories depicting various patent-law issues.
Macdonald, Anne L. Feminine Ingenuity: Women and Invention in America. New York: Ballantine Books, 1992. Provides information about often-overlooked female inventors in the 1980’s who contributed to successful biomedical and aerospace endeavors, in addition to women innovators who created domestic inventions.
Petroski, Henry. Invention by Design. Cambridge, Mass.: Harvard University Press, 1998. Discusses how inventors’ personalities, culture, politics, and economics influenced several 1980’s inventions.
_______. Success Through Failure: The Paradox of Design. Princeton, N.J.: Princeton University Press, 2006. Explores how design mistakes inspire invention improvements, discussing the 1980’s development of PowerPoint software and the construction and deployment of the space shuttle.
Van Dulken, Stephen. American Inventions: A History of Curious, Extraordinary, and Just Plain Useful Patents. Washington Square, N.Y.: New York University Press, 2004. Several 1980’s inventions are included in this discussion of patents for toys, exercise equipment, entertainment devices, and hygiene and beauty items.
_______. Inventing the Twentieth Century: One Hundred Inventions That Shaped the World—from the Airplane to the Zipper. Introduction by Andrew Phillips. Washington Square, N.Y.: New York University
Press, 2000. Documents each decade of the twentieth century in a separate chapter, providing patent illustrations and inventions’ histories. Suggests resources, both print and online, to research patents issued in thirty-nine countries.
Elizabeth D. Schafer
See also
Artificial heart; Bioengineering; Business and the economy in Canada; Business and the economy in the United States; Camcorders; Cell phones; Compact discs (CDs); Computers; Fax machines; Medicine; Prozac; Robots; Science and technology; Space exploration; Superconductors.
■ Iran-Contra affair The Event
A scandal results from illegal, covert arms trafficking among the United States, Iran, and Nicaraguan rebels
Date Scandal broke in 1986
The Iran-Contra affair was orchestrated in order to circumvent the will of Congress, which had forbidden the Reagan administration from continuing military aid to the right-wing rebel army in Nicaragua. The ensuing scandal raised serious questions about the abuse of presidential powers in foreign affairs and the extent of congressional oversight of foreign affairs. It also demonstrated the extent to which the Cold War mentality of the 1980’s was able to justify dealings with and support for almost any regime that was anticommunist, including hostile fundamentalist Islamic regimes.
From 1937 to 1979, Anastasio Somoza García and his two sons, Luis Somoza Debayle and Anastasio Somoza Debayle, ran a brutal dictatorship in Nicaragua. In the wake of the great earthquake of 1972, which leveled much of the capital city of Managua, the world reacted not only to the resulting devastation but also to the shocking corruption of the Somoza regime. Massive humanitarian relief supplies were shipped to the country, only to be reshipped by the Somozas for sale abroad. A broad-based anti-Somoza internal opposition arose in Nicaragua. The opposition called itself the Sandinista National Liberation Front, after Augusto César Sandino, a revolutionary leader of the late 1920’s and early 1930’s whom Somoza García had ordered assassinated. In 1979, following a bloody civil war, Somoza Debayle fled to the United States, while elements of his National Guard crossed
over the border into Honduras to organize the counterrevolutionary Contra movement. Leadership of the new Sandinista regime was placed in the hands of Daniel Ortega, a member of the Sandinista left wing. Ortega’s government was supported in its resistance to the Contras by Fidel Castro, who sent military aid and advisers to Nicaragua.
The replacement of the pro-United States Somoza regime by one sympathetic to Castro was only one hemispheric problem worrying the new U.S. president, Ronald Reagan. A second was the replacement of a friendly government in Grenada by one friendly to Castro under Maurice Bishop. At the same time, a guerrilla war raged in El Salvador against the military junta that had seized power. The guerrillas received support from the Sandinistas. From cold warrior Reagan’s perspective, there was an evident Russian-Cuban-Nicaraguan connection. The Ortega regime had to be stopped. Under the so-called Reagan Doctrine, anticommunist movements worldwide were to be supported.
Aiding the Contras In March, 1981, the Central Intelligence Agency (CIA) helped organize and finance a movement, composed of ex-Somoza National Guard members and disenchanted former Sandinistas, to destabilize and topple the Ortega government. Operating from bases in Honduras, approximately fifteen thousand Contras were trained to launch raids on bridges, fuel depots, food storehouses, and a host of other “soft” targets. The aim of these activities was to destabilize the Ortega regime by causing widespread shortages of daily commodities—essentially, to make civilians suffer until they rejected a government that could not protect them.
Opposition to this strategy grew in the United States as the human suffering in Nicaragua increased. Reports surfaced of direct attacks on civilians by the Contras, as well as of violent repressive measures being taken by the Sandinista regime to suppress the counterrevolution. Neither type of report could easily be corroborated. President Reagan saw the Contras not as abusers of human rights but rather as “the moral equivalent of our Founding Fathers.” By 1983, the targets of these “freedom fighters” dramatically expanded. The CIA orchestrated the mining of Nicaraguan harbors to prevent overseas trade. This tactic was condemned by the World Court at the Hague as violating international law—a conclusion rejected by the Reagan
administration. The mines, and the refusal to accept the World Court's ruling, motivated the U.S. Congress to act. Revising a law known as the Boland Amendment (first passed in 1982), Congress banned the U.S. government from spending any funds to support military or paramilitary actions by the Contras. In 1984, free elections, monitored by international observers, were held in Nicaragua. Rather than compete in the elections, however, the Reagan-backed Contra leaders boycotted them. Ortega was elected president of Nicaragua when his party received 67 percent of the vote. These events seemed to strengthen the moral weight of the Boland Amendment, as the Contras had now refused to participate in their nation's democratic process and chosen instead to use violence against a democratically elected socialist government.
Circumventing the Ban
Despite the elections and the legal ban, President Reagan or his subordinates decided to continue supporting the Contras, eventually triggering a major scandal. Coordination of the Contra movement was transferred from the CIA to the National Security Council (NSC), headed successively by Robert McFarlane (1983-1985) and Vice Admiral John Poindexter (1985-1986). Lieutenant Colonel Oliver North, a U.S. Marine who had worked for the NSC since 1981, would serve as the chief liaison with the Contras. Elaborate schemes were devised to create and channel foreign and private finances into a slush fund so that funding of the Contras could continue. The strangest of these schemes involved the covert sale of arms to Iran, the United States' bitter enemy since the seizure of the U.S. embassy staff in 1979. Iran, locked in a major war with Iraq, was in desperate need of modern equipment and could not look to the Soviet Union for assistance. As originally set up by Michael Ledeen (a consultant to Robert McFarlane), the operation called for modern antitank missiles and other military supplies to be sent through Israel, which would sell the missiles to Iran and return the money to the Contra slush fund.
Arms for Hostages? On November 13, 1986, President Ronald Reagan addressed the nation, responding to reports that the United States had supplied weapons to Iran in exchange for the release of American hostages in Lebanon: The charge has been made that the United States has shipped weapons to Iran as ransom payment for the release of American hostages in Lebanon, that the United States undercut its allies and secretly violated American policy against trafficking with terrorists. Those charges are utterly false. The United States has not made concessions to those who hold our people captive in Lebanon. And we will not. The United States has not swapped boatloads or planeloads of American weapons for the return of American hostages. And we will not. Other reports have surfaced alleging U.S. involvement: reports of a sealift to Iran using Danish ships to carry American arms; of vessels in Spanish ports being employed in secret U.S. arms shipments; of Italian ports being used; of the U.S. sending spare parts and weapons for combat aircraft. All these reports are quite exciting, but as far as we’re concerned, not one of them is true. During the course of our secret discussions, I authorized the transfer of small amounts of defensive weapons and spare parts for defensive systems to Iran. My purpose was to convince Tehran that our negotiators were acting with my authority, to send a signal that the United States was prepared to replace the animosity between us with a new relationship. These modest deliveries, taken together, could easily fit into a single cargo plane. They could not, taken together, affect the outcome of the six-year war between Iran and Iraq nor could they affect in any way the military balance between the two countries. Those with whom we were in contact took considerable risks and needed a signal of our serious intent if they were to carry on and broaden the dialog. At the same time we undertook this initiative, we made clear that Iran must oppose all forms of international terrorism as a condition of progress in our relationship. The most significant step which Iran could take, we indicated, would be to use its influence in Lebanon to secure the release of all hostages held there.
Suspicious that the activity might not have official U.S. approval, Israel ceased its cooperation after three shipments. As an alternate plan, the NSC approved the direct sale of a massive quantity of missiles and other arms to Iran at greatly inflated prices. The proceeds would go directly to the Contras without arousing congressional suspicions. As a secondary motive, it was believed that the arms shipments would gain the cooperation of the Iranian Shiite government. Six Americans were being held as hostages by the Lebanese Shiite paramilitary group Hezbollah, and members of the Reagan administration hoped that Iran might pressure its fellow Shiites to release those hostages.
Contragate Swings Open  For a time, the covert funding plan worked flawlessly. In October, 1986, however, a plane carrying arms to the Contras was shot down over Nicaragua, and on November 3, 1986, a Lebanese magazine published a story revealing that arms were being shipped to Iran in exchange for the release of the hostages held in Lebanon by Hezbollah. Iran soon confirmed that it was receiving arms shipments from the United States. President Reagan was forced to give a televised address on November 13, admitting that weapons were being supplied to Iran but denying that the sales were made in exchange for hostages. Rather, he claimed that they were designed simply to breed goodwill between the two nations and that the best way for Iran to demonstrate its goodwill would be to secure the release of the hostages. In the meantime, North and his secretary, Fawn Hall, were busily shredding documents in their offices. Finally, on November 25, the day on which Admiral Poindexter submitted his resignation and Lieutenant Colonel North was unceremoniously fired, Attorney General Edwin Meese III revealed that the purpose of selling arms to Iran was to create funds to aid the Contras. As questions reverberated through the press and congressional criticism mounted, President Reagan created a presidential commission headed by former senator John Tower. The Tower Commission was to investigate the Iran-Contra affair (which would come to be known as both Contragate and Irangate, an allusion to the Watergate scandal of the previous decade). It was also tasked with investigating the general operations of the NSC. Meanwhile, a variety of congressional
committees held hearings of their own. By the end of 1986, Lawrence E. Walsh had been appointed as a special prosecutor with the power to investigate and prosecute crimes committed during the Iran-Contra affair. The Tower Commission worked with considerable speed and completed its report on February 26, 1987. Both North and Poindexter, as well as Defense Secretary Caspar Weinberger, were severely criticized for their roles in the affair. Reagan came under criticism for not properly supervising the NSC. This failure was blamed, however, on his general disengagement from the day-to-day operations of his administration. Little was said in the report about the former director of central intelligence and then-vice president, George H. W. Bush. One week after the issuance of the Tower Report, President Reagan addressed the nation, expressing his regrets for the Iran-Contra affair. He admitted that covert arms sales had taken place to create funds for the purchase of weapons for the Contras. Indeed, of the $30 million paid by Iran for weapons, $12 million had been returned to the government and $18 million went to support the Contras. He pointed to both Poindexter and North as the individuals responsible for the operations. Both men were questioned extensively by Congress during the summer of 1987; they were granted immunity in return for their testimony, as Congress was seeking to establish the responsibility of their superiors for the affair. The hearings were nationally televised, and North in particular became a media celebrity.
Indictment, Conviction, Absolution  In 1988, North was indicted by a grand jury on twelve counts, and in 1989 he was convicted of three. However, his convictions were overturned on appeal. The prosecution claimed that no information used in the trial had been obtained from North's testimony before Congress, which would have been a violation of his immunity deal. The appellate court, however, ruled that his nationally televised testimony had tainted the prosecution irremediably, as there was no way to prove that trial witnesses had not been influenced by the information revealed in the congressional hearings or that the prosecution had not used that information to develop its investigation. Poindexter was convicted in 1990 of felonies including obstruction of justice, conspiracy, and lying to Congress. His conviction was also overturned as a result of his immunity deal and public testimony.
On April 13, 1989, the Kerry Committee Report was released by Senator John Kerry, revealing that a major source of funding for the Contras had been Latin American cocaine traffickers, who received in exchange protection from law-enforcement activity. The same planes that flew shipments of arms from the United States to the Contras were used to import cocaine and other drugs into the United States on their return flights. The drug connection became a major source of controversy, especially in the context of the Reagan administration's Just Say No anti-drug campaign.
Impact
The Iran-Contra scandal fascinated the public for a time, but, unlike the Watergate scandal, it was soon forgotten. President Reagan's approval rating plunged to 46 percent in 1986 as a result of the affair; however, it rose to 63 percent by 1989, a farewell rating that at the time only Franklin D. Roosevelt had exceeded. Reagan acquired the nickname "Teflon president," because none of the charges tainting the rest of his administration seemed to stick to him. There was little in the way of public outcry when President Bush later pardoned the major individuals involved in Iran-Contra. By then, the affair was only a vague memory, and it was associated with the perceived realities of a Cold War era that had recently come to an end. Bush was even able to appoint individuals implicated in the Iran-Contra affair to positions in his administration. Indeed, from a broader perspective, the obsession with events in Nicaragua revealed the extent to which anticommunist ideology dominated U.S. political life. The affair was a fitting sequel to the invasion of Grenada and an appropriate final chapter in the extreme behavior engendered by the Cold War. As a beginning chapter in dealing with the threat of hostile fundamentalist Islamic regimes, however, the sale of missiles to Iran indicated a glaring lack of awareness. While the missile deal did result in the freeing of three U.S. citizens held captive by Hezbollah in Lebanon, three more hostages were immediately seized to take their place.
Subsequent Events
By 1992, Robert McFarlane had been convicted of crimes related to the Iran-Contra affair, and Caspar Weinberger had been indicted and was awaiting trial. In December, 1992, President Bush pardoned them both, as well as four other officials involved in the scandal. The independent prosecutor's report, issued in 1994, revealed that both President Reagan and Vice President Bush had some knowledge of
what was going on in the Iran-Contra affair and had a role in the cover-up.
Further Reading
Draper, Theodore. A Very Thin Line: The Iran-Contra Affair. New York: Simon & Schuster, 1992. Contains keen analysis of the affair based on a thorough examination of documentary sources. Bibliography, footnotes, index, and sixteen pages of pictures.
Walsh, Lawrence E. Firewall: The Iran-Contra Conspiracy and Cover-Up. New York: W. W. Norton, 1998. Detailed chronological account and analysis by the special prosecutor in the Iran-Contra affair. Includes index.
Webb, Gary. Dark Alliance: The CIA, the Contras, and the Crack Cocaine Explosion. New York: Seven Stories Press, 1998. Heavily researched but controversial study of the CIA-drug connection by an investigative reporter. Index and table of contents.
Woodward, Bob. Veil: The Secret Wars of the CIA, 1981-1987. New York: Simon & Schuster, 2005. One of the top U.S. investigative journalists analyzes the nature and scope of the Iran-Contra affair and other covert activities. Index, bibliography, and footnotes.
Irwin Halfond
See also
Cold War; Foreign policy of the United States; Iranian hostage crisis; Latin America; Middle East and North America; North, Oliver; Poindexter, John; Reagan, Ronald; Reagan Doctrine; Scandals.
■ Iranian hostage crisis
The Event  Extremist Muslim students take Americans hostage
Date  November 4, 1979, to January 20, 1981
Place  U.S. embassy, Tehran, Iran
The Iranian hostage crisis represented the United States' first confrontation with Shiite fundamentalist extremists. Taking place in a nation long considered to be one of America's closest allies in the Middle East, the crisis revealed the extent of anti-Americanism in the region. Failure to gain the release of the hostages was a national humiliation, and it became a major factor in the defeat of President Jimmy Carter in his 1980 reelection bid.
Since 1953, Mohammad Reza Shah Pahlavi's Iran had stood second only to Israel as the United States' closest ally in the Middle East. The shah's loyalty had been partly secured in 1953, when he was involved in deposing Prime Minister Mohammad Mossadegh at the behest of the United States and the United Kingdom after Mossadegh had nationalized Iran's oil industry. Because the shah seemed to be firmly in power during the 1970's, there was little U.S. concern over mounting demonstrations against his regime in 1978. The demonstrations originated on both sides of the political spectrum: Liberal reformers and leftists were alienated by the shah's repressive tactics and desired the institution of democracy, while religious traditionalists were incensed by the pro-Western policies of the shah's regime. By December, 1978, a mass protest of nearly one million people clearly indicated that changes had to be made. The shah appointed a reformist government to placate the masses and left Iran in January, 1979,
going into exile. Continuing demonstrations led to the return of Ayatollah Khomeini, a religious leader who had been exiled from Iran since 1964. By April, an Islamic republic under Khomeini had been formed. On October 22, 1979, the shah traveled to the United States for cancer treatment. Suspicion and anger flared. Khomeini denounced the United States as the "Great Satan" and an enemy of Islam, as students burned American flags in the streets. Mass protests were launched daily in front of the U.S. embassy. Used to the large number of demonstrations taking place, the embassy staff did not suspect that a group of three hundred students planned to take part in one of the demonstrations, cut the chains that bound the embassy gates, and then break into the compound. The plan, executed on November 4, worked flawlessly. The students were not fired on by embassy guards, who instead joined staff in destroying sensitive communications equipment and shredding documents.
In the wake of Operation Eagle Claw, the scorched remains of a C-130 aircraft lie in the Iranian desert. After the operation was aborted, a Marine helicopter collided with the plane, killing eight servicemen. (AP/Wide World Photos)
The scene was reminiscent of the last hours of the U.S. embassy in Saigon, Vietnam.
444 Days of Captivity  The students seized control of the embassy compound, and they took hostage sixty-three members of the embassy's staff, as well as three other Americans who were present. While the U.S. population raged at this flouting of the basic principles of international law, the government remained calm. President Jimmy Carter fruitlessly asked for the release of the hostages on humanitarian grounds. He then decided to exert pressure. On November 12, oil imports from Iran were terminated. Two days later, $8 billion in Iranian assets were frozen in U.S. banks. These tactics had no apparent effect. On November 19, the students released thirteen of their hostages, all women or African Americans, claiming it was a gesture of solidarity with repressed minorities and of Islamic respect for women. No other hostages were released. Months passed. As the crisis dragged on, blindfolded hostages were frequently paraded before the press for the edification of the Iranian people. An American public still traumatized by the disastrous outcome of the war in Vietnam watched in horror as its government groped for a strategy to gain release of the hostages. Clearly, Carter found it impossible to give in to the students' demands. Instead, an ambitious rescue mission code-named Operation Eagle Claw was planned. Eagle Claw was to be the first (known) mission undertaken by Delta Force, an elite counterterrorism unit of the U.S. Army formed under orders from Carter in 1977. On April 24, 1980, eight U.S. Marine Corps helicopters landed in the desert near Tabas for use in the mission. Disaster soon struck. One of the helicopters was damaged while landing, and two more broke down in a sandstorm. A decision was soon made to abort the mission, but as one helicopter took off, it sideswiped a transport plane. Eight servicemen lay dead. Compounding the public humiliation caused by the failure of the mission, the Iranian government gloated, claiming the debacle was the result of divine intervention.
Settlement  On July 11, 1980, the hostage-takers released Richard Queen, who had become seriously ill. (He was later diagnosed with multiple sclerosis.) The shah died of cancer on July 27, thus rendering moot the students' demand that he be returned to Iran to stand trial. Iran faced a host of other problems, including a major war with Iraq that began in
September, 1980. Nevertheless, Iran held the hostages through early November, when Ronald Reagan was elected president. Algerian intermediaries were used to broker an agreement that was concluded on January 19, 1981. The terms of the negotiations remained subject to dispute. It was claimed that the United States promised not to intervene in Iranian affairs, to release the $8 billion in frozen Iranian assets, and to grant Iran immunity from lawsuits arising out of the seizure. Subsequently, the United States sold weapons to Iran, and the Reagan administration repeatedly denied charges that there had been a secret agreement to provide those weapons in return for the hostages. Those on the far left, meanwhile, claimed that the weapons were payment not for releasing the hostages but for delaying their release until after the election, thereby assuring Reagan's victory. Such allegations were never proven, but they became one of the many conspiracy theories that helped define the decade. On January 20, 1981, as Reagan was completing his inaugural speech, the hostages were flown out of Iran to Algeria and then on to West Germany, where former president Jimmy Carter, sent by Reagan as a special emissary, was waiting to receive them. The 444 days of captivity were at an end, for both the hostages and the nation.
Impact  The Iranian hostage crisis was a major humiliation for a nation that still had not recovered from the Vietnam War. It was a debacle that led to a feeling of national impotence that proved lethal to the reelection bid of Jimmy Carter. On the other hand, the affair provided a kick start for Ronald Reagan, whose presidency began with the hostages' release within its first hour. Bitterness felt toward Iran was manifest in U.S. support for Iraq in the Iran-Iraq War. However, this bitterness did not prevent the Reagan administration from covertly selling missiles to Iran to gain illegal funds that were funneled to aid the Contras fighting to overthrow a pro-Marxist regime in Nicaragua. In spite of events in Iran, the United States was still preoccupied with the Cold War, and it saw any chance to resist communism as a chance worth taking, even if it strengthened the military capabilities of an anti-American extremist Muslim regime. In addition to its geopolitical effects, the hostage crisis had a direct and lasting impact upon U.S. military structure and counterterrorist strategy and tactics. In the wake of the Operation Eagle Claw
debacle, the U.S. Army sought to ensure that a similar air disaster involving special operations forces would never be repeated. It established the 160th Special Operations Aviation Regiment (the Night Stalkers), which was specially trained and tasked with providing air transportation and support for special operations forces, as well as for general forces. In addition, a unified command, dubbed the U.S. Special Operations Command (USSOCOM), was created in 1987 to oversee the special operations forces of all four branches of the military. This unified command was first conceived in response to Operation Eagle Claw, and it arose out of the passage of the Goldwater-Nichols Act of 1986, which established a unified command structure for all U.S. armed forces.
Bowden, Mark. Guests of the Ayatollah: The Iran Hostage Crisis, the First Battle in America's War with Militant Islam. Berkeley, Calif.: Atlantic Monthly Press, 2006. Chronological analysis of the event, filled with riveting narrative.
Farber, David. Taken Hostage: The Iran Hostage Crisis and America's First Encounter with Radical Islam. Princeton, N.J.: Princeton University Press, 2004. Contains excellent historical background and analysis of anti-U.S. sentiment in the Muslim world.
Houghton, David P. US Foreign Policy and the Iran Hostage Crisis. Cambridge, England: Cambridge University Press, 2001. Scholarly political analysis of policy making, with interviews of major decision makers.
Irwin Halfond
See also
Canadian Caper; Foreign policy of the United States; Goldwater-Nichols Act of 1986; Iran-Contra affair; Middle East and North America; Reagan, Ronald.
■ Irving, John
Identification  American novelist
Born  March 2, 1942; Exeter, New Hampshire
A best-selling and critically acclaimed novelist, Irving published work during the 1980’s that helped readers understand issues that were important to the decade, such as changing sexual and family roles, abortion, and the aftermath of the Vietnam War.
John Irving. (© Marion Ettlinger)
John Irving became well known to readers with the critical and financial success of his fourth novel, The World According to Garp (1978). The novels he published during the 1980's, The Hotel New Hampshire (1981), The Cider House Rules (1985), and A Prayer for Owen Meany (1989), were similarly successful, both because readers found them fascinating to read and because they dramatized issues with which many readers themselves were grappling. Each book was also made into a film. Irving's work was inspired by his favorite novelist, the Victorian writer Charles Dickens. Like Dickens, Irving devoted much time and effort to character development, resulting in lengthy novels. Irving also learned from Dickens and from Graham Greene, another of his heroes, that emotional involvement with the characters was the best way to engage a reader's attention. Because Irving's novels often mimic the structure and emotional appeal of nineteenth-century realist texts, rather than engaging in more experimental modes, they represent the world in a way that was comfortable rather than threatening to many readers of the 1980's.
Irving’s friendship with fellow best-selling author Kurt Vonnegut is reflected in the plots of several of Irving’s novels. Unlike his mode of storytelling, these plots are not at all comfortable. As in Vonnegut’s books, Irving’s characters are frequently overwhelmed by forces they cannot control and must seek new ways to cope with problems, because the old solutions will not work. In The Hotel New Hampshire, a family tries desperately to stay together in the face of a world full of chaos and violence. The title is a reference to three hotels the family operates, none of which is successful, a metaphor for their declining life. The “cider house rules” referred to in the book of that title are posted in a cider house where one of the main characters works. He finds that no one follows the rules; the actual rules are determined by the dominant male worker. Similarly, the main character begins the book opposed to abortion; by the end, he finds that the rules with which he started out his career do not apply, and he performs abortions. The title character of A Prayer for Owen Meany wants to be a hero of the Vietnam War and save the Vietnamese people. He does help save them, but not in Vietnam and not from communism. Rather, he saves Vietnamese immigrants in Arizona from a homicidal American. Impact Irving used the realist novel form continually to remind his readers that nothing is as simple as it seems. As an author successful with both readers and critics, he helped define the literary culture of the 1980’s, and he intertwined that culture with issues of concern during the decade, reinforcing the sense that literature could be relevant to one’s daily experience and could engage with a society increasingly defined by its popular, visual media. Further Reading
Campbell, Josie P. John Irving: A Critical Companion. Westport, Conn.: Greenwood Press, 1998.
Davis, Todd F., and Kenneth Womack, eds. The Critical Response to John Irving. Westport, Conn.: Praeger, 2004.
Reilly, Edward C. Understanding John Irving. Columbia: University of South Carolina Press, 1991.
Jim Baird
See also  Abortion; Book publishing; Literature in the United States.
■ Israel and the United States
Definition  Diplomatic, economic, and strategic relations between the United States and Israel
The special relationship between the United States and Israel continued during the 1980's, as, despite some difficulties, Washington supported the Jewish state militarily and politically against its hostile Arab neighbors while seeking a road to peace in the Middle East.
Throughout the 1980's, American officials at the national, state, and local levels overwhelmingly expressed support for Israel and its security, while hoping that the peace initiatives of the 1970's, known as the Camp David Accords, would continue. Behind the scenes, however, while the United States continued to assure Israel of its support, there were also some disagreements and confrontations. Ronald Reagan was one of Israel's staunchest presidential supporters. His commitment to the Jewish state stemmed from political, historical, and strategic U.S. interests, as well as his own personal religious feelings toward the biblical homeland of the Jews, to which he often referred in speeches. Reagan also looked on Israel as an important ally in his confrontation with the Soviet Union. Washington established communication, security, and economic links with Tel Aviv. The administration expanded the U.S.-Israel Strategic Cooperation Agreement to include the Joint Political-Military Group (JPMG), which oversaw joint programs involving military and intelligence affairs. It also established the Joint Security Planning Group (JSPG) to plan for Israel's security needs within existing budget constraints. Washington also introduced a U.S.-Israeli free trade agreement that afforded Israel the same status as Western Europe, leading to increased trade between the two countries. In 1985, the president authorized a $1.5 billion grant to alleviate Tel Aviv's runaway inflation. The grant set a precedent that continued throughout the decade and beyond, as more financial and military aid was given to Israel annually. Washington did not always move in lockstep with Tel Aviv, however. In March, 1980, when Jimmy Carter was still in office, the United States joined other members of the U.N. Security Council in condemning Israeli settlements on Palestinian land. The United States did still support Tel Aviv at the United Nations when that body condemned the extension of Israeli law
to the Golan Heights. Furthermore, some of President Reagan's chief advisers—including Vice President George H. W. Bush, Secretary of Defense Caspar Weinberger, and White House Chief of Staff James Baker—were not enthusiastic about Israeli policies, and they often persuaded the president to oppose Tel Aviv on specific issues. When an Israeli air strike destroyed Iraq's nuclear reactor in 1981, Washington suspended delivery of F-16 Fighting Falcons to the country. The U.S. ambassador to the United Nations, Jeane Kirkpatrick, worked with the Iraqi delegation to support a Security Council resolution condemning the attack. The administration, with the exception of Secretary of State Alexander Haig, did not support Israel's concern over a buildup of Palestine Liberation Organization (PLO) forces in southern Lebanon in the early 1980's. Thus, when Prime Minister Menachem Begin visited Washington, the president sought his assurances that Israel would not violate Lebanon's sovereignty to strike at the PLO. Begin—who disagreed with such U.S. peace initiatives as formal recognition of the PLO and providing Palestinian autonomy under Jordan—provided Reagan with some limited assurances that he would leave Lebanon alone, but in June, 1982, Israeli forces crossed the border, invading that country. Initially, the United States refrained from condemning the invasion, but it soon sent strong messages to Tel Aviv, calling for the cessation of hostilities. In August, the United States joined a multinational peacekeeping operation to oversee the evacuation of PLO forces from southern Lebanon. U.S. envoy Morris Draper helped supervise talks between Israel and Lebanon in December. Later that month, Washington called for a freeze on new Jewish settlements on occupied Palestinian land. George P. Shultz, regarded as less friendly to Israel than Haig, had become secretary of state in July, but he, too, proved to be a supporter of Tel Aviv. In 1984, Washington and Tel Aviv disagreed over the usefulness to the peace process of a meeting between PLO leader Yasir Arafat and Egyptian president Hosni Mubarak. Israeli prime minister Yitzhak Shamir's pragmatism, however, encouraged better relations by the end of the year. In 1988, Washington supported a U.N. resolution condemning Israel's expulsion of Palestinians into Lebanon. That same year, however, the United States was the only nation to join Israel in objecting to Arafat's desire to
address the United Nations. Washington refused to provide him with an entrance visa for that purpose; the General Assembly instead convened in Geneva to hear him, and U.S. officials soon afterward opened talks with the PLO. One of the most disturbing conflicts between the two countries occurred in 1985, when Jonathan Pollard, a U.S. naval intelligence analyst, was arrested and convicted of selling secrets to Israel. In another case, in 1986, John Demjanjuk, a Cleveland autoworker, was extradited to Israel, where he was tried and convicted of being a concentration camp guard during World War II. Demjanjuk and his supporters, who included Holocaust deniers, claimed he had been misidentified, and his case continued through twists and turns for many years.
Impact  The United States, of all members of the global community, had the most influence on Israel. The links of the American Jewish community to the Israeli state played a role in American politics and provided financial and moral support to Tel Aviv. Furthermore, the sympathy for Israel generated by memories of the Holocaust and by its status as a thriving democracy in the region resulted in a great deal of support among the American public at large. The Reagan and Bush administrations were somewhat divided over Israel. While President Reagan was wholeheartedly and genuinely committed to the Jewish state, others in his administration were frustrated over Tel Aviv's policies and wished for a more balanced approach.
Aruri, Naseer H., Fouad Moughrabi, and Joe Stork. Reagan and the Middle East. Belmont, Mass.: Association of Arab-American University Graduates, 1983. Analysis from the Palestinian point of view, accusing the United States of an unfair bias toward Israel.
Carter, Jimmy. Palestine: Peace Not Apartheid. New York: Simon & Schuster, 2006. Controversial assessment by the former president of U.S. policies toward Israel, including those of the Reagan administration as well as his own.
Novik, Nimrod. Encounter with Reality: Reagan and the Middle East During the First Term. Boulder, Colo.: Westview Press, 1985. Analysis of Reagan's initial Middle Eastern policies by a respected Israeli scholar.
Olive, Ronald J. Capturing Jonathan Pollard: How One of the Most Notorious Spies in American History Was
Brought to Justice. Annapolis, Md.: Naval Institute Press, 2006. Account of the notorious espionage case by an American naval intelligence analyst.
Quandt, William B. Peace Process: American Diplomacy and the Arab-Israeli Conflict Since 1967. Rev. ed. Berkeley: University of California Press, 2005. History of U.S. Middle Eastern diplomacy by a respected scholar of Middle Eastern studies.
Spiegel, Steven L. The Other Arab-Israeli Conflict: Making America's Middle East Policy, from Truman to Reagan.
Chicago: University of Chicago Press, 1985. Analysis of U.S. policy from the 1940's through the 1980's by a respected political scientist.
Frederick B. Chary
See also  Anderson, Terry; Beirut bombings; Foreign policy of the United States; Haig, Alexander; Iranian hostage crisis; Jewish Americans; Middle East and North America; Reagan, Ronald; USS Stark incident.
J
■ Jackson, Bo
Identification  Professional baseball and football player
Born  November 30, 1962; Bessemer, Alabama
Bo Jackson became one of the few athletes in modern times to excel in two different professional sports; he was recognized by fans of both as one of the greatest athletes in history.
After being drafted by the New York Yankees in the second round of the amateur draft in 1982, Vincent Edward Jackson opted to attend Auburn University
and play baseball and football. Nicknamed "Bo" by his brothers, Jackson became a college legend, amassing 4,303 career yards as a football running back and winning Most Valuable Player (MVP) awards in the 1983 Sugar Bowl and the 1984 Liberty Bowl. In 1985, he was a first-team All-American and was named the Heisman Trophy winner. On the baseball field, he hit with power and for a high average for Auburn. In the 1986 National Football League (NFL) draft, Jackson was chosen by the Tampa Bay Buccaneers as the number one selection.
Los Angeles Raider Bo Jackson makes a 45-yard run against the Kansas City Chiefs during a home game on October 15, 1989. (AP/Wide World Photos)
Since most baseball scouts thought that Jackson would opt to play professional football instead of baseball, he slipped to the fourth round in the 1986 Major League Baseball (MLB) amateur draft before being chosen by the Kansas City Royals. Jackson surprised almost everyone when he signed with the Royals. He debuted in the Royals outfield on September 2, 1986. He hit his first home run for the Royals on September 14, and it was the longest home run that had ever been hit in Royals Stadium. In 1987, Jackson hit 22 home runs for the Royals. He also played in seven NFL games for the Los Angeles Raiders and gained 554 yards for an outstanding average of 6.8 yards per carry. In 1988, he hit 25 home runs and stole 27 bases for the Royals and gained 580 yards in ten games for the Raiders. Jackson's best season in the major leagues was 1989, when he hit 32 home runs and drove in 105 runs. He was named to play for the American League in the 1989 All-Star Game, and he was named the MVP of the game. Just a few months after the baseball season ended, Jackson was selected to play in the NFL Pro Bowl. During his eight-year MLB career, Jackson hit 141 home runs, drove in 415 runs, and batted .250. In his four-year NFL career—which was ended by an injury in 1991—Jackson rushed for 2,782 yards, caught 40 passes for 352 yards, and scored 18 touchdowns.
Impact  Bo Jackson was an exceptional athlete and the first individual ever to be selected as an all-star in two professional sports. He has been heralded as one of the greatest two-sport athletes in history. He was a starting outfielder for the Kansas City Royals and a starting running back for the Los Angeles Raiders. In his first time at bat as an MLB All-Star, he hit a mammoth home run that traveled about 448 feet. In a single 1987 NFL game against the Seattle Seahawks, he rushed for 221 yards. As a result of his success in two professional sports, Jackson became a popular culture icon during the 1980's. He signed a major marketing deal with Nike, starring in the company's popular "Bo Knows" advertising campaign. He was also featured in the Nintendo video game Tecmo Super Bowl.
Further Reading
Devaney, John. Bo Jackson: A Star for All Seasons. New York: Walker, 1992.
Kramer, Jon. Bo Jackson. Austin, Tex.: Raintree Steck-Vaughn, 1996.
Alvin K. Benson
See also  African Americans; Baseball; Football; Sports.
■ Jackson, Jesse
Identification  African American clergyman, civil rights activist, and politician
Born  October 8, 1941; Greenville, South Carolina
When Jesse Jackson ran for president in 1984 and 1988, Americans were forced to consider a person of color as a serious candidate for that office.
In November of 1983, the Reverend Jesse Jackson announced that he was seeking the Democratic Party's nomination for president of the United States. He was forty-two years old. After the assassination of Martin Luther King, Jr., in 1968, Jackson had emerged as a dynamic leader in the civil rights movement. He had marched with King in the South and in Chicago. In 1976, he had created an organization, Push for Excellence (PUSH EXCEL), to combat teenage pregnancy and crime and to reverse high school dropout rates. He encouraged African Americans to strive for educational excellence, to take greater responsibility for their communities, and to boycott businesses guilty of racial discrimination. Although PUSH suffered many setbacks when conservative Republican Ronald Reagan was elected president in 1980, it continued to function during the 1980's.
The 1984 Presidential Campaign  Jackson saw his campaign for the Democratic nomination in 1984 as a continuation of the civil rights movement. He charged that Democrats had lost their focus and were "spineless" in dealing with a Republican administration that ignored the social and economic needs of disadvantaged Americans. Many African American leaders, including Coretta Scott King, were skeptical about the timing of Jackson's campaign. They feared a white backlash at the polls. However, Jackson had seen Republicans win by narrow margins in state after state and believed that, if Democrats could combat the apathy among African American voters, they could win the election. African American clergy encouraged his candidacy with the slogan "Run, Jesse, Run," and they worked to increase African American voter registration. Jackson excelled as an orator; he sought to give voice to the concerns of Americans of all races who felt left out of the political process, including 40 million poor whites, 6 million Hispanics, one-half million Native Americans, and millions of women.
Jesse Jackson in 1983. (Library of Congress)
He urged them to join his National Rainbow Coalition organization. Early in the campaign, Jackson attracted national attention when he traveled to Syria as a "self-appointed diplomat" to obtain the release of a downed African American Navy flier, Robert Goodman, Jr. When he succeeded, Jackson and Goodman received a hero's welcome from crowds at the airport in Washington, D.C. A reluctant President Reagan, who had offered Jackson no support, received them at the White House. Many criticized Jackson's bold mission as grandstanding, but he had demonstrated that he was not merely a civil rights activist. In June of 1984, Jackson engaged in more private diplomacy, traveling to Cuba, where he persuaded Cuban leader Fidel Castro to release twenty-two Americans and twenty-six anti-Castro Cubans who had been jailed there.
Jackson competed in the primary elections with little money but tremendous energy. He traveled tirelessly to speak in New Hampshire, Florida, Michigan, Illinois, and South Carolina. Although there were initially seven candidates for the Democratic nomination, after the first few primaries the field narrowed to three: Walter Mondale, Gary Hart, and Jackson. Mondale was clearly the front-runner, supported by many African American politicians within the party, but Jackson carried the inner-city African American vote, placing third in the northern primary races. He won primaries in Virginia, South Carolina, Louisiana, and the District of Columbia. In New York, he received 26 percent of the vote. More African Americans voted in New York than in any prior state election, and Jackson carried almost one-quarter of the Hispanic vote as well. Nationally, 3.5 million voters, more than 20 percent of the total vote, supported his nomination; he carried more than forty congressional districts. Over 2 million voters in the primaries were newly registered. Jackson's campaign, however, was plagued by serious blunders. His raw ambition, volatile temperament, and inclination to exaggerate contributed to his problems. In an "off the record" interview with African American reporters, Jackson used the ethnic slurs "Hymie" and "Hymietown" to refer to Jews and New York City. When these remarks were reported in The Washington Post, Jackson at first denied having made them; then he denied that they were ethnically insensitive. In late February, however, he made an emotional apology to Jewish leaders. He continued to apologize throughout the campaign, but irreparable damage had been done to his image as a leader of a diverse racial and ethnic coalition. Jackson further alienated American Jews and whites by his association with the black separatist leader of the Nation of Islam movement, Minister Louis Farrakhan, who made frequent statements attacking whites and Jews. Jackson survived numerous political storms and was a prominent figure at the Democratic National Convention, but he did not win the nomination. President Reagan easily won reelection in 1984 over the Democratic candidate, Walter Mondale, but a record number of African American voters turned out for Mondale, thanks in part to Jackson's efforts. In 1988, Jackson ran again against a number of Democratic hopefuls, drawing enormous crowds wherever he went. After early losses in Iowa and New Hampshire, he won several southern primaries,
including those in Alabama, Georgia, Louisiana, Mississippi, and Virginia. He gained an increasingly multiracial following, winning the Michigan caucuses and finishing second in Illinois and Wisconsin. Over one-fourth of Democrats voting nationwide supported him. However, any chance he had of winning the nomination was killed in New York. His earlier indiscreet remarks about Jews and concerns that an African American candidate could not be elected president resulted in his defeat by Governor Michael Dukakis of Massachusetts. Dukakis won the Democratic nomination but lost the general election to Republican George H. W. Bush.
Impact  Jackson's campaigns raised the possibility that an African American could one day become president of the United States and prepared many young African American activists to assume leadership roles in politics. His strong showing in the primaries and active involvement in the convention of the Democratic Party moved African Americans nearer to the center of American political life and broadened the participation of minorities in the political process.
Barker, Lucius J. Our Time Has Come: A Delegate's Diary of Jesse Jackson's 1984 Presidential Campaign. Urbana: University of Illinois Press, 1988. Close look at the day-to-day events of the 1984 campaign by a Jackson delegate to the Democratic National Convention.
Bruns, Roger. Jesse Jackson: A Biography. Westport, Conn.: Greenwood Press, 2005. Balanced biography that details Jackson's early life and includes a timeline of events and photographs; includes insightful discussion of the impact of Jackson's career.
Jakoubek, Robert. Jesse Jackson: Civil Rights Leader and Politician. Philadelphia: Chelsea House, 2005. Enjoyable volume in the series Black Americans of Achievement that focuses on Jackson's political life and includes color photographs.
Edna B. Quinn
See also
African Americans; Bush, George H. W.; Conservatism in U.S. politics; Dukakis, Michael; Elections in the United States, 1984; Elections in the United States, 1988; Hart, Gary; Mondale, Walter; Reagan, Ronald; Reagan Revolution.
■ Jackson, Michael
Identification  Pop music superstar
Born  August 29, 1958; Gary, Indiana
The innovation and charisma Michael Jackson brought to popular music and its presentation, particularly in pioneering large-scale dance productions in the new medium of music videos, made him the most successful and popular musical act of the 1980’s. Michael Jackson entered the 1980’s as a rising star. He had begun his career in childhood as lead singer of the family group, the Jackson Five, and, after a four-year hiatus, had reemerged with the enormously successful 1979 solo album Off the Wall. In 1982, his career soared to a dramatic new level with the release of Thriller. The album included seven hit singles and became the best-selling album in pop music history. Jackson and Music Videos
In promoting the album and its hit songs, Jackson took advantage of the music video, a recent innovation that was driving the cable television channel MTV. At the time, videos most often depicted bands performing their songs in unusual settings or accompanied those songs with abstract or suggestive visuals. It was relatively rare for them to tell stories. Jackson, however, developed videos with narrative structures that both illustrated and existed in counterpoint to his music. These videos often included spectacular dance productions and special effects. One of them, “Beat It,” the story of a loner’s attempt to stop a gang fight, became the first video by an African American performer to receive regular airplay on MTV. “Thriller,” a fourteenminute horror musical evocative of Night of the Living Dead, was packaged with a documentary on the making of the video to become the best-selling home music video in history. The album spent thirty-seven weeks at number one, won eight 1984 American Music Awards, and, along with his narrative for the storybook album E.T.: The Extra-Terrestrial, won for Jackson a record-setting eight Grammy Awards in 1984. In May, 1983, Jackson appeared on a television special celebrating Motown’s twenty-fifth anniversary wearing what would become his trademark—a single white sequined glove. He performed “Billie Jean,” during which he created a sensation when he introduced the “moonwalk,” a dance step in which he slid backwards while appearing to walk forward.
On the heels of his enormous popularity, Jackson signed a $15 million promotional contract with Pepsi in 1984, but he was seriously burned during an accident at the filming of a television commercial. Later that year, he reunited the Jackson Five for the five-month Victory Tour. Then, in January, 1985, he collaborated with Lionel Richie to compose "We Are the World," a three-time Grammy-winning song performed by a gathering of pop music's biggest stars to raise money for African famine relief. Following the 1987 release of Bad, which spawned seven more hit singles and topped the music charts for eight weeks, Jackson launched a sixteen-month worldwide tour that drew over 4.4 million fans.
Michael Jackson. (Paul Natkin)
By the end of the decade, each of Jackson's three solo albums had been certified multi-platinum, and he had released five platinum and three gold singles.
By the mid-1980’s, stories about Jackson appeared regularly in the tabloid press, and many of them portrayed him in an unflattering light. Reports told of his attempts to purchase the bones of Joseph Merrick, a Victorian-era man whose physical deformities led him to be known as the Elephant Man, as well as Jackson’s penchant for sleeping in a hyperbaric chamber in an effort to retard the aging process. Most controversial, however, was the dramatic change in his appearance. With each new album and his subsequent reemergence into the public eye, Jackson’s skin grew lighter, his nose and lips thinner, his cheeks higher, and his jaw line more angular; at one point, he even developed a cleft in his chin. Some questioned whether the alterations were an attempt on Jackson’s part to reject his African American heritage, but he explained the change in a 1988 autobiography as primarily the result of puberty and diet. Jackson’s attempts to expand his musical and personal empire also courted controversy. He purchased a controlling interest in the publishing company that owned the rights to most of the Beatles’ music, creating a serious rift with former Beatle Paul McCartney, who was still performing and with whom Jackson had collaborated to create several hit songs. He became more extravagant in his dress, wearing outfits with sashes and epaulets that resembled military costumes, and in his work projects, such as a short futuristic 3-D film for Disney, Captain EO, that cost over a million dollars per minute, at the time the most expensive film per minute ever produced. His 1988 autobiography shattered the myth of his family’s idyllic nature, when Jackson documented the physical abuse he suffered during his childhood at the hands of his father. That same year, he built a private amusement park and mansion in California that he named Neverland Ranch, where he regularly hosted sick children.
Impact  By the end of the decade, Jackson had become pop music's biggest international star, attaining a level of celebrity previously enjoyed in the rock era only by Elvis Presley and the Beatles. In addition to over one hundred music awards, including his Grammys, he was awarded a series of honors for notable achievements throughout the 1980's, including
MTV’s Vanguard Award for outstanding contribution to music video production, the Special Award of Achievement at the American Music Awards, humanitarian awards from the National Association for the Advancement of Colored People (NAACP) and the National Urban League, and Artist of the Decade awards from organizations and publications as diverse as the British television industry and Vanity Fair magazine. Further Reading
Halstead, Craig, and Chris Cadman. Michael Jackson: The Solo Years. Gamlingay, Cambridgeshire, England: Authors OnLine, 2003. Heavily focused on Jackson's productivity and contributions to popular music in the 1980's, at the peak of his solo career. Features comprehensive discussion of Jackson's output as a solo artist.
Jones, Bob, with Stacy Brown. Michael Jackson: The Man Behind the Mask. New York: Select, 2005. Inside look at Jackson's private life and motivations by the man who worked for thirty years as Jackson's publicist. Pulls no punches but eschews scandal-mongering.
Lewis (Jones), Jel D. Michael Jackson, the King of Pop: The Big Picture—The Music! The Man! The Legend! The Interviews! Phoenix: Amber, 2005. A richly detailed exploration of Jackson's career and lesser-known personal side, including transcripts of interviews with high-profile talk-show hosts, magazines and newspapers, and television programs.
Devon Boan
See also  African Americans; Music; Music videos; Pop music; Prince; Richie, Lionel; Rock and Roll Hall of Fame; USA for Africa.
■ Japan and North America
Definition  Diplomatic and economic relations of Japan with the United States and Canada
Throughout the 1980's, Japan was a strong political ally of the United States, helping confront communism in the last decade of the Cold War. Failure to resolve trade frictions soured this relationship by the late 1980's, however. Meanwhile, Japan and Canada enjoyed cordial relations, enriched by mutual interest in peacekeeping and development aid projects.
Indicative of the strength of its political alliance with the United States throughout the 1980's, Japan, like Canada, promptly joined the U.S.-led boycott of the 1980 Summer Olympics in Moscow. This boycott, in retaliation for the Soviet Union's December, 1979, invasion of Afghanistan, helped set the tone of Cold War politics early in the decade.
A Strong Alliance
In 1981, to strengthen Japan's alliance with the United States, the two countries signed a series of military agreements. Japan took on a larger share of responsibility for its own maritime defense, increased its support for U.S. troops stationed in Japan, and agreed to build up its self-defense forces. These agreements relieved the United States of a considerable burden. The period from 1982 to 1987, when Japan was governed by Prime Minister Yasuhiro Nakasone while Ronald Reagan continued as U.S. president, was marked by great U.S.-Japanese diplomatic harmony. Pundits spoke of the "Ron-Yasu" friendship, shortening the leaders' first names. Nakasone's visit to Washington, D.C., in April, 1987, was a highlight of this era. In addition to close coordination of U.S. and Japanese foreign policy toward the Soviet Union and Asian trouble spots, the nations conducted joint military exercises in Asia every year of the 1980's. In both 1980 and 1988, Japanese naval forces joined Pacific Rim exercises (RIMPAC) that included Canadian, Australian, and New Zealander forces in addition to those of the United States; the 1980 exercise marked the first time Japan had ever joined RIMPAC. U.S.-Japanese relations became less warm and personal at the end of the 1980's. The vanishing Soviet threat reduced the importance of the nations' military alliance. Japanese political turmoil, such as the Recruit scandal that forced Nakasone's 1987 successor, Noboru Takeshita, to resign on June 3, 1989, prevented continuation of a friendship between the new leaders of the two nations, and economic friction clouded the relationship.
Economic Friction  During the 1980's, the Japanese economy was robust and innovative, creating desirable goods for the American market at affordable prices. For example, in 1982 Sony introduced the first camcorder for professional use. Such innovation, combined with protective restrictions on the Japanese domestic market, quickly led to a huge U.S. trade deficit with Japan. According to U.S. calculations, the trade deficit, which had been a mere $380 million
in 1970, reached $10 billion in 1980, and climbed to $60 billion in 1987. The increasing trade deficit alarmed the United States. Japan agreed to voluntary export limits in the automotive industry in the 1980's. The 1985 Plaza Agreement was concluded among four leading Western nations, including the United States, as well as Japan. The agreement sought to rectify the trade imbalance by strengthening the Japanese yen and weakening the U.S. dollar. When this revaluation failed to achieve its desired result, the 1987 Louvre Agreement was signed among the same nations: Japan agreed to raise interest rates and open its markets, and the United States agreed to decrease its annual budget deficits. These measures worked to some extent, lowering the U.S. trade deficit with Japan to $38 billion by 1990. Japanese foreign direct investment in the United States soared throughout the 1980's, reaching $83 billion by 1990. Japanese companies invested in the U.S. commercial sector, particularly in manufacturing and wholesale and distribution networks. Japanese investments in U.S. real estate were worth $15 billion in 1988. Sony's 1989 purchase of Columbia Pictures caused some American cultural concern, as it crystallized a developing anxiety over foreign ownership of nominally American corporations. Economic friction crossed over into politics during the 1988 U.S. presidential campaign. Richard Gephardt won the Iowa caucuses and the South Dakota primary in part because of a campaign advertisement that discussed the high price of Chrysler's K-car in Japan, caused by protective tariffs. The Omnibus Trade and Competitiveness Act of 1988, moreover, led the United States to name Japan as an unfair trading partner in three areas in 1989. The U.S. Structural Impediments Initiative of 1989 launched talks between the two governments to avoid an acute crisis.
Japan and Canada  Throughout the 1980's, Japan and Canada enjoyed warm relations. Japan was Canada's second-largest trading partner, and the Japanese made substantial investments in Canada. The two countries found common ground in their mutual commitment to United Nations peacekeeping missions and as significant donors to developing nations. They also sat together in many multinational economic organizations. On November 6-7, 1989, in Canberra, Australia, Canada and Japan were among the founding nations of the new Asia Pacific Economic
Cooperation forum (APEC). Japan had vigorously supported Canadian, U.S., and Australian participation in the forum against initial Malaysian opposition. Cultural exchanges also flourished between Canada and Japan in the 1980's. The Canadian embassy in Tokyo was designed by Japanese Canadian architect Raymond Moriyama in the late 1980's and completed in 1991. Apologies for Internment
Further strengthening Japan’s relationship with North America was the 1988 decision by both the United States and Canada to issue a formal apology for the internment of Japanese Americans and Japanese Canadians during World War II. Victims of the internment and their families were given individual compensation of twenty thousand American dollars or twenty-one thousand Canadian dollars, as appropriate.
Impact
The firm U.S.-Japanese alliance added strength to America’s opposition to the Soviet Union. By 1989, with the Soviet Union’s new conciliatory attitude, some Americans wondered about redefining U.S.-Japanese diplomatic relations. Bitter trade disputes between the United States and Japan clouded bilateral relations by the late 1980’s, with both sides looking for a solution. Canada and Japan, however, continued to enjoy friendly relations enriched by their interest in each other’s culture and their common peaceful global commitments.
Further Reading
Cohen, Stephen D. An Ocean Apart: Explaining Three Decades of U.S.-Japanese Trade Friction. Westport, Conn.: Praeger, 1998. Comprehensive analysis of sources and of the mutual perception of issues affecting U.S.-Japanese relations in the 1980's, the middle of the three decades the book covers. LaFeber, Walter. The Clash: U.S.-Japanese Relations Throughout History. New York: W. W. Norton, 1997. The final four sections of Chapter 12 cover the history of Japan and the United States throughout the 1980's. Illustrated, notes, bibliography, index. Morris, Jonathan, ed. Japan and the Global Economy. Reprint. London: Routledge, 1997. Focuses on the foreign direct investments of Japanese companies during the 1980's, covering the U.S. manufacturing sector, the U.S. automotive industry, and Canadian businesses. Ota, Fumio. The US-Japan Alliance in the Twenty-First
Century: A View of the History and a Rationale for Its Survival. Honolulu: University of Hawaii Press, 2006. Puts the strengths and troubles of the nations’ relationship during the 1980’s into the context of its continuous historic development. Schultz, John, ed. Canada and Japan in the Twentieth Century. Toronto: Oxford University Press, 1991. The final three sections concern economic ties, political and social relations, and mutual cultural interests from the perspective of the late 1980’s; written by both Canadian and Japanese scholars. Illustrated, notes, index. R. C. Lutz See also Asian Americans; Business and the economy in Canada; Business and the economy in the United States; Camcorders; Chrysler Corporation federal rescue; Cold War; Elections in the United States, 1988; Foreign policy of Canada; Foreign policy of the United States; Globalization; Olympic boycotts.
■ Jazz Definition
American music genre
Jazz in the 1980’s gained the respect it deserved, moving into mainstream institutions. Jazz in the 1980’s was everywhere, and jazz fans were willing to travel across America and beyond its borders to support their favorite jazz musicians in clubs, concerts, and festivals. By the time the decade began, jazz had enjoyed a resurgence in both creativity and popularity. As a result, jazz found itself integrated into the larger American culture and supported by mainstream colleges and universities, major corporations, foundations, city and state governments, and even National Public Radio and television’s Public Broadcasting Service. Many jazz fans in the 1980’s flocked en masse to hear their favorite jazz musicians play every style from modern jazz to bebop in their favorite venue, small jazz clubs, where they were often joined by students, critics, and academic researchers. Jazz became the art music for listeners, and many fans thought the only appropriate place to carry out this serious form of listening was in small clubs. Because jazz is a fundamentally improvisational form, every performance is unique and no recording can be de-
Therefore, fans lined up for hours to attend live performances at clubs such as Jazz Showcase in Chicago; Blue Note and Village Vanguard in New York; Yoshi's Japanese Restaurant and World Class Jazz House in Oakland, California; Tipitina's in New Orleans; Major Chords in Columbus, Ohio; Rusty's Jazz Club in Cleveland, Ohio; and many other clubs located across the United States and Canada. New Young Musicians Arrive on the Scene In addition to established musicians, jazz aficionados encountered many new young artists who made the genre their own. These included Joshua Redman, Roy Hargrove, Michael and Randy Brecker, Courtney Pine, and Branford Marsalis. Many of these young players learned jazz by attending jazz studies programs in colleges and universities, and some had been influenced by their parents' jazz collections, which included music by Thelonious Monk, Charlie "Bird" Parker, John Coltrane, Duke Ellington, Louis Armstrong, Count Basie, and many other jazz giants. One of the favorite young, very successful trumpeters who entered the jazz scene in the 1980's was Wynton Marsalis, who began his career with Art Blakey and the Jazz Messengers and then formed his own modern, mainstream jazz quintet. In 1983, at age twenty-two, Marsalis won Down Beat magazine's award for best trumpeter. Throughout the 1980's, Marsalis won Grammy Awards for both jazz and classical music. He also became the cofounder and creative director of Jazz at Lincoln Center, and in 1997 he won the first Pulitzer Prize ever awarded to a jazz composer, for his oratorio Blood on the Fields. He represented to many the rebirth of jazz in the 1980's. Established Musicians Entertain Listeners
While they welcomed such new talents, jazz fans also remained loyal to their favorite established musicians— such as Ahmad Jamal, McCoy Tyner, Betty Carter, Diane Schuur, Max Roach, Billy Eckstine, Abdullah Ibrahim, Sonny Rollins, and many more—who regularly appeared at small clubs. Fans were willing to hang out at such clubs until the wee hours of the morning, listening to the last set of the night. They also flocked to the many festivals occurring in major cities across the nation, where jazz greats regularly performed. These greats included trumpeter Miles Davis, who, after a five-year absence from the jazz scene, returned in the 1980’s loaded with fresh new ideas. Davis was eager to mentor young new talent,
Selected 1980’s Jazz Albums Year
Title
Artist
1980
Ella Abraca Jobim
Ella Fitzgerald
Give Me the Night
George Benson
Kansas City Shout
Count Basie
Winelight
Grover Washington, Jr.
Walk on the Water
Gerry Mulligan
As Falls Wichita, So Falls Wichita Falls
Pat Metheny
Mecca for Moderns
The Manhattan Transfer
Warm Breeze
Count Basie
Breakin’ Away
Al Jarreau
Chick Corea, Herbie Hancock, Keith Jarrett, McCoy Tyner
Chick Corea, Herbie Hancock, Keith Jarrett, McCoy Tyner
Offramp
Pat Metheny
We Want Miles
Miles Davis
Travels
Pat Metheny
Gershwin Live!
Sarah Vaughan
An Evening with George Shearing and Mel Tormé
Mel Tormé and George Shearing
The Best Is Yet to Come
Ella Fitzgerald
All in Good Time
Rob McConnell and the Boss Brass
Looking Out
McCoy Tyner
Jarreau
Al Jarreau
Bop Doo-Wop; Bodies and Souls
The Manhattan Transfer
88 Basie Street
Count Basie
Top Drawer
Mel Tormé
Think of One
Wynton Marsalis
Nothin’ but the Blues
Joe Williams
Cleo Laine at Carnegie: The Tenth Anniversary Concert
Cleo Laine
The Cotton Club: Original Motion Picture Soundtrack
Bob Wilber and John Barry
1981
1982
1983
1984
1985
Cityscape
Michael Brecker
The Voice
Bobby McFerrin
First Circle
Pat Metheny Group
Hot House Flowers
Wynton Marsalis
New York Scene
Art Blakey and the Jazz Messengers
Straight to the Heart
David Sanborn
You’re Under Arrest
Miles Davis
Live from the Village Vanguard
Phil Woods Quartet
Vocalese
The Manhattan Transfer
Black Codes from the Underground; J Mood
Wynton Marsalis
Round Midnight; The Other Side of “Round Midnight”
Dexter Gordon
Water from an Ancient Well
Abdullah Ibrahim
The Eighties in America
Jazz
■
Year
Title
Artist
1986
Elektric Band
Chick Corea’s Elektric Band
Live from the Village Vanguard
Phil Woods Quartet
Double Vision
David Sanborn and Bob James
Timeless
Diane Schuur
Tutu
Miles Davis
The Tonight Show Band with Doc Severinsen
The Tonight Show Band with Doc Severinsen
Marsalis Standard Time, Volume I
Wynton Marsalis
Bud and Bird
Gil Evans and the Monday Night Orchestra
1987
1988
1989
Royal Garden Blues; Renaissance
Branford Marsalis
Brasil
The Manhattan Transfer
Still Life (Talking)
Pat Metheny Group
Diane Schuur and the Count Basie Orchestra
Diane Schuur and the Count Basie Orchestra
Digital Duke
Mercer Ellington
Blues for Coltrane: A Tribute to John Coltrane
Cecil McBee, David Murray, McCoy Tyner, Pharoah Sanders, and Roy Haynes
Destiny’s Song + The Image of Pursuance
Courtney Pine
Talkin’ ’bout You
Diane Schuur
Politics
Yellowjackets
Look What I Got!
Betty Carter
Simple Pleasures
Bobby McFerrin
Don’t Try This at Home
Michael Brecker
Live at Sweet Basil
Randy Brecker
Letter from Home
Pat Metheny Group
Akoustic Band
Chick Corea Akoustic Band
Aura
Miles Davis
Diamond in the Rough
Roy Hargrove
which he often did onstage during performances. Other jazz legends who continued to perform live during the 1980's included bassist Ron Carter, Chick Corea and his Elektric Band, vocalist Abbey Lincoln, and Latin-influenced Brazilian artist Flora Purim. Jazz music as different as that of Sun Ra, Dewey Redman, Stanley Turrentine, Benny Carter, Maynard Ferguson, and Grover Washington thrilled audiences attending concerts, while master drummer Max Roach and his double quartet appeared at Yoshi's in Oakland, filling a club too small for the sound with exotic drumming from every direction. The sounds of electric jazz, bebop,
modern jazz, hard bop, cool jazz, free jazz, cross-cultural explosions of Latin, Brazilian, Cuban, Caribbean, and African rhythms, crossover jazz with rock beats, fusion, and rhythm and blues could be heard all across America. It was truly an exciting decade for those who appreciated the experienced artists of jazz. Impact Jazz flourished in the 1980's. Hard-core fans found a plethora of live performances in venues of all sizes, while mainstream music consumers purchased jazz music in greater quantities. As it gained mainstream recognition, jazz appeared more frequently
at mainstream institutions, and jazz musicians were recognized as important contributors to and participants in the nation's cultural heritage. Further Reading
Davis, Miles, with Quincy Troupe. Miles: The Autobiography. New York: Simon & Schuster, 1989. Details the evolution of jazz itself during the 1980's, in addition to the evolution of Davis's own musical style and career. Ward, Geoffrey C., and Ken Burns. Jazz: A History of America's Music. New York: Alfred A. Knopf, 2000. Includes more than five hundred photographs along with the stories of men and women who contributed to jazz from its beginnings through the end of the twentieth century. Garlena A. Bauer See also Academy Awards; African Americans; Art movements; Compact discs (CDs); Music; Pop music; Racial discrimination; Television.
■ Jennings, Peter Identification Canadian American journalist Born July 29, 1938; Toronto, Canada Died August 7, 2005; New York, New York
Canadian-born Peter Jennings connected with the American people and became one of the country’s most trusted news reporters, anchoring ABC’s World News Tonight. As a Canadian citizen, Jennings enjoyed a useful outsider’s view of the United States.
Peter Jennings examines his Canadian Club Arts and Letters Award, awarded to him at the Canadian Consulate General in New York City in 1984. (AP/Wide World Photos)
In 1983, Peter Jennings took his seat as anchor for the American Broadcasting Company (ABC) network news, a position he would hold for the next twenty-two years. Ambitious and hardworking, Jennings had transformed himself from a high school dropout to a suave, self-educated man of the world. Through his reliable, sincere reporting, he developed a personal relationship with the American people. Jennings helped ABC become America's most watched news network. Jennings first appeared on the U.S. news scene in 1964 as a callow, twenty-six-year-old correspondent. His first experience as a sole news anchor for the network, hosting Peter Jennings with the News (1965-1967), was a colossal failure, and he was reluctant ever to try the job again. Instead, Jennings traveled
overseas and gained experience as a foreign correspondent. Between 1978 and 1983, he was one of three co-anchors for World News Tonight, serving as the program's foreign anchor. Appearing in his signature trench coat, Jennings would often report from the streets of London. He also traveled to the world's hot spots, often putting his own life in danger. In 1982, he journeyed to Beirut to cover the Israeli invasion of Lebanon. In 1983, Jennings returned to the United States as a temporary replacement for ailing anchor Frank Reynolds. Upon Reynolds's unexpected death, ABC offered Jennings the job as sole anchor and senior editor. The show was officially renamed World News Tonight
with Peter Jennings. During the 1980's, Jennings covered the 1984 and 1988 presidential elections, the 1985 hijacking of Trans World Airlines (TWA) flight 847, and the 1986 Space Shuttle Challenger disaster. Tall and handsome with soulful brown eyes, Jennings soon became a news celebrity. Jennings projected a sense of trust that transferred through the television camera and into viewers' homes. Throughout the 1980's, Jennings helped viewers make sense of the world's chaos. His mellow, staccato voice, interlaced with Canadian pronunciations, brought a comforting familiarity to the nightly news. Impact Peter Jennings delivered the news with a cool detachment that set him apart from other anchors. Viewers trusted him to report truthfully, accurately, and fairly. Jennings's believability helped make World News Tonight with Peter Jennings the most watched nightly network news broadcast. Further Reading
Goldberg, Robert, and Gerald J. Goldberg. Anchors: Brokaw, Jennings, Rather, and the Evening News. New York: Carol, 1990. Jennings, Peter, and Todd Brewster. The Century. New York: Doubleday, 1998. _______. In Search of America. New York: Disney Press, 2002. Rhonda L. Smith See also
Beirut bombings; Challenger disaster; Elections in the United States, 1984; Elections in the United States, 1988; Journalism; Network anchors; Television.
■ Jewish Americans Identification
Americans of Jewish descent from many countries of origin
For many Jewish Americans, the 1980's was a period of increasing commitment to Jewish life in its diverse forms, including Jewish contributions to the larger American culture. The very diversity of those forms sparked controversies, however, particularly over how to accommodate such wider American concerns as equality for women, as well as how best to promote human rights in Israel and the Soviet Union. Two major shifts in non-Orthodox Jewish religious life occurred in 1983. First, the Conservative movement,
a century-old American movement within Judaism that charted a middle way between the more liberal Reform movement and traditional Orthodoxy, decided to permit women to serve as rabbis. The movement began to accept women as rabbinical students, and the first Conservative woman rabbi, Amy Eilberg, was ordained in 1985. (The Reform movement had ordained its first woman rabbi in 1972.) In the 1980's, the number of "egalitarian" Conservative synagogues, which accorded women the same religious rights and responsibilities as men, increased dramatically. The Reform movement, meanwhile, announced that it would accept as Jews the children born of Jewish fathers and non-Jewish mothers, if these children were raised as Jews. In traditional Jewish law, only birth to a Jewish mother or conversion rendered a child Jewish, but as intermarriage between American Jews and non-Jews steadily increased, Reform leaders felt that a public ruling on this issue was needed. They maintained that Jewish fathers, as well as mothers, should be able to raise their children as Jews. Orthodox and some Conservative leaders condemned this decision. Jewish Americans and Soviet Jewry
In 1980, the Soviet Union began to cut the number of Jews it allowed to leave the country. At the same time, these emigrants increasingly elected to settle in the United States rather than Israel. The United States granted Soviet Jews refugee status, and Jewish American relief agencies oversaw their resettlement and integration into American life. The Soviet Union became increasingly reluctant to allow Jews to emigrate, however. The Jewish American community advocated strongly for increased freedom of movement for Soviet Jews, and this became an important negotiating issue between the United States and the Soviet Union. In the early to mid-1980's, the issue of Soviet Jewry was one unifying issue for the Jewish American community. Then, in the late 1980's, the Soviet Union began to collapse, and with its dissolution, the barriers to Jewish emigration came down. Between 1986 and 1990, 300,000 Soviet Jews emigrated, almost one-third of them to the United States. The Israeli government, which desired these immigrants to augment its Jewish population, advocated for the United States and the Jewish American community to direct more Soviet emigrants to Israel. At the same time,
the political changes in the Soviet Union prompted the United States to phase out refugee status for Soviet Jews, reducing the numbers it would accept in the 1990's. Jewish Americans and Israel The survival of Israel remained a central concern for Jewish American leaders throughout the 1980's, but Israel's relationship with Jewish Americans grew more complicated. Israel drew criticism from around the world for its invasion of Lebanon in 1982 and for alleged human rights abuses during the Palestinian Intifada uprising that began in 1987. Some members of the Jewish American community added their voices to this criticism, as did many Israeli Jews. Both in Israel and in the United States, Jews on the left sympathized with the plight of Palestinians and argued that Israel's actions compromised its Jewish values, while Jews on the right defended those actions as necessary for Israel's self-defense and to enforce order within its borders. A serious political division began to assert itself within the Jewish American community, as well as between that community and the Israeli government. The conflict between the Jewish American community and Israel's government came to a head over an amendment proposed in 1985 to Israel's Law of Return. The amendment would have stipulated that Israel would accept converts to Judaism as immigrants only if they converted under Orthodox auspices. At the heart of the debate over this measure was the issue of who is and is not a Jew. The Jewish American community, which was roughly 90 percent non-Orthodox, protested the amendment. Demographic Concerns During the 1980's, the Jewish American population appeared to remain stable at just under 6 million, or roughly 2.5 percent of the U.S. population. Two demographic trends generated significant concern among Jewish American leaders: rising rates of intermarriage between Jews and non-Jews and declining fertility rates among Jewish families. Taken together, these trends prompted concern for the community's long-term survival. Intermarriage caused concern, among other reasons, because few children from "mixed" marriages appeared to identify as Jews. The general trend was clear: The community was aging and was poised to grow smaller. Jewish Americans and American Culture
Whatever uncertainties surrounded the Jewish American community's
munity’s demographic future, Jews made significant contributions to American culture and to Jewish culture in America during the 1980’s. In 1986, author and Holocaust survivor Elie Wiesel received the Nobel Peace Prize. Realizing that the firsthand witnesses to the Holocaust were aging, Jewish Americans launched new efforts to promote Holocaust education and remembrance among succeeding generations. Jewish studies programs proliferated on American college and university campuses, and they included not only courses in the Jewish religion but also in Holocaust studies, Yiddish studies, and contemporary humanities and social sciences. The National Yiddish Book Center, founded in 1980 in Amherst, Massachusetts, soon became the fastestgrowing Jewish American cultural organization. On stage and screen, such artists as Neil Simon, Harvey Fierstein, Woody Allen, Wendy Wasserstein, and others created successful works about Jewish themes that won wide popularity among the general American population as well. Impact During the 1980’s, the Jewish American community experienced greater religious equality for women and greater religious polarization, absorbed thousands of immigrants from the Soviet Union, and continued its commitment to Israel as the Jewish state struggled with new external and internal challenges. Jews continued to make major contributions to American culture. In the 1980’s, many did so using openly Jewish themes and subject matter. At the same time, the freedom and acceptance that Jewish Americans experienced, unprecedented in the long history of the Jewish Diaspora, challenged the community’s boundaries and sense of identity. By 1989, Jewish Americans faced uncertainty regarding their unity as a religious community, and many leaders voiced concern that their demographic decline threatened the community’s long-term survival. Further Reading
American Jewish Year Book. Vols. 81-90. New York: American Jewish Committee, 1980-1989. Sachar, Howard M. A History of the Jews in America. New York: Knopf, 1992. Sarna, Jonathan. American Judaism: A History. New Haven, Conn.: Yale University Press, 2004. Whitfield, Stephen J. In Search of American Jewish Culture. Hanover, N.H.: Brandeis University Press/ University Press of New England, 1999. Ben Furnish
The Eighties in America See also Cold War; Feminism; Film in the United States; Heidi Chronicles, The; Immigration to the United States; Israel and the United States; Middle East and North America; Nobel Prizes; Richler, Mordecai; Soviet Union and North America; Spielberg, Steven; Theater; Torch Song Trilogy; Women’s rights.
■ Johnson, Magic Identification
Hall-of-Fame professional basketball player Born August 14, 1959; Lansing, Michigan As a point guard for the Los Angeles Lakers, Magic Johnson became recognized as one of history's greatest professional basketball players. After leading Michigan State University to the National Collegiate Athletic Association (NCAA) basketball title in 1979, Magic Johnson signed to play for the Los Angeles Lakers. Johnson formed a formidable tandem with All-Pro center Kareem Abdul-Jabbar. During the 1979-1980 season, Johnson helped lead the Lakers to a 60-22 record and was selected as a member of the National Basketball Association (NBA) All-Rookie Team. He led the Lakers to the NBA Championship and won the NBA Finals Most Valuable Player (MVP) Award. During the 1981-1982 NBA season, Johnson was again instrumental in leading the Lakers to the championship, and he won the NBA Finals MVP Award for a second time. In the 1982-1983 season, Johnson was named a member of the All-NBA Team. Although he again helped lead the Lakers to the NBA Finals, they were swept by the Philadelphia Seventy-Sixers. In the 1983-1984 season, Johnson and his Lakers again played for the NBA Championship, but they were defeated by the Boston Celtics, who were led by Johnson's longtime rival Larry Bird. Powered by Johnson and Abdul-Jabbar, the Lakers again played Bird and the Celtics for the NBA Championship in 1985; this time, they won the series four games to two. For his play during the 1986-1987 season, Johnson was named the NBA MVP. He again led the Lakers to the NBA Championship and won the NBA Finals MVP Award for a third time. Johnson led the Lakers to the championship once more in 1988, the fifth time the team captured that coveted title with Johnson as its point guard. During his
NBA career from 1979 into the 1990's, Johnson averaged 19.5 points per game, as well as achieving a .520 field-goal percentage and a .848 free-throw percentage. He had 10,141 career assists, 6,559 rebounds, and shot .303 from beyond the three-point line. He was named an NBA All-Star twelve times and a member of the All-NBA Team nine times, and he earned three NBA MVP awards and three NBA Finals MVP awards. Impact Along with Larry Bird, Magic Johnson was instrumental in rejuvenating fan interest in the NBA. The rivalry between the two players, which began with the 1979 NCAA Basketball Championship, generated new excitement in the NBA. Johnson was one of only four players to win NCAA and NBA championships in consecutive years. Known as "Showtime," Johnson possessed a wide variety of shots and superior passing and dribbling abilities.
Los Angeles Laker Magic Johnson leaps toward the basket, beginning a layup before Golden State Warrior Clifford Ray can react on March 30, 1980. (AP/Wide World Photos)
Further Reading
Johnson, Earvin. My Life. New York: Random House, 1996. Kramer, Sydelle A. Basketball’s Greatest Players. New York: Random House, 1997. Troupe, Quincy. Take It to the Hoop, Magic Johnson. New York: Hyperion Books, 2000. Alvin K. Benson See also
Basketball; Bird, Larry; Sports.
■ Journalism Definition
Distribution of significant information and news via print and electronic media
The best of professional journalism in the United States lasted from the 1950's to the early 1980's. During the 1980's, however, journalism came under sustained attack from corporate and commercial pressures. Journalism across print and the electronic media was restructured, and problems of inaccuracy, misrepresentation, and intrusion into privacy arose. The gulf between popular and serious journalism in the press became more pronounced in the decade. In broadcasting, there was a tendency toward redefining journalism as personalized "infotainment" programming, and the tension between commerce and ethics was a powerful cause of disruption within journalism of the time. The gap between citizens and government grew, as did long-standing national problems such as crime, a lagging educational system, and environmental degradation. Discussions about solutions to these problems often turned into accusations across an ideological gap constructed by politicians and perpetuated by journalists. Consequently, in the 1980's, a majority of Americans expressed the opinion for the first time that their children would inherit a society less livable, more dangerous, and presenting less opportunity than they had. It was noted that, for the first time since 1924, only half of the nation's voters turned out for the presidential election in 1988. A decline in journalism and a decline in public life happened in the same decade because they are profoundly interrelated in modern society, as journalism provides the timely information and perspective that public life requires. The professional autonomy of U.S. journalism
changed radically in the 1980's, when the relaxation of federal ownership regulations and the proliferation of new technologies made larger media conglomerates economically advantageous. Fewer and fewer firms owned and dominated the major film studios, television networks, music companies, and other media outlets. As most of the traditional news media became segments of commercial empires, owners looked to the news divisions to provide the same financial return as generated by film, music, or entertainment segments. As a result, news bureaus were closed, reporters were laid off, more free press releases were used as news, and advertisers and corporate entities gained increased influence on the press. Another problem that arose during this period was the growing influence of the public relations industry. By providing superficial press releases, enterprising public relations agents used "experts" to shape the news in order to enhance the influence of their corporate clientele. Public relations releases were favored by media owners, as they provided fillers at no cost. More subtle but just as important was the unseen influence of values furthering the commercial aims of owners and advertisers. Thus, stories about royal families and celebrities came to be seen as legitimate news stories, while the government received stricter scrutiny than big business. Class Bias in Journalism
Earlier in the century, most daily newspapers of large circulation had at least one and sometimes several reporters to cover labor. By the end of the 1980's, the number of reporters covering the labor movement had declined by more than half, and coverage of working-class economics was markedly declining. In contrast, business and financial news grew over the decade, as news was increasingly aimed at the richest segment of the population. The largest media firms were built on profits generated by government gifts of monopoly rights to the valuable broadcast spectrum or monopoly franchises. The ups and downs of Wall Street, information about profitable investments, and the benefits of wealth were presented as of interest to the general reader or viewer. Some journalists began to rely on releases from business-oriented think tanks and institutes for their economic stories. Increased news media coverage of the affluent segments of the population supported this class bias in the selection of stories. Throughout the
1980's, real income declined or stayed the same for lower-income individuals, while the wealthy grew rapidly richer. The bottom 60 percent owned only a minuscule share of total wealth and was heavily burdened with a high level of personal debt. Studies showed that in those rare cases when poor people were covered in the news, the coverage reinforced racial stereotypes. There has always been ample coverage of crime, but during the 1980's this coverage was increasingly used as graphic and inexpensive filler, which often promoted popular paranoia regarding crime waves and prodded politicians to increase "tough talk" on crime. Public Journalism The rise of a movement called "public journalism," alternatively called "civic journalism," began in the 1980's in the United States. It tried to redefine traditional journalistic values, questioning the worth of vaunted objectivity and the adequacy of existing ethical guidelines. Even more, the movement promoted the involvement of journalists as participants in their communities and urged that their reporting reflect the composition of the society around them. The 1988 presidential elections served as a catalyst for the initiation of public journalism. Many media people were concerned about the relationship between journalists and political candidates that had developed throughout the decade. There was a trend for candidates to depend more on media consultants, not only during campaigns but also between campaigns. Both candidates and their campaign managers became obsessed with controlling their message during the 1980's. The rise of public journalism was a reaction to journalism's failure to foster honest reporting on important people in public life and an expression of the idea that journalism should be a positive force in the revitalization of public life. Its proponents argued that many cultural changes in conventional reporting had to take place before journalism could contribute to community development. For the news media, the problem was how to address these challenges to create an environment that would lead to a better understanding of community issues by citizens. Public journalism encouraged citizen participation in public life by providing news that would help citizens make enlightened decisions in a democratic, self-governing society. At its core was the assumption that journalism has an obligation to public life beyond simply relating news or presenting facts. Following from this premise, civic journalism
tried to make a newspaper a forum for the discussion of public issues, to focus on events and issues important to ordinary people, and to help people function as political activists. Impact Although the movement for public, or civic, journalism started in the 1980's, its most dramatic growth occurred in the 1990's. New organizations sprang up to promote the movement's ideas, such as the Pew Center for Civic Journalism. The Kettering Foundation and the Public Journalism Network tried to spread civic journalism across the country. By the end of the 1990's, proponents of public journalism were enthusiastic, but critics saw an erosion of some principles of traditional reporting. As a result, the new movement spurred great controversy across the media and the public. Yet by 2002, at least one-fifth of U.S. daily newspapers practiced some form of public journalism. Newspaper editors asserted that their public journalism increased public deliberation, civic problem solving, and volunteerism and had changed public policy. A key finding of the Pew Center's study Measuring Civic Journalism's Progress was that 96 percent of public journalism projects had used an explanatory story frame to cover public issues instead of a more traditional conflict frame, which usually reports two conflicting viewpoints. Further Reading
Black, Jay, ed. The Public/Civic/Communitarian Journalism Debate. Mahwah, N.J.: Lawrence Erlbaum Associates, 1997. A collection of articles concerning one of the central ethical issues of journalism: To what extent is the journalist an isolated individualist, and to what extent is he or she a committed member of the wider community? Corrigan, Don. The Public Journalism Movement in America: Evangelists in the Newsroom. Westport, Conn.: Praeger, 1999. Offers a thorough and devastating critique of public journalism by showing that its advocates have failed to diagnose what really ails American journalism and that their prescriptions for saving journalism are more likely to harm than to help the profession. The author introduces data from an extensive survey of newspaper editors and academics, as well as a comprehensive lexicon of public journalism. McChesney, Robert W. Rich Media, Poor Democracy: Communication Politics in Dubious Times. New York: New Press, 2000. Discusses the roots of corporate
influence in journalism and how it affects the workings of democracy. McPherson, James Brian. Journalism at the End of the American Century, 1965 to the Present. Westport, Conn.: Praeger, 2006. McPherson deals with both the best and the worst aspects of American journalism since 1965. The book emphasizes that traditional journalistic values have diminished in importance, not just for those who control the media but also for media consumers who need good journalism. Merritt, W. Davis "Buzz," Jr. Public Journalism and Public Life: Why Telling the News Is Not Enough. 2d ed. Mahwah, N.J.: Lawrence Erlbaum Associates, 1998. This edition develops the philosophy of public journalism, responds to articles against it, and explains the importance of public deliberation and the role of certain values in public journalism. Perry, David K. The Roots of Civic Journalism: Darwin, Dewey, and Mead. Lanham, Md.: University Press of America, 2003. Discusses the philosophical roots of the civic journalism movement, focusing on the ideas of Charles Darwin, John Dewey, and George H. Mead. Rosen, Jay. What Are Journalists For? New Haven, Conn.: Yale University Press, 2001. Although several powerful news organizations such as The New York Times have criticized public journalism for abandoning the traditional goal of objective reporting, Rosen, one of the founders of the public journalism movement, believes that the movement may help newspapers in a time of decreasing readership as well as advance the common good. Sheila Golburgh Johnson See also Advertising; Brokaw, Tom; Business and the economy in the United States; Cable television; CNN; Craft, Christine; Jennings, Peter; Liberalism in U.S. politics; Network anchors; Pauley, Jane; Rather, Dan; Rivera, Geraldo; Tabloid television; Television.
■ Journey Identification American mainstream rock band Date Formed in 1973
Despite being labeled a faceless rock group, Journey achieved success by cultivating a loyal fan base devoted to its brand of mainstream hard rock.
By late 1980, Journey was performing with only two of its original members, lead guitarist Neal Schon and bassist Ross Valory. Earlier departures of several band members and the addition of new personnel had resulted in a shift away from their early jazz-rock style, which had emphasized solo improvisation. Instead, the group developed a more pop-oriented, ensemble sound. This change of style included the introduction of romantic ballads featuring a powerful new lead singer, Steve Perry. Journey's 1980 album, Departure, was aptly titled, because it marked a turning point in the band's commercial success, ranking number eight on the album charts. "Any Way You Want It" became a Top 25 single, and the group established a legion of devoted fans. Jonathan Cain, an accomplished songwriter, joined Journey as the new keyboard player in 1980. His addition proved to be a catalyst for added commercial success. Escape (1981) climbed to a number-one ranking, and it spawned three Top 10 U.S. singles, "Who's Crying Now," "Don't Stop Believin'," and "Open Arms." The following album, Frontiers (1983), also demonstrated mass appeal, becoming entrenched at number two on the Billboard 200 chart for nine weeks. The newfound success of Journey was directly attributable to Cain's ability to write songs with meaningful, hopeful messages and vocalist Perry's unique tenor voice, which conveyed these messages. Critics were less enthusiastic about Journey than were their fans. The band was categorized as "corporate rock" or described as a "faceless band," along with such similar groups as Boston, Foreigner, Survivor, and others. Band members did not promote their individual personalities as did the decade's superstars; as a result, they went largely unnoticed, except by loyal fans. Despite the criticism, Journey continued to perform and record a number of film sound tracks during the decade. In 1982, Journey recorded two tracks for the Disney film Tron. That year also saw the release of Journey Escape, a home video game featuring the band, for the Atari 2600 console; it was followed in 1983 by the arcade game Journey, and the two were the first video games honoring a rock band. In 1984, Steve Perry released a solo recording titled Street Talk. Creative differences caused longtime members Valory (bass) and Steve Smith (drums) to depart, and they were replaced by studio musicians for the 1986 album Raised on Radio. Only three Journey members remained at this point: Perry, Schon, and
Journey: From left, Jonathan Cain, Neal Schon, Steve Smith, Steve Perry, and Ross Valory. (Paul Natkin)
Cain. Following an exhaustive tour, Perry left. Journey dissolved in 1987, and Schon and Cain moved on to other projects. A farewell recording, Greatest Hits, was released in 1988. The apparent demise of Journey proved to be merely a hiatus, however. Reunited in 1996, Journey began performing again. Impact Rather than showcase individual stars, Journey relied on ensemble versatility, performing music across a wide stylistic spectrum, from romantic ballads to heavy metal. Critics berated the band for its resulting lack of a distinctive sound, but that versatility also fueled its mainstream success.
Romanowski, Patricia, and Holly George-Warren. The New Rolling Stone Encyclopedia of Rock and Roll. Rev. ed. New York: Fireside, 1995. Stuessy, Joe, and Scott Lipscomb. Rock and Roll: Its History and Stylistic Development. 5th ed. Upper Saddle River, N.J.: Pearson Prentice Hall, 2006.
Ward, Ed, and Geoffrey Stokes. Rock of Ages: The Rolling Stone History of Rock & Roll. New York: Rolling Stone Press, 1986. Douglas D. Skinner See also
Heavy metal; Music; Music videos; Pop
music.
■ Joy Luck Club, The Identification Best-selling debut novel Author Amy Tan (b. 1952) Date Published in 1989
The unexpected popularity of Chinese American Amy Tan’s debut novel indicated that American readers of the late 1980’s longed for fresh literary voices, and it continued the trend toward multiculturalism in the literary scene of the decade.
The runaway success of Amy Tan's debut novel, The Joy Luck Club, was the literary surprise of 1989. Featuring four Chinese mothers and their four Chinese American daughters, Tan's work explored women's hardships in traditional Chinese society, the struggles between first-generation immigrant mothers and their often rebellious second-generation daughters, and the daughters' attempts to come to terms with love, career, and personal issues in 1980's America. Originally, Tan conceived of her work as a series of connected short stories that would be tied together by the Chinese game of Mahjong, which the four mothers play as part of the regular activities of their Joy Luck Club. However, once it was completed, the book was marketed and recognized critically as a novel, albeit one with an episodic structure. In the novel, the idea for the club of the title originated with Suyuan Woo in China during World War II. Suyuan brought her idea to America when she started a new life in San Francisco. At the beginning
ning of Tan’s work, Suyuan has died recently, and her daughter June (Jing-mei) Woo is asked to take the place of her dead mother at the Mahjong table. This event launches a series of stories centering on the experiences of the eight central characters. Each character speaks for herself, except Suyuan, whose life unfolds through June’s memories. Impact In 1989, American readers enthusiastically received The Joy Luck Club, propelling it onto the best seller list of The New York Times, where it stayed for over forty weeks. Readers were fascinated by the mothers’ tales of China, where a young woman’s life was seriously undervalued and placed at risk of harm from callous husbands or devious mother-in-laws. They were equally fascinated by the stories of the second generation and the difficult mother-daughter relationships at the novel’s heart. The rebellion of Waverly Jong, June’s childhood rival, against her overbearing mother struck readers, as did the hilarious scene of Rose Hsu Jordan introducing her Caucasian fiancé Ted to her family. Indicative of its literary impact, The Joy Luck Club received a nomination for the 1989 National Book Award and the National Book Critics Award; it won the Commonwealth Gold Award and the Bay Area Book Reviewers Award of that year. The success of The Joy Luck Club launched Amy Tan on a protracted successful literary career and also further opened the market for Asian American authors and themes in fiction. Further Reading
Adams, Bella. Amy Tan. Manchester, England: Manchester University Press, 2005. Bloom, Harold, ed. Amy Tan. Philadelphia: Chelsea House, 2000. Huntley, E. D. Amy Tan: A Critical Companion. Westport, Conn.: Greenwood Press, 1998. R. C. Lutz See also Asian Americans; Book publishing; Hwang, David Henry; Immigration to the United States; Literature in the United States.
Amy Tan. (Robert Foothorap)
■ Junk bonds Definition
High-risk, high-yield securities
During the 1980’s, junk bonds helped fuel the frenzy of corporate mergers and acquisitions by producing the enormous amounts of capital required for companies to purchase other companies. Bonds are financial instruments issued by public and private entities that promise to repay the principal invested plus interest on or after a specified date. Brokers rate bonds based upon their trustworthiness, that is, the likelihood of investors actually receiving the promised return-on-investment. Bonds with assured returns are rated “A,” while riskier ones are rated “B” or even “BB.” Low-rated bonds may be issued by start-up or expanding companies, which will be unable to repay their debts unless their business ventures are successful. To convince investors to take higher risks on their bonds, such companies offer greater rates of return than those available to investors in A bonds. As the U.S. economy improved during the 1980’s, the public looked for lucrative investment opportunities. Stockbrokers sold B bonds directly to individual investors wishing to play the market. By 1984, “junk bonds,” as B-rated offerings were labeled, were offering double-digit rates of return. Start-up companies sold junk bonds to fund their initial operations. Large firms like Dean Witter and Paine Webber began selling them, and Drexel Burnham Lambert had been brokering junk bonds since the 1970’s. By the 1980’s, Drexel was the leader in junk bonds, which had generated significant wealth for the company. Even Savings and Loans (S&Ls) bought junk bonds as investments, because federal law could be interpreted to mean their speculation was insured against loss. New tricks were improvised: Junk bonds financed hostile takeovers over one company by another, sometimes termed a “corporate raider.” Raiders sold their own junk bonds to raise the money necessary to buy controlling shares in their target companies. Once in control, raiders divided acquired companies and sold off their various components, paying back bond holders while netting a large profit for themselves. This practice put both management and regular employees of the scavenged companies out of work. A variation on this approach was called “greenmail.” The greenmailer bought controlling
Junk bonds
■
557
interest in a company but offered to sell it back at more than its market value. If this arrangement was agreed to, everyone kept their jobs while stockholder dividends plummeted. T. Boone Pickens and Saul Steinberg were the most famous raiders of the 1980’s. Some considered them heroes for making companies more efficient to avoid hostile takeovers; others saw them as villains for tearing businesses apart and increasing Wall Street’s debt. One anti-takeover maneuver employed by potential targets was to “swallow a poison pill”: That is, vulnerable companies would issue junk bonds paying great dividends if the company was broken up. Raiders would be required to pay those dividends out of their own pockets, preventing them from realizing profits and making the companies unattractive acquisition targets. Another strategy was to find a “white knight,” a company that would agree to a friendly merger with the target company, thereby forming a new company that was too large to raid. In 1986, financer Ivan Boesky was accused by the Security and Exchange Commission of getting insider information about unannounced takeovers. He wore a government wire to incriminate others— especially Drexel executives—who were funneling him information and secretly buying stock for him. Boesky served two years in prison. Michel Milken, the Drexel junk bond executive, also served two years for dishonest dealings. Both left prison wealthy men, Milken a billionaire. Impact By 1989, the junk bond flurry was over. Congress forced S&Ls to divest themselves of junk bonds. So many were on the market, there were no buyers. Junk bonds made billions of dollars for entrepreneurs, funded some start-up companies but destroyed others, and induced companies to merge and grow larger. Further Reading
Bruck, Connie. The Predator’s Ball. New York: Penguin Books, 1989. Stewart, James B. Den of Thieves. New York: Simon & Schuster, 1992. James Pauff See also
Black Monday stock market crash; Business and the economy in the United States; Wall Street.
558
■
Just Say No campaign
■ Just Say No campaign Identification
A slogan of the 1980’s war on drugs popularized by First Lady Nancy Reagan
This slogan typifies the Reagan-era zero tolerance approach to curbing illegal drug use. It soon became a punch line to jokes and came to signify the impotence of the government in dealing with illicit narcotics. It is common for the U.S. government to declare war on society’s great ailments, such as in the War on Poverty, the War on Hunger, and the War on Terror. For its heavy reliance on the mass media and its length on the public agenda, no government “war” was as prominent in the 1980’s as the War on Drugs. Every war has its slogans, whether it is a message carried by the president himself—such as Lyndon Johnson’s March, 1964, speech declaring a war on poverty or George W. Bush’s 2002 State of the Union address defining the “axis of evil”—or a message carried by an executive staff member or spokesperson. In the case of the War on Drugs, the effort’s message was pushed most heavily by First Lady Nancy Reagan and most famously took the form of the Just Say No campaign. The War on Drugs
Two major camps of scholars exist involving the use of illegal and illicit drugs. The first camp believes that the flow of illegal drugs can never be completely stopped, or even slowed enough to justify devoting significant resources to it. The second camp, in marked contrast, advocates expending finite government manpower and money on slowing illegal drug trafficking. The War on Drugs and its Just Say No campaign presupposed that the drug war was a war worth fighting—in other words, that it was winnable. Though federal action to combat drugs predates the 1980’s, it was not until the 1980 election of conservative Ronald Reagan that the movement really gained a head of steam. Countering drug use can take one of two forms: efforts to stop drug supply or efforts to stop drug demand. Supply reduction takes the form of contraband interdiction and border security. Some strategies to slow drug consumption exclusively focus on things such as the breaking up of drug cartels and the destruction of outlaw crops. Demand reduction is trickier, taking the form of education initiatives and public awareness campaigns. The Reagans, both Ronald and Nancy, popular-
The Eighties in America
ized their commitment to zero tolerance for illegal drug use with the Just Say No campaign. The idea behind the campaign was the recognition that both addictive and nonaddictive drug use is a willful act— an argument not without controversy—that can be curbed through a proper education program. The reasoning was that if the government employed popular figures such as singers or actors to address the mass public on the dangers of drugs, then people would choose to not use them. To target users and potential users alike, young actors such as Drew Barrymore (who later wrote a book on her childhood drug problems) championed the benefits of a drugfree lifestyle to youths. Some television shows picked up the president and First Lady’s antidrug torch by using Just Say No as an episode theme, and a popular song was even recorded by one of the lesserknown siblings of Michael Jackson. The Demise of a Slogan Nearly as soon as the Office of the First Lady introduced the Just Say No slogan, the initiative’s rhetoric received criticism as being too simplistic. For example, the slogan’s impact likely is reduced as a child ages. A one-size-fits-all approach to drug education, like a single-sized article of clothing, is more often seen as a one-size-fits-none approach. Beyond the issue of the target audience’s age, there was the question of the illegal substance to which “no” was being said. Was it regarding marijuana, the drug touted as the gateway drug to harder and more dangerous drugs, or was the slogan indeed telling its audience to shun crack cocaine and heroin? Whatever the case, the slogan was criticized as inadequate in addressing the abuse of legal substances. Just Say No quickly became the punch line of choice for standup comedians and T-shirt designers. Some of the pop culture infamy of the slogan is attributable to its kitschy, funny quality, but some of its fame is a direct result of its ineptitude. A child who tempted to try out his or her parents’ prescription drugs or to experiment with household cleaners is ill counseled by Just Say No. What are the ill effects? What are the health risks? Children wondering why they should Just Say No received little in reply to their curiosities unless they actively sought the answer from adults. Public service announcements of the 1980’s presupposed a general understanding among citizens about why one should not engage in the improper use of drugs, be they legal or illegal, quite a supposition when kids are included in the mix.
The Eighties in America
Just Say No campaign
■
559
Nancy Reagan appears flanked by kids sporting “Just say no” shirts in 1988. (Hulton Archive/Getty Images)
Another problem with the slogan was how it fit into the government’s total strategic antidrug package. A slogan has little to do with actual drug use and curtailment, and something must be done to curb use along with the catchphrase. Regarding implementation of antidrug efforts, in the 1980’s a multitude of blue ribbon panels and study groups were convened, but no large bureaucratic entity ever emerged as the obvious go-to-group for spearheading antidrug efforts. The military resisted being the soldiers in this particular “war,” and they more or less succeeded in staying out of the fracas. That left local police forces and covert government groups such as the Central Intelligence Agency (CIA) to tackle the problem. Using groups steeped in secrecy to combat drugs comes with a host of problems, but local policing was used to similar ill effect, presenting no panacea to the drug problem. What many thought to be overly tough mandatory minimum
sentencing laws thwarted the best efforts of police and judges, who deemed them too harsh for firsttime offenders. The result was decision making by judges and police officers who did their best to counteract antidrug laws seen in professional circles as being political rather than effective. In Washington, D.C., the nail in the coffin of the Just Say No campaign was the widely held perception that it was political grandstanding more than sound policy. Whether or not that assertion is true, statistics indicate that drug use during the 1980’s did not decline, and the slogan that turned into a popular culture punch line has not been judged kindly by history. Impact The Just Say No slogan has been statistically shown to have had little to no impact on illegal drug use during the 1980’s. Its use was part of a broader Reagan administration strategy of targeting drug
560
■
Just Say No campaign
users rather than focusing solely on drug suppliers. Soon after Nancy Reagan popularized the phrase, it became a subject of ridicule in popular culture. Today, it stands as a symbol of failed 1980’s “get tough on crime” policies, which were heavy on rhetoric as well as tough sentencing. Further Reading
Bayer, Ronald, ed. Confronting Drug Policy. New York: Press Syndicate of the University of Cambridge, 1993. This volume shows the post-1980’s angst among scholars wrestling with how to move forward with drug policy. Hamid, Ansley. Drugs in America. Greensburg, Md.: Aspen, 1998. An excellent book that gives the reader an overview of the illicit drug situation in the United States. Inciardi, James A. Handbook of Drug Control in the
The Eighties in America
United States. New York: Greenwood Press, 1990. This book provides a solid picture of where drug policy stood at the end of the decade and glimpses of where it was poised to go. Robinson, Matthew B., and Renee G. Scherlen, eds. Lies, Damned Lies, and Drug War Statistics. Albany: State University of New York Press, 2007. One need only regularly use reported government statistics, particularly those related to crime, to determine their severe limitations. This book is highly informative to the topic at hand, as well as being useful to budding social scientists in general. R. Matthew Beverlin See also Crack epidemic; Crime; Drug Abuse Resistance Education (D.A.R.E.); Reagan, Nancy; Reagan, Ronald; Reagan Revolution; Slang and slogans.
K
■ Keillor, Garrison Identification
American radio humorist and writer Born August 7, 1942; Anoka, Minnesota As creator and host of A Prairie Home Companion, Keillor revived the variety show for American live radio and mastered that venue for poking fun at people and institutions. The gently satirical yarns at the core of his 1980's musical radio program—along with Keillor's humorous books, essays, and literary sketches—explored the foibles of ordinary life and created a sense of shared, if flawed, humanity. Garrison Keillor's musical variety show became weekly Saturday night fare on Minnesota Public Radio after its 1974 debut in St. Paul, Minnesota. Written almost entirely by Keillor during its initial thirteen-year run, A Prairie Home Companion was nationally broadcast by 1980 and, by 1987, had reached some 279 public radio stations and 4 million listeners. It featured skits, parodies, mock commercials, self-deprecating poems, and songs by eclectic musical guests, but the show's acclaimed centerpiece was Keillor's unhurried monologue, the "News from Lake Wobegon"—folksy, benevolent stories about a Minnesota farming hamlet. Characterized by down-home musings and rural midwestern flavor—and reminiscent of classic American storytelling by Mark Twain, Will Rogers, and James Thurber—the tales followed taciturn bachelors, preachers, and a circle of self-contained midwesterners grappling with marriage, small-town politics, youthful hijinks, and seasonal rituals. The stories avoided neat conclusions, allowing dark and bittersweet moments to mix with nostalgia and reflection. Unlike mainstream comedy of the 1980's, Keillor's live humor was understated, wholesome, and anachronistic. His shy, first-person narratives forged an intimate connection between radio personality and audience; as audiocassette recordings (including a Grammy Award-winning recording), they
helped make Keillor a celebrity and an American cultural phenomenon. Newspapers and magazines of the time claimed that yarn-spinning alone did not explain Keillor's enormous success; history and national identity also contributed: Modern urbanites, the media posited, craved the vintage sense of place and belonging that Keillor offered. Of four books Keillor wrote in the 1980's, the decade's two favorites drew on his radio vignettes and anecdotes: Lake Wobegon Days (1985), released as Keillor's face graced the cover of Time magazine, was the year's top-selling hardcover work of fiction; its sequel, Leaving Home (1987), was nearly as popular. Stories and verse Keillor wrote for Harper's Magazine, The Atlantic Monthly, and The New Yorker were reprinted in Happy to Be Here (1982) and We Are Still Married (1989). From 1987 to 1992, Keillor was a staff writer for The New Yorker, renowned for its finely crafted humor about little people driven mad by modern annoyances. Like his oral tales, Keillor's magazine fiction favored the amusingly dated over the pretentious or excessive. In print, however, Keillor showed less sympathy for his subjects, especially for politicians. In 1989, Keillor started a new radio program, American Radio Company of the Air, featuring Lake Wobegon yarns, as well as observations of city life by an adopted New Yorker. Impact By resurrecting comic radio programming that had entertained Americans for decades before television—and reaching a vast audience through multimedia marketing—Keillor sparked interest in the Midwest and seemingly narrowed the divide between rural and urban Americans in the 1980's. Further Reading
Lee, Judith Yaross. Garrison Keillor: A Voice of America. Jackson: University Press of Mississippi, 1991. Scholl, Peter A. Garrison Keillor. New York: Twayne, 1993. Wendy Alison Lamb
See also
Book publishing; Comedians; Country music; Literature in the United States; Music; Theater.
■ Kincaid, Jamaica Identification Antiguan American feminist writer Born May 25, 1949; St. John’s, Antigua
Kincaid's fiction offered an angry and idiosyncratic new voice to American readers, as it addressed racism, colonialism, and the repression of women. By the eve of the 1980's, Jamaica Kincaid had already left Antigua and established herself in New York. Born Elaine Potter Richardson, she had changed her name and had connected with the New York literary scene, where she was an outspoken critic of the many varieties of social bigotry. Her first book was a collection of short stories, At the Bottom of the River (1983), the first story of which was the often-anthologized "Girl." This short-short story is composed almost entirely of an island mother's instructions to her daughter about how to live her life, from
cooking to cleaning to spitting, always with a refrain that indicates the mother's suspicions that, despite her daughter's protests to the contrary, the girl is bent on becoming a "slut." Tensions between mothers and daughters would form a major theme in Kincaid's later work. Kincaid's first novel, Annie John, was published in 1985. Heavily autobiographical, like all her work, it not only depicts a mother and daughter whose bond is broken because of the mother's betrayal but also reveals the contradictions between island life and the irrelevant education provided by a colonial government. Annie John's resistance to that schooling emerges as she rejects her teachers' efforts to make her into a proper English schoolgirl, particularly when they repress her sexuality. Another element of her rebellion arises when she makes a shrine of a picture of Christopher Columbus in chains. An English education is essentially anomalous in the Caribbean, as Annie John knows. Kincaid pursued her examination of the racist results of colonialism in A Small Place (1988), a nonfiction account of her visit to Antigua in 1986 on a Guggenheim Fellowship. Kincaid's anger is palpable in the text, as she describes the disparity between the lives and attitudes of the island's white tourists and those of its black and Carib Indian residents. The tourists' blindness to the island's heritage of repression and slavery is analogous to the blindness of a colonial government that required island schoolchildren to study English poetry. Kincaid was equally angry at Antiguans' willingness to accept colonialist standards of life and thought, even after the island achieved independence, seeing it as a result of Antiguan greed for tourists' dollars. Not surprisingly, the book was rejected by many Antiguan and English readers, but it became Kincaid's stepping-stone to her next novel, Lucy (1990), in which she would examine similar blind spots in the United States. Impact Jamaica Kincaid's voice was one of a rising tide of angry feminist writers and writers of color in the 1980's. Her particular addition to that chorus was her decision to force Americans to consider the role of colonialism in racism and the repression of women. Further Reading
Jamaica Kincaid. (Sigrid Estrada)
Bloom, Harold, ed. Jamaica Kincaid: Modern Critical Views. New York: Chelsea House, 1998. Paravisini-Gebert, Lizabeth. Jamaica Kincaid: A Critical
Companion. Westport, Conn.: Greenwood Press, 1999. Simmons, Diane. Jamaica Kincaid. New York: Twayne, 1994. Ann D. Garbett See also
African Americans; Beloved; Feminism; Immigration to the United States; Literature in the United States; Multiculturalism in education; Racial discrimination.
■ King, Stephen Identification American writer Born September 21, 1947; Portland, Maine
King was an extremely popular and influential writer of supernatural and horror novels. The 1980's saw the publication of some of his best and worst books, in which he explored both personal and cultural anxieties. By 1980, Stephen King was already, as he put it, "a brand name." The author's success and fame allowed him to write what he wanted rather than tailor his fiction to the marketplace. King moved into fantasy, as well as supernatural horror in realistic settings, with The Gunslinger (1982, revised 2003) and The Drawing of the Three (1987), the first two books in his seven-volume Dark Tower series. He also collaborated with Peter Straub on a quest novel, The Talisman (1984), and published a young-adult novel, The Eyes of the Dragon (1986). King's best books of the decade were arguably Pet Sematary (1983) and Misery (1987). The former explores the fear of death and being left behind, as well as obsession and denial. The latter, containing much horror but no supernatural content, examines the nature of writing, especially popular writing, and an author's relationship to his fans. King wrote several novels under the pseudonym Richard Bachman. Three of these were published during the 1980's: Roadwork (1981), The Running Man (1982), and Thinner (1984). The latter two in particular commented upon the culture of the 1980's. The Running Man portrays a future society in which a popular game show involves contestants surviving as long as they can before being hunted down and murdered—an extrapolation from the pandering nature of network television. Thinner both parodies and supports America's emphasis on slenderness.
Stephen King. (Tabitha King)
Late in the decade, King was "outed" when it became public knowledge that he and Bachman were one and the same. The Dark Half (1989) represented King's response to this revelation. Many of King's novels demonstrate his deftness with child and adolescent characters: In Firestarter (1980), a young girl lights fires psychically; in Cujo (1981), a boy and his mother are endangered by a rabid St. Bernard; in Christine (1983), an alienated teen loves his demoniacally possessed car; and in It (1986), one of King's best-structured novels, the protagonists are beset by a monster in their youth and then face it again as adults. The Tommyknockers (1987) is arguably King's worst book, but it fascinatingly reflects the delights and perils of cocaine, to which the author later admitted being addicted. As occurs in Christine, the novel's shifts in narrative perspective undermine its story. The decade also brought Different Seasons (1982), featuring four novellas by King, and Skeleton Crew (1985), his second
short-story collection. Books about King's writing began to appear in the mid-1980's, and the genre of King criticism would burgeon in the next decade. King's own study of the horror genre, Danse Macabre, appeared in 1980. Many film adaptations of King's books premiered during the decade, including Stanley Kubrick's The Shining (1980), which appealed to film fans more than to King fans. Impact King is best known by his fans for disturbing situations, concepts, and events that stay with readers after the book is finished. Some read his novels for these simple, visceral reactions, but others see in King's work a chronicle and an exploration of the anxieties and terrors distinctive of his time and place. King's novels continued to resonate with American culture in the 1980's, and they therefore form a key to understanding that culture in later years. He captured the surface details and brand names of daily life, as well as themes such as Americans' love/hate relationships with cars, dieting, television shows, and even popular novelists. Further Reading
King, Stephen. On Writing. New York: Scribner, 2000. Magistrale, Tony. Stephen King: The Second Decade. New York: Twayne, 1992. Bernadette Lynn Bosky See also Book publishing; Film in the United States; Horror films; Literature in the United States.
■ Kirkpatrick, Jeane Identification
American stateswoman and academic Born November 19, 1926; Duncan, Oklahoma Died December 7, 2006; Bethesda, Maryland Kirkpatrick served as the U.S. ambassador to the United Nations, where she was a rigorous advocate for American policies on disarmament, the Falkland Islands War, the Soviet incursion in Afghanistan, the developing Solidarity movement in Poland, and the continuing crises in the Middle East. Kirkpatrick was respected by friends and foes alike and feared by some because of her intelligence, quick wit, and passionate commitment to the United States’ renewed anticommunist policies. Jeane Kirkpatrick flirted with socialism during the 1940’s, but she later became an anticommunist. She
joined the Democratic Party and became a foreign policy adviser to Hubert Humphrey during the 1968 presidential campaign. She drifted away from the Democrats, however, because of the foreign policy of President Jimmy Carter, which she viewed as not adequately anticommunist. During the 1980 presidential campaign, Kirkpatrick became a foreign policy adviser to the Republican candidate, Ronald Reagan. After his victory, Reagan appointed her to serve as U.S. ambassador to the United Nations; Kirkpatrick was the first American woman to serve in that capacity. Kirkpatrick maintained that communism was the primary enemy of the United States and its culture of freedom, individualism, and free trade. Her anticommunist focus led Kirkpatrick to lend American support to like-minded authoritarian regimes in Latin America and Asia. At the Republican National Convention in 1984, Kirkpatrick lambasted the Democratic Party and its candidate for president, Walter Mondale, for their naïveté about Soviet foreign policy and their failure to appreciate the values of freedom and individual rights that were the hallmarks of American culture. After Reagan’s second inauguration on January 21, 1985, Kirkpatrick resigned as U.S. ambassador to the United Nations. She joined the Republican Party and returned to Georgetown University to teach and write; she argued repeatedly that the Democratic Party of Roosevelt, Truman, and Kennedy had been taken over by a group of left-wing appeasers who jeopardized American interests by following a policy of accommodation with the Soviets. In later years, Kirkpatrick remained a forceful voice for procapitalist policies and conservatism in politics. Impact As U.S. ambassador to the United Nations, Kirkpatrick served as the voice of the Reagan administration on the world stage. She supported Reagan’s policy of confrontation with the Soviet Union during the Cold War, and she was at the center of the American response to the Arab-Israeli struggle, the rapid turnover in Soviet leadership, and the turmoil in Latin America. Further Reading
Gerson, Alan. The Kirkpatrick Mission—Diplomacy Without Apology: America at the United Nations. New York: Free Press, 1991.
Harrison, Pat. Jeane Kirkpatrick. American Women of Achievement. New York: Chelsea House, 1991. Kirkpatrick, Jeane. Making War to Keep Peace. New York: Regan Books, 2007. William T. Walker See also Cold War; Elections in the United States, 1980; Foreign policy of the United States; Grenada invasion; Israel and the United States; Reagan, Ronald; Reagan Doctrine; Reagan’s “Evil Empire” speech; Strategic Defense Initiative (SDI); United Nations.
■ Kiss of the Spider Woman Identification U.S.-Brazilian dramatic film Director Hector Babenco (1946-    ) Date Released July 26, 1985
The film adaptation of Argentinean author Manuel Puig's 1976 novel broke new ground for independent film and earned international praise as one of the most successful independent films of its time. Despite the film's difficulties securing financial backing, Kiss of the Spider Woman won critical acclaim at the 1985 Cannes Film Festival, where William Hurt was awarded Best Actor, and the film soon realized international success. The motion picture drama became the first independent film to earn multiple Academy Award nominations (for Best Picture, Best Director, and Best Screenplay) and was also the first to garner an Academy Award for Best Actor, again for Hurt. The film's other noteworthy performances included those of Raul Julia as Valentin and Brazilian actress Sonia Braga in the multiple roles of Leni Lamison, Marta, and the Spider Woman. Kiss of the Spider Woman is predominantly set in a dingy cell of a nameless Latin American prison. Luis Molina (Hurt), a homosexual window dresser imprisoned for corrupting a minor, is an idealist who refuses to sacrifice fantasy for reality, instead immersing himself in the nostalgia and glamour of memories of his favorite films. Valentin Arregui (Julia) is a revolutionary, tortured and imprisoned for his involvement with an anti-government party. In contrast to Molina, Valentin has chosen to sacrifice emotion for political ideology. Despite the two men's tremendous dissimilarities, their confinement forces them to confront their own beliefs and form a bond
that transcends the boundaries of sexuality and politics; the two men ultimately realize through their love for each other what constitutes humanity. Also noteworthy is a subplot with Sonia Braga’s multiple characters at its center. In the prison cell, Molina narrates tales to Valentin, one recapitulating an old Nazi propaganda film, another about the mysterious Spider Woman. These narratives offer temporary escape for both men: The stories distract Valentin from debilitating pain and anxiety; they free Molina from the confines of physical maleness, allowing him to access a world in which he might experience love. Molina’s storytelling permeates the main plot, weaving an imaginative world of possibility and freedom into the men’s dark reality. Impact The independent U.S.-Brazilian coproduction Kiss of the Spider Woman gained international success, demonstrating the viability of such a project. It was also a significant critical success, garnering particular recognition for Hurt, whose performance as the homosexual Molina earned an Academy Award. While most attention devoted to this film was favorable, however, some critics were conflicted about Hurt’s performance and suggested that his portrayal of a flamboyant, feminine homosexual perpetuated negative stereotypes. Despite some controversy, Kiss of the Spider Woman had a tremendous impact upon the 1980’s, prompting questions about politics, sexuality, and human nature. Further Reading
Hadleigh, Boze. The Lavender Screen: The Gay and Lesbian Films—Their Stars, Makers, Characters, and Critics. New York: Citadel Press, 1993. Santoro, Patricia. “Kiss of the Spider Woman—Novel, Play, and Film: Homosexuality and the Discourse of the Maternal in a Third World Prison.” In Framing Latin American Cinema: Contemporary Critical Perspectives, edited by Ann Marie Stock. Minneapolis: University of Minnesota Press, 1997. Wiegmann, Mira. The Staging and Transformation of Gender Archetypes in “A Midsummer Night’s Dream,” “M. Butterfly,” and “A Kiss of the Spider Woman.” Lewiston, N.Y.: Edwin Mellen Press, 2003. Danielle A. DeFoe See also
Academy Awards; Film in the United States; Homosexuality and gay rights; Hurt, William; Theater.
■ Klinghoffer, Leon Identification American victim of terrorism Born September 24, 1916; New York, New York Died October 8, 1985; Mediterranean Sea, off
the Egyptian coast Klinghoffer was killed by Palestinian terrorists during the hijacking of the Achille Lauro cruise ship. His death epitomized, and put a human face on, the senseless, cause-based killings of the international terrorism spree of the 1980's. Four men representing the Palestine Liberation Front (PLF) gained passage aboard the Italian-flagged ship Achille Lauro in 1985 using false passports. When the ship's steward, mindful of the men's suspicious behavior, entered their cabin, he found them cleaning guns, forcing the hijackers to execute their takeover plan earlier than they had intended. The hijacking was not difficult, however, because most of the Achille Lauro's passengers were participating in a day trip in Egypt. Those people remaining aboard the ship were citizens of a variety of nations. In short order, the terrorists separated the American and British citizens from the remainder of the ship's passengers and crew. Among the British and Americans was sixty-nine-year-old Leon Klinghoffer, a wheelchair-bound retiree, accompanied by his wife, Marilyn Klinghoffer. The hostage-takers shot Leon dead because he was Jewish. They then forced two crew members to throw the body overboard at gunpoint and ordered the ship out to sea. The U.S. government was soon made aware of the hijacking, but initially not of Klinghoffer's murder. A federal antiterrorist group dispatched U.S. military resources to the region, including the elite SEAL Team Six special operations unit. On the third day of the ordeal, the ship anchored off Port Said, Egypt, and the terrorists were flown toward Tunisia on an Egyptian commercial aircraft. Just four months earlier, an American serviceman had been killed during the hijacking of Trans World Airlines (TWA) Flight 847, and the Ronald Reagan administration was in no mood to see the perpetrators of the Achille Lauro hijacking escape justice. A flight of U.S. Navy jets forced the Egyptian plane to land in Italy, where the four hijackers were arrested. The PLF was a militant faction of the Palestine Liberation Organization (PLO), headed by the
notorious Mohammed Abu Abbas. At the time of the Achille Lauro incident, which he masterminded, Abbas held a top position within Yasir Arafat's cabinet. Abbas was actually on the Egyptian plane as well, but he was allowed to stand at the doorway of the aircraft and wave at American forces. He was given safe passage because of a disagreement between the U.S. government and the Italian government. As a result, Abbas remained at large for the better part of two decades. Impact Klinghoffer's death prompted the U.S. Congress to pass Concurrent Resolution 213, which called for the formation of a multinational force designed to address international terrorism. Although Marilyn Klinghoffer died of natural causes months after her husband's slaying, she was able to testify before Congress about the hijacking prior to her death. Later, Klinghoffer's family would form the Leon and Marilyn Klinghoffer Foundation in association with the Anti-Defamation League. In the short run, the hijacking of the Achille Lauro and the murder of Leon Klinghoffer inflamed ethnic hatred and resulted in the retaliatory killing of an Arab American college professor in California who spoke out in the media about the shamefulness of the violent hijacking. In the long run, the incident, taken in combination with the TWA Flight 847 hijacking and the 1983 attack upon the U.S. Marine Corps compound in Beirut, highlighted ongoing Middle Eastern tensions that, although somewhat dissipated by the Camp David Accords of the previous decade, remained a source of violence and strife both within the Middle East and between that region and the West. Further Reading
Bohn, Michael K. The Achille Lauro Hijacking. Washington, D.C.: Brassey's, 2004. U.S. House of Representatives. Committee on Foreign Affairs. Aftermath of the Achille Lauro Incident: Hearing and Markup Before the Committee on Foreign Affairs and Its Subcommittee on International Operations. Washington, D.C.: Government Printing Office, 1985. Wills, David C. The First War on Terrorism. Lanham, Md.: Rowman and Littlefield, 2003. R. Matthew Beverlin See also
Beirut bombings; Middle East and North America; Pan Am Flight 103 bombing; Terrorism.
■ Knoxville World’s Fair The Event
International exposition focused on energy and energy-related technologies Date May 1, 1982 to October 31, 1982 Place Knoxville, Tennessee The 1982 World’s Fair in Knoxville celebrated energy efficiency, usage, and alternatives and brought 11 million visitors to pavilions and exhibits from over two dozen nations. Following a decade of energy shortages, in the early 1980’s the city of Knoxville in East Tennessee emerged as a likely host for an international energy exposition. The town of nearly 200,000 boasted close proximity to the Oak Ridge National Laboratory; served as the headquarters of the nation’s largest utility, the Tennessee Valley Authority; and was adjacent to the University of Tennessee’s energy research facilities. Knoxville’s access to major interstate highways and closeness to the Great Smoky Mountains National Park also made tourist traffic likely. City leaders rallied around the concept of hosting an international exposition as a way to revitalize Knoxville’s downtown, improve the interstate system, and attract outside industry to the mountain town. Event planners settled on “Energy Turns the World” as the exposition’s theme. Leaders selected a narrow tract of land between downtown and the University of Tennessee’s campus, a tract once known as Scuffletown, as the location for the exposition. Following approval from the Bureau of International Expositions in Paris, Knoxville planners secured funding from the federal government and issued city bonds to help raise the $115 million required to stage the event. State and federal funds also became available for significant interstate highway improvements around the city. On May 1, 1982, President Ronald Reagan officially opened the Knoxville International Energy Exposition (also known as Energy Expo ’82, the Knoxville World’s Fair, and the 1982 World’s Fair). The 266-foot-tall Sunsphere overlooked the grounds and served as the event’s symbol. Visitors marveled at pavilions and exhibits including multilingual computers from Japan, solar collectors from Saudi Arabia, bricks from the Great Wall of China, a giant Rubik’s Cube from Hungary, an unwrapped Peruvian mummy, and talking robots from the United States. Fairgoers experienced daily parades, nightly
fireworks, marching bands, midway rides, and entertainment performances by Bob Hope, Debby Boone, Johnny Cash, and many others. Professional football and basketball exhibition games were also held nearby. On October 31, 1982, Energy Expo '82 closed after hosting over 11 million visitors, making it one of the top-drawing fairs in American history. Impact The 1982 World's Fair brought the small city of Knoxville, Tennessee, to the forefront of international attention. The production and conservation of energy proved a timely and relevant theme for Americans in the 1980's. While the fair addressed many of the world's energy problems, it brought about no significant innovations in energy. Further Reading
Dodd, Joseph. World Class Politics: Knoxville’s 1982 World’s Fair, Redevelopment, and the Political Process. Salem, Wisc.: Sheffield, 1988. Findling, John E., ed. Historical Dictionary of World’s Fairs and Expositions, 1851-1988. New York: Greenwood Press, 1990. Wheeler, William Bruce. Knoxville, Tennessee: A Mountain City in the New South. 2d ed. Knoxville: University of Tennessee Press, 2005. Aaron D. Purcell See also
Louisiana World Exposition; National Energy Program (NEP); Reagan, Ronald; Science and technology; US Festivals.
■ Koop, C. Everett Identification
Surgeon general of the United States, 1981-1989 Born October 14, 1916; Brooklyn, New York Surgeon General C. Everett Koop was not afraid to speak out on controversial health issues and inform Americans of the dangers of smoking and AIDS. C. Everett Koop became surgeon general of the United States in November, 1981, after an unprecedented nine-month-long confirmation battle in the Senate. A distinguished pediatric surgeon, Koop was a conservative Republican, an evangelical Christian, and a staunch opponent of abortion. He seemed an ideal candidate to promote the pro-life health policies of newly elected president Ronald Reagan because of his strong anti-feminist, anti-gay, and anti-abortion positions.
U.S. surgeon general C. Everett Koop answers questions about the health effects of smoking during a press conference on May 16, 1988. (AP/Wide World Photos)
His confirmation was bitterly opposed by the American Medical Association (AMA), the American Public Health Association (APHA), and pro-choice and gay-rights groups. Few believed his promise not to impose his personal values on the public. However, once confirmed, he proved controversial to the Right as well as to the Left: In 1982, resisting political pressure from tobacco lobbyists and the White House, Koop began an anti-smoking campaign. He attributed 30 percent of all cancer deaths to smoking and advocated a "smoke-free society." Few people knew anything about acquired immu-
nodeficiency syndrome (AIDS) in the early 1980’s. By 1985, over ten thousand Americans, mostly homosexual men or intravenous drug users, were dying of the disease. The White House maintained a policy of silence about the disease for almost six years, in part because some conservative extremists believed AIDS was God’s punishment for immorality and in part because the president was not comfortable speaking about sexually transmitted diseases in public. However, public pressure finally forced Reagan to order the surgeon general to prepare a report on the epidemic. In 1986, after interviewing medical experts, hospitalized AIDS patients, and gay and lesbian leaders, Koop released his candid report on the dangers of
AIDS and unsafe sex practices. He called on all U.S. schools to begin education programs stressing those dangers and teaching the proper use of condoms to young students. Over objections from the White House, Congress approved a shortened version of the report. In 1987, Koop shocked conservatives by advocating condom commercials, and in 1988, an eight-page brochure of AIDS information prepared by Koop was mailed to every American household (114 million copies in all). Meanwhile, when Reagan asked for a report on the effects of abortion on women’s health, Koop delivered the opinion that there was insufficient scientific evidence to support the belief that abortion caused significant health risks to women.
Impact Koop gave the U.S. Office of the Surgeon General a higher profile than it had ever enjoyed before, using the media effectively to inform the public of two of the greatest health risks of his time, smoking and AIDS. When he resigned in 1989, the proportion of American smokers had decreased from one-third to one-fourth of all adults. His personal integrity, willingness to separate politics from scientific medicine, and concern for the health of all Americans earned him the respect of the public and began a new era in public health.
Further Reading
Bianchi, Anne. C. Everett Koop: The Health of the Nation. Brookfield, Conn.: Millbrook Press, 1992. Koop, C. Everett. Koop: The Memoirs of America's Family Doctor. New York: Random House, 1991. Edna B. Quinn
See also Abortion; AIDS epidemic; Congress, U.S.; Conservatism in U.S. politics; Education in the United States; Feminism; Health care in the United States; Medicine.
L
■ L.A. Law Identification Television drama series Producers Steven Bochco (1943-    ), David E. Kelley (1956-    ), and William M. Finkelstein (1952-    ) Date Aired from September 15, 1986, to May 19, 1994
Though inspired by the tradition of courtroom hits like Matlock, this popular legal drama represented a new and influential television form that featured good-looking, slick, well-educated professionals engaged in both heated legal debates and passionate affairs, sometimes with one another. Part of the long-standing National Broadcasting Company (NBC) tradition of a strong Thursday night lineup, L.A. Law debuted in a two-hour pilot during the fall of 1986 with dramatic theme music composed by Mike Post and a cast of relatively unknown but very attractive actors. Harry Hamlin played Michael Kuzak, a conscientious liberal attorney who worked at a Los Angeles law firm headed by Leland McKenzie (Richard Dysart), the wise old partner who still had some youthful passion and humor. Kuzak was joined by opportunistic divorce attorney Arnold Becker (Corbin Bernsen), powerful litigator Ann Kelsey (Jill Eikenberry), tax attorney Stuart Markowitz (Michael Tucker), and Douglas Brackman, Jr. (Alan Rachins), the son of an original partner. Kuzak's on-again, off-again sparring (and bedroom) partner at the district attorney's office was Grace Van Owen (Susan Dey, in her first major television role since The Partridge Family). Jimmy Smits joined the firm later in the first season as junior partner Victor Sifuentes. Throughout the show's seasons on the air, other attorneys came to the boardroom table, and the composition of the firm changed frequently, especially after the exodus of Hamlin, Smits, and Dey in 1991 and 1992.
Impact L.A. Law was Steven Bochco's first successful follow-up to Hill Street Blues, and it established that the innovative, ensemble-driven television drama could succeed in more than one incarnation. Bochco teamed with Terry Louise Fisher, a writer and producer on Cagney and Lacey, who brought a feminist sensibility to the show's depiction of women in the workplace. The show was not necessarily innovative when compared with its creators' previous efforts, but by following in their wake, it established that the changes wrought by Bochco and Fisher would continue to shape prime-time network television during the 1980's. Subsequent Events Producer and writer David E. Kelley left L.A. Law in 1991 to pursue other projects. By this time, Bochco and Fisher had already left as well. These departures, along with the loss of the three stars, took a major toll on the quality and appeal of the show. Later seasons featured melodramatic plots and ridiculous characters, and the show died with a relatively small and disappointed audience in 1994. Further Reading
Brigham, John. “L.A. Law.” In Prime Time Law: Fictional Television as Legal Narrative, edited by Robert M. Jarvis and Paul R. Joseph. Durham, N.C.: Carolina Academic Press, 1998. Brooks, Tim. The Complete Directory to Prime Time Network and Cable TV Shows: 1946-Present. 8th ed. New York: Ballantine, 2003. Schwartz, Tony. “Steven Bochco Goes from Hill Street to the Taut Glitz of L.A. Law.” New York, September 15, 1986, 62. Thompson, Robert J. Television’s Second Golden Age: From “Hill Street Blues” to “ER.” Syracuse, N.Y.: Syracuse University Press, 1997. Jennifer Heller See also
Cagney and Lacey; Hill Street Blues; Television.
■ LaRouche, Lyndon Identification
American political activist and candidate Born September 8, 1922; Rochester, New Hampshire Over the many decades of his activism, LaRouche became a lightning rod for controversy. His followers regarded him as a bold and original voice in American politics, while many mainstream political thinkers dismissed him as paranoid and extremist. By the time the 1980's began, Lyndon LaRouche had been an activist and organizer of leftist political groups for more than two decades. By then, however, his ideologies and tactics had veered substantially away from traditional leftist doctrine and become more idiosyncratic, even to the point that his speeches echoed some of the rhetoric of the far Right. In the 1980's, his activities cast a wide net. He was active in, among other things, supporting (and even taking partial credit for) President Ronald Reagan's Strategic Defense Initiative (SDI), proposing a ballot initiative in California (Proposition 64) to force more aggressive measures to con-
tain acquired immunodeficiency syndrome (AIDS), advocating the colonization of Mars, meeting with leaders of developing nations to discuss economic policy, and founding the Schiller Institute in Germany as a clearinghouse for his ideas. The factor that brought LaRouche the most attention during the decade, though, was perhaps his status as a perennial candidate for U.S. president during the primary season. He ran as a candidate for the Democratic Party nomination in 1980, 1984, and 1988, buying television airtime to promote his candidacy but never receiving more than a tiny sliver of the primary vote or being accepted by the party as a serious candidate. He had run in 1976 as the candidate of the U.S. Labor Party, which he helped found, and he would continue to run as a dark horse Democrat in the 1990's and beyond. By the early 1980's, LaRouche was being investigated for a number of possibly illegal activities. In 1988, he was convicted of several tax violations, as well as conspiracy to commit mail fraud. He served five years of a fifteen-year sentence before being paroled. Unsurprisingly, LaRouche and his supporters called the affair a "show trial" and claimed the case against him was trumped up to suppress his activism. Perhaps more surprisingly, LaRouche continued many of his political activities from his jail cell, including a 1992 presidential primary run. His supporters during his imprisonment consistently referred to him as a "political prisoner." Impact Critics from both the Right and the Left tended to portray LaRouche as out of touch with political reality, a conspiracy theorist, a cult leader, and an egomaniac. His supporters saw these criticisms as evidence of a political establishment unready for LaRouche's genius and unwilling to put their own cherished ideas to the test. While his ideas never became mainstream, he remained a prominent figure on the American political scene throughout the 1980's and beyond.
Lyndon LaRouche. (AP/Wide World Photos)
Further Reading
King, Dennis. Lyndon LaRouche and the New American Fascism. New York: Doubleday, 1989. Mintz, John. “Presidential Candidate’s Ideological Odyssey.” The Washington Post, January 14, 1985, p. A1+. Tourish, Dennis, and Tim Wohlforth. “The Travels of Lyndon LaRouche.” In On the Edge: Political Cults Right and Left. New York: M. E. Sharpe, 2000. Janet E. Gardner See also
Conservatism in U.S. politics; Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988; Liberalism in U.S. politics; Strategic Defense Initiative (SDI).
■ Last Temptation of Christ, The Identification American film Director Martin Scorsese (1942-    ) Date Released August 12, 1988
The Last Temptation of Christ explored the conflict between Christ’s divine identity and his mortal identity, incorporating an extended fantasy sequence in which he experienced a normal human life, including a sexual relationship, before choosing to embrace his divine role as humanity’s savior. The film proved incredibly controversial, inciting protests and boycotts across the country. As early as the mid-1970’s, the successful film director Martin Scorsese decided he wanted to make a movie about the life of Jesus Christ based on Ho teleutaios peirasmos (1955; The Last Temptation of Christ, 1960), the controversial novel by Greek writer Nikos Kazantzakis. Using a script prepared by Paul Schrader, Scorsese began work on the film in 1983 at Paramount Studios, but various Christian groups learned of his activity, and written protests began pouring into Paramount’s executive offices. Worried about the negative publicity and concerned that the film might not be a box-office success, Paramount withdrew its support; Scorsese was forced to shut down production and shelve the project for several years. Four years later, Scorsese signed on to work with Creative Artists Agency, the most powerful talent agency in Hollywood, and with its support he was able to convince Universal Studios to underwrite the Kazantzakis project. Working on a budget of less
than $7 million, Scorsese took his cast to the Moroccan desert for two months of shooting, using an adaptation of Schrader's original script. Again, word that the project was being resurrected reached leaders of Christian churches and organizations, and an even more strident campaign was launched. To counter negative publicity and blunt charges that the film would be blasphemous, Universal hired consultants from Christian groups to work with Scorsese and certify that the movie was an acceptable, if somewhat controversial, adaptation of the Gospel stories. These consultants quit in protest, however, when they were not able to mandate changes to the script. Several key leaders among the Christian community were invited to an advance screening of the movie in July, 1988, but many refused to attend. Though the media often described protesters as fundamentalists, in fact the group opposing the movie included conservative Protestant denominations, the Roman Catholic Church, and several Jewish and Muslim leaders as well. Sensing that the longer the studio delayed release, the better organized opposition would become, Universal released the movie six weeks early, on August 12, 1988. Like the novel, the film portrays Christ upon the cross being tempted by Satan with the possibility of abandoning his divine nature to live a happy, normal, mortal life. An extended sequence allows both Christ and the audience to experience that life, as he descends from the cross, marries Mary Magdalene, and builds a family with her, happily growing old as a normal man. In the end, however, after Judas reminds him of his obligations, Christ decides to return to the cross to redeem humanity. In another potential source of controversy, the film portrays Judas as heroic: He betrays Christ to the Romans only because Christ tells him to, knowing it is the role assigned to him in God's plan. Bowing to protests from Christian groups, several major movie house chains refused to carry the film. In cities where the movie was shown, protests were organized outside theaters; some were vandalized, and in some locations violence broke out. Additionally, in a number of cities, individuals and organizations filed legal appeals to stop distribution or be paid damages under laws protecting the rights of those who felt ridiculed by the film. Impact The protests against the film affected box-office proceeds significantly. The Last Temptation of
Christ grossed less than $8.4 million in the United States and approximately half that amount in other countries. Suits brought against Scorsese and Universal Studios were decidedly less successful, however, as decisions handed down in several state and federal courts reaffirmed the director’s and studio’s First Amendment rights to produce and distribute the movie despite its objectionable subject matter. Beyond its narrative, the film was noteworthy for its set design, which was lauded by critics for generating a sense of what Galilee would have looked like in the time of Christ. Similarly, the innovative sound track by New Wave musician Peter Gabriel was composed and performed primarily on instruments that existed at the time that Christ lived. Further Reading
Keyser, Lester J. Martin Scorsese. New York: Twayne, 1992. Riley, Robin. Film, Faith, and Cultural Conflict: The Case of Martin Scorsese’s “The Last Temptation of Christ.” Westport, Conn.: Praeger, 2003. Laurence W. Mazzeno See also Film in the United States; Religion and spirituality in the United States; Scorsese, Martin.
■ Latin America Definition
The Western Hemisphere nations located south of the United States
During the 1980’s, the United States’ promotion of its own interests in Latin America led it frequently to support brutally oppressive regimes at the expense of the human rights of the Latin American people. U.S. foreign policy in Central and South America and the Caribbean was shaped decisively by the Cold War in the 1980’s. Fidel Castro’s Marxist government remained in power in Cuba. In 1979, meanwhile, the Marxist Sandinista National Liberation Front had overthrown Nicaragua’s Anastasio Somoza Debayle, a brutal right-wing dictator supported by the United States. In 1984, the Sandinistas sponsored free elections, winning two-thirds of the popular vote and making Daniel Ortega the freely elected socialist president of a democratic nation. These two threats to the interests of global capitalism sparked grave concern in the United States, as did a growing
drug trade from south of the border. As a result, the U.S. government conducted two invasions and lent support to several Latin American governments that used terror and murder to suppress popular insurgencies. Central America and the Caribbean Beginning in 1981, the Ronald Reagan administration opposed the Sandinistas by supporting the Contras, a rebel army fighting against the Sandinista government. It also provided aid to the military governments in Honduras, Guatemala, and El Salvador. President Reagan courted public support for his policies throughout the decade by warning that communist movements in Central America, if successful, would threaten Mexico and ultimately the United States. The Contras, however, did not limit themselves to attacking military targets. Concerned by the Contras' treatment of civilians, Congress banned any financial aid to the group, but the Reagan administration illegally circumvented this ban, resulting in a scandal known as the Iran-Contra affair. This scandal, which dominated much of Reagan's second term in office, ultimately led to the criminal conviction of several members of his administration. In El Salvador, a series of military juntas confronted a liberation movement led by the Farabundo Martí National Liberation Front (FMLN). The juntas responded to the movement with death squads that operated with government approval. The thousands of victims of these death squads included Roman Catholic archbishop Oscar Romero of San Salvador, four American nuns, and two American aid workers. Visitors to the capital were frequently told that body dump sites were a "must-see" attraction. Throughout the 1980's, nevertheless, the U.S. Department of State continued to issue optimistic reports on the supposed "progress of democracy" in El Salvador in order to justify continuing aid money to the Salvadoran government. Many of the military officers who were responsible for the death squads were trained in the School of the Americas in the United States. The United States also provided aid to neighboring Honduras in exchange for permission to train Contras there for the war in Nicaragua. A group of thirty Salvadoran nuns and religious laywomen fled to Honduras after the assassination of Archbishop Romero and subsequently disappeared. It was later revealed that the women had been arrested by the
Honduran secret police, tortured, and thrown alive from helicopters. John Negroponte, the ambassador to Honduras from 1981 to 1985, was alleged by some in Congress to have ignored Honduran human rights violations in order to promote the Contra war against Nicaragua. In Guatemala, Reagan supported an oppressive government that had fought a civil war against Mayan rebels for more than three decades and that killed 200,000 people before the war finally ended in 1996. A human rights commission reported that 93 percent of these killings were conducted by the Guatemalan army, with the direct and indirect support of the government of the United States. The commission used the word "genocide" to describe 626 massacres of entire Mayan villages during the 1980's. President Reagan dismissed reports of the attacks on villages and called the Guatemalan dictator, General Efraín Ríos Montt, "totally dedicated to democracy." In addition to helping preserve murderous governments, the United States employed such men as Manuel Noriega to provide it with anticommunist intelligence. Noriega, who was on the payroll of the Central Intelligence Agency (CIA) as early as the 1970's, was the military dictator of Panama, as well as a major international drug dealer. Finally deciding that he was more of a liability than an asset, the United States invaded Panama in 1989, arrested Noriega, and took him to Miami to stand trial. In the only other Latin American military action of the decade, the Americans also invaded the island nation of Grenada in 1983 out of fears that Cuba had infiltrated Grenada's revolutionary government. South America Uruguay, Argentina, and Chile were controlled by repressive military regimes until elections brought changes. The governments of all these countries used death squads, secret police, and torture to maintain power. In Argentina, the Dirty War ended with elections in 1983, after the country's defeat by the United Kingdom in the Falkland Islands War. In Chile, General Augusto Pinochet Ugarte agreed to elections in 1988 and was defeated. Bolivia's history during the 1980's was another example of the heavy-handed use of American power. Both the Carter and Reagan administrations sought to pressure the Bolivian government to eradicate its coca crop, but coca (used for making cocaine) had been cultivated by the indigenous population for
centuries. Eradication policies and the presence of U.S. troops created fierce anger; a bomb was detonated during the visit of Secretary of State George P. Shultz in 1988. Colombia’s relations with the United States were complicated by the growing drug trade and the presence of two Marxist-oriented, violent guerrilla groups: the Fuerzas Armadas Revolucionarias de Colombia—Ejército del Pueblo (Revolutionary Armed Forces of Colombia—People’s Army, or FARC-EP) and the Movimiento 19 de Abril (Nineteenth of April Movement, or M-19). M-19 attacked the Dominican embassy in 1980 and kept a number of ambassadors hostage until given safe passage to Cuba; the FARC-EP financed its operations with kidnapping, extortion, and drug sales. Although Bolivia and Colombia received some of the largest South American grants of U.S. military aid during the 1980’s, little progress could be shown in control of either the drug trade or the Colombian rebels. Impact The United States consistently stood in its own eyes and in its rhetoric as a champion of freedom in the 1980’s. However, the government believed the opposite of freedom to be communism, and it was willing to support any regime, no matter how brutal, that would oppose communism. By the same token, the U.S. government opposed populist and democratic movements that were anticapitalist. This rigidly anti-Marxist policy of the Western Hemisphere’s only superpower left Latin America increasingly poor and vulnerable to unrest, while the drug trade continued to grow. These dubious achievements cost hundreds of millions of dollars in military aid, scores of American dead (in the Panama and Grenada invasions), and hundreds of thousands of Latin American dead in the various “dirty wars” the United States fought by proxy during the decade. Further Reading
Didion, Joan. Salvador. New York: Simon & Schuster, 1983. Eyewitness account of the terror campaign during El Salvador’s Dirty War. Middlebrook, Kevin J., and Carlos Rico, eds. The United States and Latin America in the 1980’s: Contending Perspectives on a Decade of Crisis. Pittsburgh: University of Pittsburgh Press, 1986. Twenty-four papers offer a variety of academic perspectives on U.S. relations with Latin America. Musicant, Ivan. The Banana Wars: A History of United
States Military Intervention in Latin America from the Spanish-American War to the Invasion of Panama. New York: Macmillan, 1990. The last two chapters detail military operations in the invasions of Grenada and Panama. Winn, Peter. Americas: The Changing Face of Latin America and the Caribbean. New York: Pantheon, 1992. Describes the political, artistic, and religious evolution of Latin America. Timothy Frazer See also Bush, George H. W.; Cold War; Congress, U.S.; Elections in the United States, 1980; Foreign policy of the United States; Grenada invasion; IranContra affair; Mariel boatlift; Mexico and the United States; Miami Riot of 1980; North, Oliver; Panama invasion; Poindexter, John; Reagan, Ronald; Reagan Doctrine; Weinberger, Caspar.
■ Latinos Definition
Americans originating from Spanish-speaking countries
Immigration from Latin America to the United States increased greatly in the 1980’s. The economic and cultural impact of Latino immigrants led the U.S. media to dub the 1980’s “The Decade of the Hispanic.” The term “Latin American” was coined by the French to indicate Catholic colonies that spoke romance languages, in an effort to unite them against the Protestant colonies of England. However, the term came to indicate only the Spanish- and Portuguesespeaking countries of Central and South America. The term “Latino” is used to denote those who speak Spanish. Within the United States, “Latino” has come to be nearly synonymous with “Hispanic,” a term invented by the U.S. Census Bureau in the 1970’s to group Spanish-speaking people. In common usage, the distinction between the terms is ambiguous. Some people use “Hispanic” to indicate Spanish ancestry, while “Latino” indicates a commonality of the Spanish language among people whose ethnic origins are African, Native American, or from other parts of Europe. Sometimes, within the Latin American community, “Latino” is used to represent those of higher socioeconomic or educational status, while “Hispanic” distinguishes those who are more economically disadvantaged.
Immigrants from countries that were Portuguese, Dutch, French, or English colonies in Latin America are sometimes not counted as “Latinos” in the United States. For all intents and purposes, however, the terms are interchangeable. The Decade of the Hispanic
In the 1960’s, Latinos began to overtake Europeans at the major group immigrating to the United States as a result of liberalized immigration laws. By the 1980’s, immigration to the United States had reached its highest levels since the 1920’s, although the numbers were coming from Latin America and Asia, not Europe. The U.S. economy had a demand for cheap labor, while many developing countries were suffering from economic depressions and political upheavals. Compared to their predecessors in the early twentieth century, the new immigrants were younger and less educated. Where families had previously immigrated together, most of the Latino immigrants in the 1980’s were young, single men. Also, large numbers of immigrants came into the United States illegally, raising concerns about national security. Among the Latin Americans legally in residence in the United States at the time of 2000 census, nearly 30 percent arrived between 1980 and 1989; only 24 percent of those who arrived in those years had become naturalized citizens. This figure does not take into account undocumented immigrants. Latino immigrants came from South America and the Caribbean, although the largest group came from Mexico. Mexicans constituted the largest group of Latino immigrants. It is estimated that, in 1982, nearly three-quarters of the Mexican population were unemployed or underemployed. Malnutrition and infant mortality hit their highest levels in Mexico since the Great Depression. In 1980 and 1984, Fidel Castro gave Cubans permission to emigrate to the United States, a move he had previously made in 1965. Fleeing a communist government, Cuban immigrants tended to be more affluent and entrepreneurial, giving them an advantage in American society. Immigrants from other parts of the Caribbean and from Central America fled the violence and economic strife resulting from civil wars in their homelands, particularly Nicaragua, El Salvador, and Guatemala. The waves of refugees coming from these various countries followed similar patterns. First, when a revolution occurred, members of the fallen govern-
576
■
Latinos
ment and many upper-class citizens would flee to the United States, followed by relatives and friends. The next wave of immigrants would include middle-class citizens who had found life difficult under the new regimes or civil wars. Finally, as the economic situations in their homelands worsened, the lower classes would come to the United States seeking employment. Ironically, given the size and population of the continent, South America itself counts for the smallest percentage of Hispanic immigrants to the United States. They came for more traditional reasons, however, and tended to be better educated and economically middle class, leading to smoother integration into the United States. Latinidad and Protest Literature
Immigrants from Latin America included a wide range of races, ethnicities, and cultures. As they sought greater recognition as a minority in U.S. culture, they attempted to find a common cultural bond as Latinos, referred to as Latinidad. The entry of Hispanics into multicultural studies was accompanied by the rise of protest literature. Protest literature, and its accompanying social movement, rejected the paradigm of assimilation previously taken for granted in the American immigration process, insisting that Latinos had to retain their established culture. Previously, multicultural studies involved studying minorities from the establishment perspective; Hispanic scholars changed that by critiquing mainstream America from the Latino perspective, inspiring similar changes in how African Americans and other minorities engaged in multicultural research. In the 1980’s, the rise of hip-hop culture brought together not only Latinos from various backgrounds but also Latino and African American youths in a common countercultural movement.
Politics
Latino refugees tended to be more politically active than those who came seeking only jobs. Cuban immigrants became very active in trying to shape U.S. policy toward the Castro regime. Immigrants from Central America allied themselves with sympathetic Anglo-Americans to prevent deportation, change U.S. policies toward Central America, and educate the U.S. public about the situations in their homelands. The definition of a refugee in the U.S. Refugee Act of 1980 required only “a well-founded fear of
persecution," but the practical policy of the U.S. government was to recognize only refugees from countries that the administration considered "hostile" and to deny refugee status to immigrants from governments that the United States supported. By 1990, these refugees had achieved significant victories both in Congress and in the federal courts.
Economic Impact
Like previous immigrant populations, Latinos would settle into ethnic neighborhoods, called barrios. As with those previous populations, early immigrants who had already achieved financial success in the United States, sometimes known as "pioneers" or "godfathers," would sponsor other immigrants from their home countries. These immigrants would settle near their sponsors, a pattern that led not only to the formation of ethnic neighborhoods but also to large concentrations of people of particular ethnicities in certain cities. As immigrant populations moved into particular cities and neighborhoods, the native populations would relocate, often to the suburbs, contributing to the phenomenon of urban sprawl. This relocation fueled construction, which, in turn, created jobs for low-wage immigrant workers. Hispanics tended to fill the lowest-wage positions, such as sanitation and manual labor. Many employers would seek out illegal immigrants as undocumented labor in order to avoid employment taxes. Many Latino immigrants, however, came from middle- and upper-class backgrounds, entering the U.S. economy at an advantage. Some looked down on their less fortunate counterparts, especially those who came from different countries.
The Arts and Popular Entertainment
In the 1980's, Latino influence on the arts gradually moved from a protest standpoint to more mainstream acceptance, although some cultural critics argued that mainstreaming Latino art, literature, and music required weakening it. Like many minorities, Latinos sought acceptance through success in the entertainment industry. Meanwhile, U.S. media companies began to recognize the demographic trends and to integrate Latinos into their productions. Latino characters became more commonly represented in films and television series. Among the Latino actors who became stars during the 1980's were Edward James Olmos, Andy Garcia, and Jimmy Smits. When Ramón Gerardo Antonio Estévez became an actor, he took the stage
name of Martin Sheen, in honor of the famous television host Archbishop Fulton Sheen. Sheen became a star with the success of Apocalypse Now (1979). Two of his sons would also rise to fame in the 1980's, with Charlie Sheen retaining his father's stage name and Emilio Estevez keeping his birth name. Geraldo Rivera, an investigative reporter at ABC's 20/20, drew huge ratings with his report "The Elvis Cover Up." In 1987, he started his own daytime talk show, which ran for eleven years. Gloria Estefan became a household name as her band, Miami Sound Machine, had a number of hit songs in the 1980's and spun off several popular Latino artists, such as Jon Secada. In athletics, runner Alberto Salazar broke records at the 1981 New York Marathon (a race he won three years in a row) and the 1982 Boston Marathon, and he was part of the U.S. Olympic team in 1980 and 1984. Nancy Lopez was Ladies Professional Golf Association (LPGA) Player of the Year in 1985 and 1988. Jose Canseco was the American League's Rookie of the Year in 1986 and its Most Valuable Player (MVP) in 1988.
Religion
Latinos generally considered themselves Catholic, though increasing numbers were converting to Protestant denominations. Like any population, their relationships to the Catholic Church varied: As with many local cultures within the Catholic world, Latin Americans adopted indigenous religious practices that Rome considered superstitious. Politically, many adhered to the movement known as liberation theology, which was censured by the Vatican on several occasions in the 1980's. Nevertheless, the outward trappings of Catholicism, along with some Native American and African religious practices, were integral to Latino culture. Catholic immigrants from various parts of Europe had emphasized assimilation into American culture. They would start off in ethnic neighborhoods and parishes, then gradually shed aspects of their native culture and religion as each generation sought greater acceptance in America. By the 1960's, this trend had merged with interpretations of Vatican II to inspire rejection of the so-called Catholic ghetto, as traditional Catholic devotions, music, and artwork were cast aside by many non-Latino American Catholics. This was not the case with Latinos, who saw Catholicism, at least culturally, as an integral part of
Latinidad. While Catholics in the United States would support Latino immigrants (both legal and otherwise) in a spirit of brotherhood, anti-immigrant groups began reviving rhetoric that had been used against Eastern European, Irish, and Italian immigrants in the early twentieth century. Support for illegal immigrants tended to come from progressive Catholics who had made a point of shedding their own ethnic cultures since the 1960's. While they shared the quest for social and economic justice, they were uncomfortable with the outward practices of Latino Catholics. Conversely, more traditional and conservative U.S. Catholics, who might not have agreed with the political causes of refugees, welcomed their use of traditional prayers and devotions. Devotions such as that to the Virgin of Guadalupe came to be adopted as symbols for all American Catholics, not just Latinos. Many new Catholic lay movements that started or expanded in the wake of Vatican II became catalysts for the merging of the two Catholic populations through shared membership. Liberation theology, Opus Dei, and the Cursillo were all movements that started in Spanish-speaking countries and involved some elements of Spanish language and culture that were adopted by non-Latino Catholics.
Impact
The trends in immigration and cultural awareness greatly increased the role of Latino culture in the United States in the 1980's. The number of U.S. residents identifying themselves as Hispanic would double between 1980 and 2000. Acceptance of Latino cuisine by the wider community would result in salsa surpassing ketchup as the best-selling condiment in the United States by the early 1990's. Illegal immigration would remain a topic of national concern for decades. In 2003, the Census Bureau would announce that Hispanics had surpassed African Americans as the largest minority group in the United States.
Further Reading
Gonzalez, Juan. Harvest of Empire: A History of Latinos in America. New York: Penguin, 2001. A book dealing with the history and impact of Latino immigration to the United States. Gutierrez, David, ed. Columbia History of Latinos in the United States Since 1960. New York: Columbia University Press, 2004. A collection of articles presenting the impact of Latinos on the culture of the United States in the late twentieth century.
Mintzer, Richard. Latino Americans in Sports, Film, Music, and Government: Trailblazers. Broomall, Pa.: Mason Crest, 2005. A volume that focuses on the role of Latino celebrities in U.S. culture. Oboler, Suzanne, and Deena J. Gonzalez, eds. The Oxford Encyclopedia of Latinos and Latinas in the United States. 4 vols. New York: Oxford University Press, 2006. A four-volume encyclopedia covering the history and culture of Latinos in the United States. Stavans, Ilan, and Harold Augenbraum, eds. Encyclopedia Latina: History, Culture, and Society in the United States. 4 vols. Danbury, Conn.: Grolier, 2005. A four-volume encyclopedia dealing with the history and culture of Latinos in the United States. John C. Hathaway
See also
Business and the economy in the United States; El Niño; Film in the United States; Foreign policy of the United States; Immigration Reform and Control Act of 1986; Immigration to Canada; Immigration to the United States; L.A. Law; Latin America; Mariel boatlift; Mexico and the United States; Miami Vice; Music; Pop music; Television.
■ Lauper, Cyndi
Identification American New Wave singer, musician, and songwriter
Born June 22, 1953; Ozone Park, Queens, New York
Lauper's quirky, tuneful music gave mainstream exposure to New Wave eccentricity and became permanently associated with the American popular culture of the 1980's.
Cyndi Lauper became a rock star in 1983 at the comparatively late age of thirty. Before that, she had grown up in the working-class neighborhood of Ozone Park in Queens, New York, and had had a fitful musical career, including fringe appearances with several groups and a near-bankruptcy in 1982. In 1983, however, she released her first album, She's So Unusual, which became a major hit—rising to number four on the Billboard 200 chart—and launched Lauper to stardom. The album's first hit single, "Girls Just Want to Have Fun," gave voice, despite the hedonistic implications of its title, to a deeply felt sense of everyday life and of what it meant to be part of a working-class family: Such people were not, to quote the song's lyrics, "the fortunate ones." The song and the album used an accessible, if quirky, New Wave presentation to bring attitudes sometimes associated with the punk scene to a more mainstream audience. Lauper became one of the most popular and marketable of the stars that emerged from the initial years of MTV and music videos. Her unpretentious, girl-next-door style made her seem more accessible and approachable than many of her fellow rock stars. She also benefited from the video revolution, which made performers' appearances as important as their sound in promoting their music. Lauper wrote most of her own material and played a wide variety of instruments with skill. She had an interesting sense of visual style, and her performances appealed
Cyndi Lauper holds aloft her award for best female video at the 1984 MTV Video Music Awards. (AP/Wide World Photos)
to the eye as well as the ear. Lauper's second hit single, "Time After Time," was a very different song from her first. It was a haunting ballad about loyalty and steadfastness that indicated her considerable emotional range as a singer and songwriter. "She Bop," the third hit from the first album, was seen as having sexual overtones. The plaintive "All Through the Night," which revealed an unexpected lyricism, was also a hit, giving Lauper a total of four consecutive top-five hits from the same album, a record for a solo female artist. The album itself also sold at an unprecedented level for such an artist. In 1985, Lauper won the Grammy Award for Best New Artist. Lauper, who featured wrestling star Captain Lou Albano in many of her videos, became involved with the World Wrestling Federation and often made appearances at wrestling events. She also sang the theme song for the film The Goonies (1985) and participated in the USA for Africa charity song "We Are the World," which raised money for Ethiopian famine relief in 1985. In 1986, Lauper released True Colors, a more meditative and introspective album whose songs—such as the title track and "Change of Heart"—possessed both integrity and passion. The second album did not approach the sales levels of the first, but it established Lauper as not just a novelty act but a serious artist.
Impact
Lauper continued to have a loyal following, especially in the gay male community. By the late 1980's, Lauper's sales began to sag, but she will always be identified with the jaunty and spunky pop musical idiom of the decade.
Further Reading
Gaar, Gillian. She's a Rebel: The History of Women in Rock. Seattle: Seal Press, 1992. Kamin, Philip. Cyndi Lauper. New York: McGraw-Hill, 1986. Rettenmund, Matthew. Totally Awesome 80's: A Lexicon of the Music, Videos, Movies, TV Shows, Stars, and Trends of That Decadent Decade. New York: St. Martin's Griffin, 1996.
Nicholas Birns
See also Music; Music videos; New Wave music; Pop music; USA for Africa; Women in rock music; World Wrestling Federation.
■ Leg warmers
Definition Long, footless socks made of heavy knit or wool
Date 1980-1985
Although leg warmers had been used by dancers since the 1920's to keep their leg muscles warm and flexible, the formerly utilitarian accessory became a must-have fashion staple following the release of a series of dance- and fitness-centered movies in the early to mid-1980's.
The early 1980's ushered in a period in American cinema that featured films focusing on dance, aerobic fitness, or, in some cases, both. The most notable of these films were Fame (1980), Staying Alive (1983), Flashdance (1983), and Footloose (1984). All of these films feature both the element of dance and the presence of leg warmers. Leg warmers became a symbol of freedom and, in many cases, of the sexual desirability obtained through a dedication to physical fitness. By wearing leg warmers, a woman could advertise her affinity with the world of dance, thereby implying that she was sophisticated, physically fit, and desirable. Cinema was not the only arena that promoted physical activity. The music video for Olivia Newton-John's single "Physical" was set in an aerobics class held in a mirrored room. The singer and all of the video's participants were outfitted in sweatbands, leg warmers, leotards, and athletic shoes. "Physical" spent ten weeks at number one on the Billboard Hot 100 between 1981 and 1982. Although the song's lyrics refer to the transformation of a platonic relationship into a sexual one, its accompanying video transformed the title's meaning into a fitness reference. Only Newton-John's arguably flirtatious way with the camera provided a clue that another meaning was implied. Leg warmers became a way to identify with those who did not settle for the status quo, those who were capable of keeping their dreams alive. Jennifer Beals' character in Flashdance, probably the most influential of the films behind the leg-warmer trend, worked at a steel mill, but her occupation did not prevent her from living her dream of dancing, albeit late at night. As an article of everyday clothing, leg warmers could be worn over jeans or over tights with skirts and dresses. The accompanying footwear could be high-heeled pumps or athletic shoes, depending on the desired effect. One such effect was
bohemian chic, as the wearer could be said to possess a heightened sense of the physical self. By the middle of the decade, leg warmers were again relegated to the dance studio, as fitness trends turned toward more moderate types of exercise. It was no longer considered stylish to advertise one's ability to dance, and the dance movie itself had fallen out of fashion at the box office.
Impact
Leg warmers played a pivotal role in the changing ways that Americans, particularly women, saw their bodies. By putting on the long, loose-fitting socks, wearers could display not only a sense of fashion but also a commitment to health and physical attractiveness.
Further Reading
Calasibetta, Charlotte Mankey, and Phyllis Tortora, eds. The Fairchild Dictionary of Fashion. 3d ed. New York: Fairchild, 2003. Rose, Cynthia, ed. American Decades Primary Sources, 1980-1989. Farmington Hills, Mich.: Thomson Gale, 2004.
Dodie Marie Miller
See also
Aerobics; Ballet; Break dancing; Dance, popular; Fads; Fashions and clothing; Film in the United States; Power dressing.
■ Lemieux, Mario
Identification Canadian hockey player
Born October 5, 1965; Montreal, Canada
Star hockey player Lemieux was drafted by the Pittsburgh Penguins in 1984, becoming the team's savior: His addition to the roster generated enough fan interest in Pittsburgh to keep the Penguins from having to relocate.
Mario Lemieux was a phenomenon in the Quebec Major Junior Hockey League, scoring 133 goals and 282 points in seventy games. Lemieux had a long reach and impressive speed, which contributed to his scoring ability. In addition, he evinced that rare mental affinity for the game that sets sports superstars apart, enabling him intuitively to make split-second decisions in the heat of competition. In 1984, Lemieux decided to enter the National Hockey League (NHL) draft. The right to make the first draft pick was held by the team with the worst overall record in the previous year: the Pittsburgh Penguins. The Penguins had reached the play-offs in 1982 but had suffered from financial woes in the following two seasons and were rumored to be contemplating a move to a different city. The franchise leaped at the chance to add Lemieux to its roster. Lemieux was a success from the beginning; he scored on his first shot on his first shift as a Penguin. His superstar playing and the media excitement it generated energized fans, eventually saving the team from having to move. Lemieux brought a winning presence to a team that had been in shambles. Still, it took several years for Lemieux and the Penguins to reach the play-offs. Lemieux became the heart and soul of the Pittsburgh Penguins, and his astonishing play earned him the nickname "Super Mario." By the 1988-1989 NHL season,
Mario Lemieux in March, 1984, when he played for the Laval Voisins. Lemieux was drafted into the National Hockey League by the Pittsburgh Penguins later that year. (AP/Wide World Photos)
Lemieux was a superstar in the league, and he recorded his greatest regular-season performance that year, scoring 85 goals and finishing with 199 points. It was also in that season that Lemieux finally led the Penguins into the play-offs, although they were eventually eliminated by the Philadelphia Flyers.
Impact
During the 1980's, Mario Lemieux rose to prominence as a major star in the NHL, meriting comparisons to fellow great Wayne Gretzky. In the decade's final full season, he led his team to its first play-off appearance in seven years. As the 1990's began, the team was poised for greatness, and it would go on to win two Stanley Cup titles before Lemieux's career was shortened by a battle with Hodgkin's lymphoma.
Further Reading
Bynum, Mike, et al., eds. Mario Lemieux: Best There Ever Was. Toronto: Macmillan Canada, 1997. Christopher, Matt. On the Ice with Mario Lemieux. New York: Little, Brown, 2002. McKinley, Michael. The Magnificent One: The Story of Mario Lemieux. New York: Grosset & Dunlap, 2002. Rossiter, Sean. Mario Lemieux. Vancouver: Greystone Books, 2001. Timothy C. Hemmis See also
Gretzky, Wayne; Hockey; Sports.
■ LeMond, Greg
Identification American professional cyclist and Tour de France winner in 1986, 1989, and 1990
Born June 26, 1961; Lakewood, California
LeMond garnered more attention for his sport in the United States and worldwide than anyone else in the 1980's. He was an innovator in a sport ruled by tradition.
Greg LeMond established his reputation within the sport of cycling early in the 1980's, winning the World Championships in 1983, in just his third year as a professional, after finishing second the previous year. LeMond's fame centered on his role in two of the most sensational stories in the history of cycling. He first came to prominence in the Tour de France when, in 1985, he helped his teammate Bernard Hinault win his fifth Tour de France. At one stage in the race, it seemed that Hinault was unable to
maintain the pace, and LeMond found himself in a winning position. On directions from his team, however, LeMond slowed to a near stop to wait for Hinault to catch up. As a result of that sacrifice, Hinault was able to win the overall race, while LeMond finished a close second. LeMond reported that he waited in reliance on a promise from Hinault and the team that the next year, Hinault would provide his full support to LeMond. LeMond therefore felt betrayed the following year, when the strongest challenge to his race lead came from Hinault. LeMond held off that challenge, however, and won the first of his three Tours de France. He severed all ties with Hinault and his former team at the end of the season. The second defining challenge of LeMond's career began just two months before the start of the 1987 Tour de France, when he was struck in the chest by a shotgun blast from his brother-in-law in a hunting accident. Few expected LeMond ever to race again, and he missed the 1987 and 1988 racing seasons while he recovered. When he entered the 1989 Tour de France, LeMond was not listed among the favorites to win. Win he did, however, achieving this comeback victory in dramatic fashion. He trailed the race leader, Laurent Fignon, by fifty seconds going into the final day of the race. LeMond stunned the hometown crowds by beating the Frenchman by fifty-eight seconds that day, winning the Tour de France by the smallest-ever margin of victory: just eight seconds.
Impact
Participating as an American in a European-dominated sport set LeMond apart, and he brought unprecedented attention to cycling in the United States. His career was marked by a willingness to experiment in crucial racing situations with innovations in aerodynamic bicycle construction, as well as a single-minded focus on the Tour de France to the exclusion of other races. By doing what it took, both physically and technically, to win cycling's greatest race, LeMond revolutionized the sport.
Further Reading
LeMond, Greg, and Kent Gordis. Greg LeMond’s Complete Book of Bicycling. New York: Perigee Books, 1990. Marks, John. “Se faire naturaliser cycliste: The Tour and Its Non-French Competitors.” In The Tour de France, 1903-2003: A Century of Sporting Structures, Meanings, and Values, edited by Hugh Dauncey and Geoff Hare. London: Frank Cass, 2003.
Thompson, Christopher S. The Tour de France: A Cultural History. Berkeley: University of California Press, 2006.
Margot Irvine and John P. Koch
See also
Hobbies and recreation; Sports.
■ Lennon, John
Identification British musician and songwriter
Born October 9, 1940; Liverpool, England
Died December 8, 1980; New York City
The murder of former Beatle John Lennon in December, 1980, became an emblem for the struggle of many political progressives, who saw him as a standard bearer of their movement. His music outlived him, continuing to influence a wide range of listeners and fellow musicians throughout the decade.
In the 1960's, John Lennon achieved unparalleled fame as a member of the Beatles, a band whose impact on popular culture was enormous. After the Beatles' breakup, Lennon, with his wife Yoko Ono, launched a solo career characterized by a raw style and radical political content. While alienating many Beatles fans, Lennon's solo albums led numerous members of the counterculture to embrace Lennon as a major voice of the political left. After battles with depression, drinking, and deportation proceedings, Lennon withdrew from the music business in 1976, living an isolated life on his upstate New York farm or in his apartment in the Dakota Building (better known as the Dakota) in New York City. By 1980, however, Lennon emerged rejuvenated, and in June he sailed to Bermuda, a trip that would reawaken his creative energies. In Bermuda, he listened to contemporary music for the first time in years, and he discovered in the Pretenders, Lene Lovich, and the B-52's the spirit of his solo music. He even enjoyed Paul McCartney's new album, McCartney II (1980). In response, Lennon began to compose new songs. When he returned home, Lennon was anxious to enter a studio. On August 8, he and Ono assembled a studio band in New York's Hit Factory. Over the next nine days, they recorded all of Double Fantasy (1980), an ambitious album, and most of Milk and Honey (1984). Lennon had mixed feelings about Double Fantasy, which was released on November 17, but to his surprise it
An aerial view of crowds gathering by the Dakota Building to mourn John Lennon’s death a week after he was killed there in December, 1980. (AP/Wide World Photos)
reached the top ten in the Billboard 200 chart by December 1. Lennon and Ono were elated and started work on "Thin Ice," a Yoko Ono single. On the morning of December 8, Annie Leibovitz of Rolling Stone magazine photographed Lennon and Ono. The shoot included the later-famous portrait of Lennon, nude, curled up in a fetal position with Ono. Next, Lennon and Ono gave an interview for RKO Radio. Afterward, the couple left for the Record Plant, where they worked on "Thin Ice." When they returned to the Dakota at 11:00 p.m., a man stepped from the shadows of the arched entranceway and fired five shots at Lennon with a .38 caliber revolver. The man, Mark David Chapman, was later diagnosed as a paranoid schizophrenic. Within minutes, Lennon died of massive blood loss.
Impact
As the shocking news broke, crowds formed outside the Dakota, resulting in a continuous vigil that climaxed on December 14, 1980, when 400,000 people gathered in Central Park to mourn Lennon's passing, an act echoed across the planet. Lennon remained a presence across the decade. Double Fantasy reached number one and won a Grammy. The posthumous
Milk and Honey, Live in New York City (1986), and Menlove Ave. (1986) sold well. Lennon influenced many musicians popular in the 1980's, including Jello Biafra, Bono, Peter Gabriel, and Iggy Pop. Lennon's political activism also served as an inspiration. Admirers viewed him as a martyr to the causes of social and personal liberation. His face became a revolutionary icon and his song "Imagine" an anthem of planetary transformation.
Further Reading
Rosen, Robert. Nowhere Man: The Final Days of John Lennon. New York: Soft Skull, 2000. Seaman, Frederic. The Last Days of John Lennon: A Personal Memoir. New York: Birch Lane, 1991.
John Nizalowski
See also Crime; Music; Pop music; Rock and Roll Hall of Fame.
■ Leonard, Sugar Ray
Identification American professional boxer
Born May 17, 1956; Wilmington, North Carolina
Leonard was one of the top professional boxers of the 1980’s, winning titles in five different weight classes and fighting many of the best boxers of the era. After winning a gold medal at the 1976 Summer Olympics in Montreal, Sugar Ray Leonard began his professional boxing career the following year under the guidance of legendary trainer Angelo Dundee. He won twenty-five consecutive bouts before defeating Wilfred Benitez for the World Boxing Council (WBC) welterweight title in November of 1979. After defending the title once, he lost it by decision to Roberto Duran, the former world lightweight champion, in June of 1980, but regained it in a rematch the
Sugar Ray Leonard, right, faces Thomas Hearns in a battle for the world welterweight championship in September, 1981. (AP/Wide World Photos)
following November. While Leonard attempted to battle head-to-head with Duran in the first fight, he boxed far more skillfully in the second, frustrating his opponent so much that Duran finally waved the fight off in the eighth round, famously declaring "no más." The win over Benitez and the two fights with Duran brought Leonard to the top years of his boxing career. After moving up in weight in June of 1981 to defeat Ayub Kalule for the World Boxing Association (WBA) light middleweight title, Leonard faced undefeated knockout artist Thomas Hearns in September for the combined WBC and WBA welterweight titles. Leonard defeated Hearns by a technical knockout (TKO) in the fourteenth round; the bout was later chosen by The Ring as the magazine's 1981 Fight of the Year. After one defense of the combined title in February of 1982, and while preparing for his next fight in May, Leonard suffered a detached retina in his left eye. Although surgery to repair his eye was successful, Leonard announced the first of his several retirements from the ring in November of 1982. Following a brief return to the ring in 1984 and further eye surgery, Leonard again retired, but he was enticed out of retirement a second time to fight middleweight champion Marvin Hagler in April of 1987. At the time of the fight, Hagler, who had a record of sixty-two wins with fifty-two knockouts, was favored three to one. The bout ended in a controversial decision win for Leonard. The following year, Leonard defeated Donny Lalonde to win two WBC titles: the light heavyweight and the super middleweight. In 1989, he fought consecutive bouts with his earlier rivals Thomas Hearns and Roberto Duran, fighting to a controversial draw with Hearns and easily defeating Duran on points. After losing two comeback efforts in the 1990's, he retired for good in 1997.
Impact
During the 1980's, Leonard fought and beat the best fighters in his weight divisions: Benitez, Duran, Hearns, and Hagler. Although several of his fights involved controversial decisions, he was clearly one of the dominant figures in the sport throughout the decade.
Further Reading
McIlvanney, Hugh. "Sugar Ray Leonard v. Thomas Hearns, Las Vegas, 16 September 1981." In The Hardest Game: McIlvanney on Boxing. Updated ed. New York: Contemporary Books, 2001. Myler, Patrick. "Sugar Ray Leonard." In A Century of Boxing Greats: Inside the Ring with the Hundred Best Boxers. London: Robson Books, 1999. Toperoff, Sam. Sugar Ray Leonard and Other Noble Warriors. New York: McGraw-Hill, 1987.
Scott Wright
See also
African Americans; Boxing; Sports.
■ Letterman, David
Identification Late-night television talk-show host and comedian
Born April 12, 1947; Indianapolis, Indiana
Originally regarded as a kind of young person's Johnny Carson, Letterman quickly developed his own style of comedy, establishing a format that would be widely imitated by the late-night television hosts of the next generation.
When David Letterman debuted as the host of NBC's Late Night with David Letterman on February 1, 1982, he faced the dual challenge of maintaining the entertainment standards of The Tonight Show with Johnny Carson (which he immediately followed in NBC's programming schedule) and of developing an audience for late-night talk-show comedy among a generation assumed to be too young to embrace Johnny Carson as its own. As if sensing the absurdity of being expected to realize two such contradictory goals simultaneously, Letterman immediately established as the tone of his show the relaxed, self-referentially playful irony that he had honed during the 1970's as a comedy writer and a stand-up comedian. Coincidentally, the growing influence of postmodernist studies was beginning to popularize just such self-reflexive humor among his target audience: college students. Like Carson, Letterman involved his sidekick, the show's bandleader Paul Shaffer, in both his banter and his routines. Unlike Carson or any other talk-show host up to that time, Letterman frequently utilized the members of his crew, making unlikely stars of his writers, cue-card holders, and cameramen. He also developed the comic potential of the mundane, getting creative mileage (and ever-increasing ratings) from activities such as reading viewer mail; interviewing unsuspecting Manhattan store owners; and dropping bowling balls, watermelons, and other otherwise uninteresting objects off the tops of tall buildings. Large portions of Letterman's show were scripted
and thoroughly rehearsed, but the interviews with the show's guests were not. This situation—given Letterman's willingness to needle even the most famous celebrities—led to some notoriously tense moments, especially when the celebrities did not share Letterman's sense of humor. (The actress Nastassja Kinski, for example, vowed never to return to his show after her on-air hairdo served as the butt of his unwelcome jokes.) This element of unpredictability, however, only made the show more popular, giving viewers reared on television's more pat conventions a fresh reason to tune in. By forcing guests out of their familiar personas, Letterman provoked some of television's earliest "real" moments, thus laying, if inadvertently, the groundwork for the "reality television" of the twenty-first century.
Impact
Despite adopting the trappings of The Tonight Show, Letterman adapted them to his own comic purposes, adding original routines and elements that would become standards among television talk shows and exemplifying the possibility of innovation within even the most well-established of formats.
Further Reading
Dunn, Brad. When They Were Twenty-Two: One Hundred Famous People at the Turning Point in Their Lives. Riverside, N.J.: Andrews McMeel, 2006. Letterman, David. Late Night with David Letterman. New York: Random House, 1985. _______. The Late Night with David Letterman Book of Top Ten Lists. Edited by Leslie Wells. New York: Simon & Schuster, 1990. Arsenio Orteza See also
Comedians; Talk shows; Television.
■ Lévesque, René
Identification French Canadian premier of Quebec from 1976 to 1985
Born August 24, 1922; Campbellton, New Brunswick, Canada
Died November 1, 1987; Île des Sœurs, Quebec, Canada
René Lévesque led Quebec's sovereignist movement and forced the Canadian government to try to address the concerns of the Québécois.
René Lévesque. (Library of Congress)
René Lévesque was a Quebec nationalist activist and politician. After uniting several groups that supported independence for the Canadian province to form the Parti Québécois, he led this political party to victory in Quebec’s election in 1976. Soon after coming to power, his government passed Bill 101, which made French the official language of Quebec. In 1980, Premier Lévesque held a referendum on independence from Canada. His approach to achieving independence, however, was somewhat complex. It was known as sovereignty-association. This term meant that he sought popular approval from his people to conduct negotiations with Canada regarding independence. If he were successful, another referendum on the agreement reached in those negotiations would follow. Though the first referendum failed to obtain public support, it led Canadian prime minister Pierre Trudeau to attempt to satisfy the demands of Quebec without granting the province sovereignty. Even during the referendum campaign, Trudeau
told the Québécois that he would act to alter the Canadian constitution to recognize Quebec's distinctive position within Canada. Trudeau held discussions with the premiers of Canada's other provinces about how to pursue such constitutional changes, which were one portion of a comprehensive modification of the constitution. The Canadian constitution was to be patriated, meaning that the United Kingdom, which still technically controlled the Canadian constitution, was finally going to grant full sovereignty to its former colony, as well as granting it control over its own constitution. In the process, the constitution would be altered. Lévesque, however, was dissatisfied with the specific constitutional changes put forth by Trudeau, and he and other provincial leaders developed an alternative package of reforms. In the midst of the constitutional debate, Quebec held a provincial election in the spring of 1981. The Parti Québécois was victorious again, and Lévesque was reelected as premier. As the crisis over constitutional reform continued, however, Quebec's leader became more isolated. In November, 1981, the Canadian government reached an agreement with all the provincial leaders except for Lévesque, who argued that the package refused fully to recognize Quebec's special rights. In particular, Lévesque opposed the constitutional reforms because they failed to provide Quebec with veto power in federal affairs and lacked recognition of the province's unique needs and culture. The following year, the Canadian Parliament passed the Constitution Act, 1982, and the United Kingdom passed the corresponding Canada Act of 1982, amending and patriating the Canadian constitution. The Constitution Act, however, was adopted without the formal approval of Quebec; Lévesque even refused to send a delegation to the official celebration of its passage. The Quebec leader's political position became even tougher as a result of a provincial budget crisis. Lévesque's government responded by reducing public sector salaries in 1982, and he lost popularity as important constituencies such as bureaucrats and teachers disapproved of these cuts. In 1985, following Lévesque's retirement from politics and return to journalism, the Parti Québécois was defeated by the Liberal Party.
Impact
René Lévesque created a viable political party in Quebec based almost entirely on support
for the province's independence. By leading the Parti Québécois to consecutive electoral victories and holding a referendum on sovereignty, he served as a catalyst for Canadian constitutional reform and for greater attention to federal-provincial relations.
Further Reading
Bothwell, Robert. Canada and Quebec: One Country, Two Histories. Vancouver: UBC Press, 1995. Fraser, Graham. René Lévesque and the Parti Québécois in Power. Toronto: Macmillan, 1984. Morton, Desmond. A Short History of Canada. 2d ed. Toronto: McClelland & Stewart, 1984.
Kevin L. Brennan
See also Bourassa, Robert; Canada Act of 1982; Canadian Charter of Rights and Freedoms; Meech Lake Accord; Minorities in Canada; Quebec English sign ban; Quebec referendum of 1980; Trudeau, Pierre.
■ Lewis, Carl
Identification African American track star
Born July 1, 1961; Birmingham, Alabama
Lewis was the most outstanding male track-and-field star not only in the 1980's but perhaps in all of track-and-field history. He equaled Jesse Owens's 1936 Olympic performance, and he won medals in four Olympic Games.
After enrolling at the University of Houston in 1979, Carl Lewis and his sister Carol qualified for the 1980 Olympics in Moscow, but because of the American boycott of the games, the eighteen-year-old Carl did not get the opportunity to compete against the world's best athletes in the long jump and the 4 × 100-meter relay. Competing for the University of Houston in 1981, he won the first of his six National Collegiate Athletic Association (NCAA) titles, ran the world's fastest 100-meter race, and received the James E. Sullivan Award as the top American amateur athlete. In 1983, at the first World Championships sponsored by the International Amateur Athletic Federation (IAAF), Lewis won the 100-meter dash and the long-jump competition and was a member of the winning 4 × 100-meter relay team. The following year, Lewis qualified to compete in four events at the 1984 Summer Olympic Games in
Los Angeles: the long jump, 100-meter dash, 200-meter dash, and 4 × 100-meter relay race. He declared his intention to equal Jesse Owens's 1936 feat of winning four Olympic gold medals. He met this goal. Lewis won the 100-meter dash with a time of 9.99 seconds; he won the long jump; he set a new Olympic record for the 200-meter dash at 19.80 seconds; and he anchored the winning 4 × 100-meter relay race. His performance was so outstanding that it may have led to his being drafted that year by the National Basketball Association's Chicago Bulls and by the National Football League's Dallas Cowboys. After the 1984 Olympic Games, Ben Johnson became Lewis's chief competitor, and in the 1987 World Championships in Rome, Johnson won the 100-meter dash. Lewis's suggestion that Johnson was using drugs proved to be accurate, however, for in the 1988 Olympic Games, Johnson tested positive for steroids after again defeating Lewis. As a result of the test, Johnson lost his gold medal, which was then awarded to Lewis, who also received a silver medal in the 200-meter event and a gold medal in the long jump.
Impact
Track and Field News named Carl Lewis the Athlete of the Year in 1982, 1983, and 1984; Sports Illustrated named him the Olympian of the Century; and the International Olympic Committee deemed him the Sportsman of the Century. He set world records in the 100-meter dash and, with his team members, in the 4 × 100-meter and 4 × 200-meter relay races. He perhaps also bested Bob Beamon's record in the long jump, but a judge's error prevented him from achieving that feat; he did win sixty-five consecutive long-jump events. Lewis played a large part in establishing track and field as a professional sport. Because of his aloof manner and lack of humility, he was not a favorite with fans or with his colleagues, but his achievements in the 1980's were spectacular.
Further Reading
Klots, Steve. Carl Lewis. Philadelphia: Chelsea House, 2001. Lewis, Carl, with Jeffrey Marx. Inside Track: My Professional Life in Amateur Track and Field. New York: Simon & Schuster, 1990.
Thomas L. Erskine
See also African Americans; Olympic Games of 1980; Olympic Games of 1984; Olympic Games of 1988; Sports.
■ Liberalism in U.S. politics
Definition A political ideology that tends to support progress, civil rights and liberties, reform, social justice, and using the power of the federal government to improve the general welfare of the nation
During the 1980’s, liberalism was challenged, criticized, and often defeated by the conservative ideologies of the Reagan and Bush administrations. The liberal Democratic Party retained control of the House of Representatives, but it faced Republican control of the Senate from 1981 to 1986, the development of a conservative majority in the Supreme Court, and the growing influence of the Religious Right in American politics, especially among southern whites. Republican victories in the 1980, 1984, and 1988 presidential elections influenced leading Democratic politicians, especially presidential candidates, to avoid identifying themselves with liberalism. President Ronald Reagan’s landslide reelection in 1984 motivated some Democrats, such as future president Bill Clinton and future vice president Al Gore, to establish the Democratic Leadership Council (DLC) in 1985. The DLC intended to develop more moderate policy positions in order to make Democratic presidential candidates more successful in future elections. Liberalism in U.S. politics is often associated with the domestic policy legacy of the New Deal of the 1930’s, especially its support of social welfare programs and the interests of labor unions, and major political movements and events of the 1960’s, especially concerning civil rights, Supreme Court decisions on civil liberties, feminism, and opposition to the Vietnam War. In general, liberals favor more government intervention in order to reduce economic inequality; achieve greater racial, ethnic, and gender diversity in education and employment; and improve environmental protection, consumer rights, and public health and safety through regulation. On social issues, such as abortion, school prayer, and crime, liberals usually favor protecting individual liberty and privacy, as well as stronger due process rights for those accused of crimes. During the 1970’s and 1980’s, the enduring controversy of the Vietnam War led many liberals to oppose high defense spending, new nuclear weapons, and an aggressively anticommunist foreign policy, especially in Latin America.
Liberalism and Two-Party Politics
Senator Ted Kennedy of Massachusetts unsuccessfully ran for the Democratic presidential nomination of 1980. Kennedy ran on a liberal platform, partly because he believed President Jimmy Carter was too moderate and cautious in his domestic policy agenda, especially on poverty, unemployment, and health care. In the 1980 presidential election, Ronald Reagan, the Republican nominee and former governor of California, decisively defeated Carter with an explicitly and comprehensively conservative platform that included sharp cuts in taxes, domestic spending, and regulations on business; higher defense spending; a more aggressive Cold War foreign policy; and opposition to liberal Supreme Court decisions on social issues, especially abortion and school prayer. Furthermore, Republicans won control of the Senate in 1980 and soon developed a bipartisan, conservative majority by aligning with Southern Democrats in the House of Representatives. During the 1980's, liberal Democrats in Congress and liberal interest groups, such as labor unions, the American Civil Liberties Union (ACLU), and the National Organization for Women (NOW), mostly fought defensive actions to defeat or dilute conservative policies. With the support of Republicans and conservative Southern Democrats in Congress, Reagan achieved his major conservative policy goals of tax cuts, deregulation, higher defense spending, and appointing conservatives to federal courts. Led by Speaker of the House Tip O'Neill, however, liberal Democrats defeated Reagan's attempts to make bigger cuts in domestic spending, eliminate two cabinet departments, reduce Social Security benefits, and transfer more anti-poverty responsibilities to the states. With the Democrats winning control of the Senate in 1986, liberals rallied to defeat Reagan's nomination of Robert H. Bork, an outspoken conservative, to the Supreme Court in 1987.
Liberalism and Foreign Policy
In foreign policy, liberals in Congress, the media, interest groups, and think tanks opposed Reagan’s first-term emphasis on higher defense spending, as well as his initial refusal to negotiate new nuclear arms control treaties with the Soviet Union and his deployment of new American missiles in NATO countries. Liberals also tended to side with the United Nations when disagreements arose between that body and the Reagan administration. While most liberals wanted to
revive a 1970's-style policy of détente with the Soviet Union, they also wanted U.S. foreign policy to improve human rights, public health, and economic conditions in developing nations; end or reduce American military aid to repressive anticommunist dictatorships and guerrilla movements; emphasize international efforts to protect the environment and fight the AIDS epidemic; and develop a cooperative diplomatic relationship with the Sandinista government of Nicaragua by rejecting Reagan's support of the Contra rebels.
The 1988 Presidential Election
Shortly before the 1988 presidential campaign began, some liberals were optimistic, because public opinion polls and the results of the 1986 midterm elections seemed to indicate that more Americans supported liberal policy positions on the environment, education, health care, and defense spending and would be willing to elect a Democratic president in 1988. In the 1988 presidential election, however, Republican vice president George H. W. Bush decisively defeated Michael Dukakis, the Democratic governor of Massachusetts. Bush identified Dukakis and liberalism with unpopular positions, such as softness on crime, weakness on defense, high taxes, and hostility to traditional American values on social issues. Indeed, Bush successfully turned the word “liberal” itself into an accusation or an insult, thereby setting the tone for Democratic politics for years.
Impact
Liberalism in U.S. politics during the 1980's mostly criticized and opposed the conservative policies and issue positions of the Reagan administration and the Republican Party. Its few political victories were limited to the defeat or dilution of several conservative objectives, including attempts to end abortion rights and affirmative action, to overthrow the Sandinista government of Nicaragua, and to enact additional tax cuts. By the end of the 1980's, Democratic politicians and liberal activists were discussing and considering how to define and communicate liberalism in order to make it more attractive to American voters.
Further Reading
Derbyshire, Ian. Politics in the United States from Carter to Bush. New York: Chambers, 1990. Broad survey of American politics from 1976 to 1989. Germond, Jack W., and Jules Witcover. Whose Broad Stripes and Bright Stars? New York: Warner Books, 1989. Detailed study of the 1988 presidential election that includes an analysis of the ideological and policy conflicts between liberalism and conservatism during the 1980's.
Rothenberg, Randall. The Neo-liberals. New York: Simon & Schuster, 1984. Compares New Deal liberalism with the emerging "neo-liberalism" of the 1980's, which emphasized free trade, high technology, and economic growth.
Sean J. Savage
See also Abortion; Affirmative action; Bork, Robert H.; Bush, George H. W.; Cold War; Congress, U.S.; Conservatism in U.S. politics; Elections in the United States, midterm; Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988; Environmental movement; Feminism; Foreign policy of the United States; Iran-Contra affair; O'Neill, Tip; Reagan, Ronald; Reagan Revolution; Social Security reform; Soviet Union and North America; Unions; Welfare.
■ Libya bombing
The Event The United States bombs military facilities in Libya in retaliation for a terrorist attack in Berlin
Date April 15, 1986
Place Tripoli and Benghazi, Libya, North Africa
The military strike against Libya and an earlier naval action, both intended to curb terrorism, generated support for the Reagan administration at home but were widely condemned abroad. The bombing established a pattern of unilateral American military intervention in the Middle East.
At 2:00 a.m. on April 15, 1986, British-based American F-111 bombers launched surprise attacks on Aziziya Barracks in Tripoli, the residence of Libya's head of state, Muammar al-Qaddafi. They also struck a suspected terrorist training facility at the al-Jamahiriya barracks in Benghazi, another alleged training facility in Tripoli, and several airfields. The attacks constituted the largest U.S. air raid since the Vietnam War. Sixty-three people, reportedly including Qaddafi's adopted daughter, died in the attacks, and two of his sons were injured. The United States justified the raid as retaliation for the April 5 bombing of a Berlin discotheque frequented by American servicemen. The Berlin bombing in turn had represented retribution for an
American naval engagement in Libya's Gulf of Sidra in late March, in which fifty-six Libyan sailors were killed and coastal radar installations were destroyed. The Libya bombing occurred as the Cold War threat of the Soviet Union was on the wane, and the Reagan administration sought a new military threat to replace it. The administration focused on international terrorism. Qaddafi, who actively supported both the Palestine Liberation Organization (PLO) and the Irish Republican Army (IRA), was an obvious target. A majority of Americans approved of the action, but, except in Britain and Israel, international reaction was strongly negative. The United Nations Security Council attempted to issue a vote of condemnation, but the United States vetoed it. Particularly disturbing, from an international perspective, was the use of U.S. military force in the attempted assassination of a head of state.
Impact
Bombing Libya solved nothing. Qaddafi, demonized in the American press as a madman, emerged more popular in his own country and in the Arab world. In subsequent years, on the other hand, he modified Libya's international relations in response to economic sanctions. In the short run, the attack strengthened Libya's commitment to terrorism. Over the next two years, the country clandestinely built up its capacity to manufacture chemical weapons, none of which was ever deployed. On December 21, 1988, a terrorist bomb destroyed Pan American Flight 103 over Lockerbie, Scotland. A Scottish court subsequently convicted a Libyan national of planting the bomb in retaliation for the 1986 raids on Libya. Although some analysts believe Iran actually perpetrated this act in response to the destruction of an Iranian commercial airliner by the USS Vincennes earlier in the year, Libyan outrage and willingness to retaliate with terrorist acts were real enough. Contemporary British commentators argued that the 1986 Libya bombing represented a new course of unilateral action that, if carried to its logical conclusion, threatened to embroil the entire Middle East in a generalized conflict. Subsequent events did nothing to contradict this interpretation.
Further Reading
Davis, Brian. Qaddafi, Terrorism, and the Origins of the U.S. Attack on Libya. New York: Praeger, 1996. Kaldor, Mary, and Paul Anderson. Mad Dogs: The U.S. Raids on Libya. London: Pluto Press, 1986.
St. John, Ronald Bruce. Libya and the United States: Two Centuries of Strife. Philadelphia: University of Pennsylvania Press, 2002.
Martha A. Sherwood
See also
Foreign policy of the United States; Middle East and North America; Pan Am Flight 103 bombing; Reagan, Ronald; Terrorism; USS Vincennes incident; West Berlin discotheque bombing.
■ Literature in Canada
Definition Drama, prose, and poetry by Canadian authors
Canadian literature consolidated its strengths during the 1980's, a decade of remarkable achievement. At the start of the decade, only a few Canadian writers were known worldwide. The nation's writers were accorded growing acclaim, however, auguring a decisive expansion of the literary arena during the decade.
Canadian literature in the 1980's was both similar to and different from its American counterpart. Like 1980's American literature, the Canadian field featured a number of disparate generations and regional influences; in both countries, women writers moved to the forefront of literary culture. Canada, though, lacked the hip young writers and the multicultural authors who emerged in the United States during the 1980's; these trends would come to Canada in the 1990's. On the other hand, Canada preserved the tradition of the mainstream novel more than did its neighbor to the south.
Established Figures
At the start of the 1980's, only a few Canadian writers were known outside Canada. Mordecai Richler, a Jewish Canadian novelist from Montreal, was widely popular for his satiric novels, which were lauded by such prominent U.S. writers as Joseph Heller. Robertson Davies was well known for the Deptford Trilogy, published in the 1970's. The Rebel Angels (1981), the first novel of his Cornish Trilogy, was reviewed across the English-speaking world upon its publication. Richler's amiable hijinks and Davies' deep immersion in magic and mysticism drew strength from their authors' local Canadian context. The other widely known Canadian writer at the start of the decade, Margaret Atwood, was a key figure in the Canadian nationalist literary renaissance of the
late 1960's and the 1970's. Atwood demonstrated her versatility in 1982 by editing The New Oxford Book of Canadian Verse, which covered the nation's poetry from its eighteenth century beginnings to recent experimentation. She also drew recognition for her 1985 novel The Handmaid's Tale. Set in a futuristic America in which a repressive Puritan government has reconsigned women to traditional roles, the book portrayed a nightmarish vision of a harsh and patriarchal United States that enunciated a tacitly Canadian critique of U.S. cultural arrogance. Two women short-story writers attained worldwide literary prominence in the 1980's. Mavis Gallant lived in Paris but still wrote poignantly and wittily of her native Canada, as well as of wider worlds. Alice Munro's remarkable career gained full traction in the 1980's. Munro's stories of provincial life in southwestern Ontario are credited with unleashing the artistic potential of the short-story form. Meanwhile, two of Canada's most acclaimed senior novelists published their final works of fiction in the 1980's. Morley Callaghan's A Time for Judas (1983) and Hugh MacLennan's Voices in Time (1980) capped these venerable writers' careers and continued to demonstrate their grasp of character and moral situation. Like Davies, Hugh Hood was influenced by the tradition of the English novel, though with a distinctively Canadian spin. Unlike Davies, Hood did not write in the readily digestible form of the trilogy, instead attempting a twelve-volume mega-novel, The New Age/Le Nouveau Siècle, which sought to span the twentieth century in Canada. Four of the strongest installments of this saga were published in the 1980's, taking Hood's protagonist, Matt Goderich, through the twentieth century's tragic middle portion. A more experimental writer was Leon Rooke, whose Shakespeare's Dog (1984) was a garrulous narrative told from the viewpoint of the title animal. Timothy Findley managed to be both experimental and mainstream; his Famous Last Words (1981) presents Hugh Selwyn Mauberley, a poetic persona created by Ezra Pound, as he wrestles with the temptations of fascism and the corruptions of wealth. Not Wanted on the Voyage (1984) tells the story of the biblical ark from the viewpoint of Mrs. Noah. Robert Kroetsch, in turn, combined experimentation with regionalism in both fiction and poetry. Kroetsch's fellow Albertan Rudy Wiebe explored contradictions of rural Christianity in My Lovely Enemy (1983).
New Developments
A younger wave of writers included Guy Vanderhaeghe, who explored postmodern masculinity in Man Descending (1982), and Sarah Sheard, who assayed the politics of cultural trespass in Almost Japanese (1985). Canadian poets drew more notice as well. Indeed, some poets became widely acclaimed as fiction writers. Michael Ondaatje had previously practiced experimental prose in the interstices of his poetic career. His In the Skin of a Lion (1987) was a full-fledged novel about the construction of a bridge in early twentieth century Toronto that brought together people of different ethnicities, backgrounds, and aspirations. The book represented the first step in what was to be a remarkable career as a novelist. Paulette Jiles was another poet who turned to fiction. Other Canadian poets remained exclusively loyal to the verse form. These included Robert Bringhurst, who was powerfully influenced by Asian poetry and philosophy, and Roo Borson, known for her probing, imagistic work. Dominated in previous decades by male writers such as Roger Lemelin, Roch Carrier, and Hubert Aquin, Québécois literature in the 1980’s saw the emergence of feminist writing in the work of Nicole Brossard and Louky Bersianik. The openly gay Michel Tremblay produced lyrical and politically provocative plays. Anne Hébert, at her peak in her late sixties and early seventies, absorbed every new technique she encountered and incorporated each one into her lucid, compassionate work. François Benoit and Philippe Chauveau in Acceptation Globale (1986) presented a generation disillusioned with the political idealism of the 1970’s, but Canadian literature in the 1980’s as a whole sustained a resolute idealism.
Impact
Canadian literature increased its profile within global literary culture during the 1980’s. Several established writers produced some of their most important works in that decade, while significant new voices began to emerge as well. Like other Canadians during a decade that witnessed a new constitution, full independence from the United Kingdom, and a crisis over Québécois sovereignty, Canada’s authors both sought to forge a collective national identity and insisted upon understanding themselves and their experience through a more narrowly focused, regional lens.
Further Reading
Davey, Frank. Post-national Arguments: The Politics of the Anglophone Canadian Novel Since 1967. Toronto: University of Toronto Press, 1993. A Western Canadian and a poet, Davey examines how the 1980’s both furthered and problematized the nationalistic tendencies of the previous decade.
Heble, Ajay. The Tumble of Reason: Alice Munro’s Discourse of Absence. Toronto: University of Toronto Press, 1994. The first theoretically engaged study of a major Canadian writer who emerged in the 1980’s.
New, W. H. A History of Canadian Literature. Montreal: McGill-Queen’s University Press, 2001. Gives ample coverage of the decade’s most important writers and trends.
Powe, B. W. A Climate Charged: Essays on Canadian Writers. Toronto: Mosaic, 1984. Powe’s attacks on many of the sacred cows of the Canadian literary establishment offered an alternative vision that exemplified the increasing ideological pluralism of the 1980’s.
Toye, William, and Eugene Benson. The Oxford Companion to Canadian Literature. 2d ed. Toronto: Oxford University Press, 1997. Includes many entries on Canadian writers of the 1980’s.
Nicholas Birns
See also Book publishing; Davies, Robertson; Feminism; Handmaid’s Tale, The; Literature in the United States; Poetry; Richler, Mordecai; Theater.
■ Literature in the United States
Definition Drama, prose, and poetry by U.S. authors
Some American literature of the 1980’s commented on the culture of the times, which was often seen as greedy and materialistic, as well as on the emerging AIDS epidemic. The gulf between popular and “literary” texts widened, as bookstores featured separate sections of “literature” and “fiction,” and the academic study of literature turned toward theory. The decade also witnessed the rise of significant voices from the margins of society, as racial and ethnic minorities produced much of the best American literature. These works were quickly incorporated into the literary canon by universities concerned to adopt multicultural curricula.
During the 1980’s, three immigrant authors—Czesław Miłosz, Joseph Brodsky, and Elie Wiesel—received Nobel Prizes (Wiesel’s was for peace, not literature). Writing by social outsiders, including African Americans, Latinos, Asian Americans, Native Americans, and gays and lesbians, gained prominence. Women, who constituted the majority of readers, became more conspicuous as authors. However, as personal computers began to be marketed and videocassette recorders penetrated half the American market, the sustained reading of printed texts seemed a beleaguered cultural activity. Nevertheless, so many noteworthy novels were published in the decade that it is difficult to find any common trait with which to unify them. Fine poets also continued to write and to proliferate, and it is equally difficult to single out qualities common to such disparate but talented 1980’s poets as A. R. Ammons, John Ashbery, Rita Dove, Robert Duncan, Carolyn Forché, Allen Ginsberg, Jorie Graham, Donald Hall, Joy Harjo, Richard Howard, Donald Justice, Galway Kinnell, Stanley Kunitz, Audre Lorde, James Merrill, W. S. Merwin, Sharon Olds, Alicia Ostriker, Adrienne Rich, Charles Simic, and Richard Wilbur. However, poetry had become an art for connoisseurs and students more than for the general reading public, and it receded to the margins of American culture. Throughout the decade, winners of the Pulitzer Prize in poetry, such as James Schuyler, Mary Oliver, Carolyn Kizer, and Henry Taylor, were little more than names to even the most sophisticated readers. No living poet possessed the authority that Robert Frost, T. S. Eliot, and Wallace Stevens had exerted for an earlier generation; the most influential contemporary American poet was perhaps Sylvia Plath, who died, notoriously, by suicide in 1963, but whose Collected Poems won a Pulitzer Prize in 1982. In drama, David Mamet, Wendy Wasserstein, and August Wilson made important contributions to the theatrical repertoire, and monologists such as Spalding Gray, Eric Bogosian, Laurie Anderson, and Anna Deavere Smith pioneered performance art. Commercial theaters, however, were more inclined to invest in accessible musicals and familiar revivals than in original American drama. Raymond Carver, Ann Beattie, Bobbie Ann Mason, Grace Paley, and others sparked a revival of the short story, but genre novelists such as Tom Clancy, Robert Ludlum, and Danielle Steel were more likely to show up on best seller lists
than were more challenging writers such as Don DeLillo, Cormac McCarthy, and Gilbert Sorrentino. Especially after the publication of Jean Baudrillard’s Simulacres et simulation (1981; Simulations, 1983) and Jean-François Lyotard’s La Condition postmoderne: Rapport sur le savoir (1979; The Postmodern Condition: A Report on Knowledge, 1984), thoughtful American writers and readers sensed that the basis of the culture had changed from production to consumption, from homogeneity to hybridity, and from originality to simulation. In a postmodern world saturated by disparate media, including television, radio, recordings, and film, everything was being recycled into self-conscious hybrid texts. Leslie Fiedler’s What Was Literature? (1983) challenged the culture to reexamine the nature and function of reading. A belief in the grand master narrative, the single story that would explain everything for all, was subverted, and if the novel, the poem, and the play were not dead, they remained in urgent need of reinvention.
Social Observers
In 1981, with Rabbit Is Rich, his third novel tracing the life of Harry “Rabbit” Angstrom, John Updike continued his project of chronicling the experiences of small-town, middle-class, Protestant Americans. In Bech Is Back (1982), he revived a fictional Jewish novelist he had introduced in 1970 in Bech: A Book. The very successful The Bonfire of the Vanities (1987) by Tom Wolfe—a leading figure within the nonfiction movement known as New Journalism—examined greed and corruption in contemporary New York City. In the short stories collected in What We Talk About When We Talk About Love (1981), Cathedral (1983), Where I’m Calling From (1988), and Elephant, and Other Stories (1988), Raymond Carver employed a spare, flat style to depict the bleak lives of contemporary working-class characters. Carver’s work popularized minimalism, sometimes called “K-Mart realism,” a style that was shared in varying degrees by Beattie, in The Burning House (1982); Mason, in Shiloh, and Other Stories (1982), In Country (1985), Spence + Lila (1988), and Love Life (1989); Jayne Anne Phillips, in Machine Dreams (1984) and Fast Lanes (1987); and David Leavitt, in Family Dancing (1984). Richard Ford, whose Rock Springs (1987) was another minimalist collection, began, with The Sportswriter (1986), to create a trilogy that, like Updike’s Rabbit novels, followed closely a single character, Frank Bascombe, throughout a lifetime of disappointments. With Lincoln (1984) and Empire (1987),
Gore Vidal continued his project of providing a pointed, fictionalized version of American history. Ironweed (1983) and Quinn’s Book (1988) extended William Kennedy’s Albany cycle, his series of novels about powerful and pitiful characters living in the New York State capital throughout the twentieth century. No novelist was more prolific than Joyce Carol Oates, however, in documenting the varieties of contemporary American desperation. In counterpoint to the realism of Solstice (1985), Marya (1986), You Must Remember This (1987), and American Appetites (1989), she also produced, in Bellefleur (1980), A Bloodsmoor Romance (1982), and Mysteries of Winterthurn (1984), a bizarre Gothic trilogy.
Minority Writers
Changing demographics, by which formerly invisible groups constituted an increasingly large percentage of the American population, would soon render the term “minority” problematic. However, during the 1980’s, authors from groups still referred to as “minority” found and raised their voices. The emergence of Saul Bellow, Bernard Malamud, and Philip Roth had put Jewish American writing on the literary map during the 1950’s and 1960’s, and those three continued to be productive during the 1980’s. Bellow published The Dean’s December (1982), More Die of Heartbreak (1987), A Theft (1989), and The Bellarosa Connection (1989). Malamud, who died in 1986, wrote God’s Grace (1982). Roth produced four works featuring protagonist Nathan Zuckerman, Zuckerman Unbound (1981), The Anatomy Lesson (1983), The Prague Orgy (1985), and The Counterlife (1986). Other notable Jewish American novelists active during the 1980’s included E. L. Doctorow, with Loon Lake (1980), World’s Fair (1985), and Billy Bathgate (1989); Mark Helprin, with Ellis Island, and Other Stories (1981) and Winter’s Tale (1983); and Chaim Potok, with The Book of Lights (1981) and Davita’s Harp (1985). Levitation: Five Fictions (1982), The Cannibal Galaxy (1983), The Messiah of Stockholm (1987), and The Shawl (1989), works that ponder the Holocaust and Jewish traditions, established Cynthia Ozick as the leading figure among Jewish American women authors. Norman Mailer rarely dealt directly with Jewish themes, but he continued to advance his grandiose literary ambitions with a vast novel about Pharaonic Egypt called Ancient Evenings (1983). Neil Simon extended his run as one of the most successful playwrights in the history of American theater with three popular autobiographical works, Brighton Beach
Memoirs (1983), Biloxi Blues (1985), and Broadway Bound (1986). In Glengarry Glen Ross (1983) and Speed-the-Plow (1987), David Mamet extended his distinctive mastery of the speech rhythms and dreams of contemporary Americans. Wendy Wasserstein’s The Heidi Chronicles (1989), a play about an art historian whose personal independence comes at the price of alienating her from other women as well as men, became an icon of feminist theater. African American men who published notable work throughout the decade included Ernest J. Gaines with A Gathering of Old Men (1983); Ishmael Reed with The Terrible Twos (1982) and The Terrible Threes (1989); and John Edgar Wideman with Hiding Place (1981), Damballah (1981), Sent for You Yesterday (1983), Brothers and Keepers (1984), and Fever (1989). Charles Fuller won a Pulitzer Prize for a powerful drama about bigotry on a military base, A Soldier’s Play (1981). August Wilson began an ambitious ten-play cycle about African American life in the twentieth century, in which each play was set in a different decade. Over the course of the 1980’s, he covered one-half of the century, producing Jitney (1982), Ma Rainey’s Black Bottom (1984), Fences (1985), Joe Turner’s Come and Gone (1986), and The Piano Lesson (1987). While these works by men were important contributions to American literary culture, it was African American women who generated the most spirited literary discussion during the 1980’s, some of it in response to their perceived negative depiction of African American men. Toni Morrison, who would win a Nobel Prize in Literature in 1993, consolidated her reputation as the foremost chronicler of the ordeal of being black and female in the United States with Tar Baby (1981) and Beloved (1987). Alice Walker published several books of fiction, poetry, and essays, achieving her greatest popular and critical success with The Color Purple (1982). Other significant works by African American women included Toni Cade Bambara’s The Salt Eaters (1980) and If Blessing Comes (1987); Jamaica Kincaid’s At the Bottom of the River (1983), Annie John (1985), and A Small Place (1988); Paule Marshall’s Praisesong for the Widow (1983) and Reena, and Other Stories (1983); Gloria Naylor’s The Women of Brewster Place (1982), Linden Hills (1985), and Mama Day (1988); and Sherley Anne Williams’s Dessa Rose (1986). With The Mambo Kings Play Songs of Love (1989), a boisterous but poignant evocation of the lives of Cuban immigrants, Oscar Hijuelos became the first
Latino to win the Pulitzer Prize in fiction. Although the new Latino literature would not fully flower for another decade, one of its most celebrated figures, Sandra Cisneros, published her first book, The House on Mango Street—the portrait of a Chicana artist as a young woman—in 1983 and a volume of poetry, My Wicked, Wicked Ways, in 1987. Lorna Dee Cervantes published her first book of poetry, Emplumada, in 1981. Blending prose and poetry, Cherríe Moraga, in Loving in the War Years (1983), and Gloria Anzaldúa, in Borderlands/La Frontera: The New Mestiza (1987), each celebrated sexual difference and hybrid ethnic identities. Novels about Chinese immigrants and their American-born children, such as Amy Tan’s The Joy Luck Club (1989) and Maxine Hong Kingston’s China Men (1980) and Tripmaster Monkey: His Fake Book (1989), provided Asian American literature with new energy and visibility. Playwright David Henry Hwang explored the lives of Chinese in America in F.O.B. (1981), The Dance and the Railroad (1982), and Family Devotions (1983), while his Tony Award-winning M. Butterfly (1988) was a study in sexual ambiguity that focused on the relationship between a French diplomat and a Chinese opera singer. In Jasmine (1989), Bharati Mukherjee followed her protagonist from her native village in India to New York and then to Iowa, while the stories in both Darkness (1985) and The Middleman, and Other Stories (1988) featured immigrants from a variety of countries in addition to India. During the 1980’s, writers from tribal backgrounds also affirmed their place in American literary culture. In Love Medicine (1984) and Tracks (1988), Louise Erdrich examined relationships among Ojibwa in North Dakota through seven decades, while The Beet Queen (1986) focused on non-Indian characters in the fictional North Dakota town of Argus. Another part-Ojibwa author, Gerald Vizenor, deployed a more elusive style in Griever: An American Monkey King in China (1987) and Trickster of Liberty: Tribal Heirs to a Wild Baronage at Petronia (1988). James Welch’s Fools Crow (1986) was a historical novel set in 1870 among the author’s Blackfoot ancestors. In The Woman Who Owned the Shadows (1983), Paula Gunn Allen, of mixed Laguna Pueblo, Sioux, and Lebanese Jewish ancestry, offered a feminist take on a woman’s quest to understand her multiracial identity.
The Legacy of AIDS
During the 1980’s, acquired immunodeficiency syndrome (AIDS) began to ravage communities throughout the United States. In And the Band Played On (1987), Randy Shilts offered the first comprehensive narrative of how AIDS spread and how officials, including President Ronald Reagan, tried to ignore it, because its principal victims at first were sexually active gay men. The devastation caused by AIDS inspired many works that combined elegy with anger, such as activist and author Larry Kramer’s play The Normal Heart (1985). Widespread indifference or even hostility toward the gay casualties of AIDS also prompted many writers to affirm publicly their own gay or lesbian identity. Edmund White’s autobiographical novels A Boy’s Own Story (1982) and The Beautiful Room Is Empty (1988) and Harvey Fierstein’s theatrical Torch Song Trilogy (1982) testified to the challenges of coming out of the closet. Armistead Maupin’s popular Tales of the City novel sequence (1978, 1980, 1982, 1984, 1987, and 1989) provided inside accounts of gay lives. Marilyn Hacker, Audre Lorde, and Adrienne Rich created a body of poetry drawn from lesbian experience.
Avant-Garde Fiction
Though doing so brought them neither fame nor fortune, at least initially, several talented writers were intent on challenging the conventions of fiction. The structure of Walter Abish’s How German Is It (1980) was deliberately disjunctive, David Markson’s Wittgenstein’s Mistress (1988) jettisoned plot, and Gilbert Sorrentino teased readers’ formal expectations with Aberration of Starlight (1980), Odd Number (1987), and Misterioso (1989). William Gaddis’s Carpenter’s Gothic (1985) was a bitter satire that abandoned linear storytelling in service to a vision of moral chaos. One of the darkest novels of the decade was Cormac McCarthy’s Blood Meridian (1985), a fragmented, gory account of mayhem in the old Southwest. With The Names (1982), White Noise (1985), and Libra (1988), works murky with specters of conspiracy, Don DeLillo began to acquire a following, both despite and because of his refusal to explain his enigmatic world.
In Theory
In Cultural Literacy: What Every American Needs to Know (1987), E. D. Hirsch, Jr., lamented the loss of a common body of knowledge in the splintered, contemporary American society. Allan Bloom’s The Closing of the American Mind (1987) came down on the side of an aristocracy of the intellect,
indicting mass culture for the dissolution and dissipation of that aristocracy. A few books judged by tastemakers to possess literary merit managed to attract a wide readership. Among them were John Irving’s The Hotel New Hampshire (1981), The Cider House Rules (1985), and A Prayer for Owen Meany (1989) and Anne Tyler’s Dinner at the Homesick Restaurant (1982), The Accidental Tourist (1985), and Breathing Lessons (1988). For the most part, however, a deep chasm was created between popular and literary fiction, while no poetry, except that by Maya Angelou and very few others, was broadly popular. The formal study of literature was embracing popular forms. Founded in 1967, the Popular Culture Association was by the end of the 1980’s near its peak in membership and influence, and previously scorned genres such as science fiction, mystery, and romance were showing up on syllabi everywhere. However, the leading academic institutions took a sharp turn toward “theory,” which often meant turning away from an engagement with details of particular texts and toward questions about the nature and function of texts in general. It was a movement of the discipline toward context and away from text. Among several schools of literary theory competing for influence, deconstruction, an import from France championed by Paul de Man, Jonathan Culler, Barbara Johnson, and others, was the project of exposing the fissures in texts that presume to be univocal and coherent. During the decade in which Geraldine Ferraro became the first woman nominated on the national ticket of a major political party and Sally Ride the first woman to travel into space, feminist works such as The Madwoman in the Attic: The Woman Writer and the Nineteenth-Century Literary Imagination (1980), by Sandra Gilbert and Susan Gubar, exerted a strong influence. So did reader theory, which represented an emphasis, by Stanley Fish, Walter J. Ong, and others, on how one goes about processing a text. In reaction against the old formalist insistence on ignoring everything but the text itself, a movement called the New Historicism, exemplified by Stephen Greenblatt, insisted on the shifting interplay between texts and contexts. Ethnocentric theories, such as the African American studies advanced by Henry Louis Gates, Jr., called for reading literature through the prism of race. Postcolonialists such as Edward Said called for subverting the hegemony of Eurocentric interpretation. Finally, the discipline
that began to call itself “cultural studies” challenged the privileged position of literature and, in the work of Susan Bordo, Janice Radway, and others, examined all media for evidence of how social structures shape meaning.
Impact
Printed poetry, drama, and fiction certainly did not vanish at the end of the 1980’s, but the manner in which they were consumed was significantly altered. In later years, the increased availability of personal computers and other electronic media accelerated the cultural marginalization of printed books. Talented authors, many of whom emerged first during the 1980’s, would continue to create brilliant works of literature, but never again would they command the culture’s attention in the way that Ernest Hemingway, Albert Camus, or Alexander Solzhenitsyn had just a few decades before.
Further Reading
Contemporary Literature 33, no. 2 (Summer, 1992). This special issue—titled American Poetry of the 1980’s and edited by Thomas Gardner—contains ten essays on general trends in American poetry and particular poets, including John Ashbery, Charles Bernstein, Robert Duncan, Kathleen Fraser, Jorie Graham, Lyn Hejinian, and C. K. Williams.
Hendin, Josephine G., ed. A Concise Companion to Postwar American Literature and Culture. Malden, Mass.: Blackwell, 2004. Essays by fifteen critics provide overviews of such phenomena as American drama, gay and lesbian writing, Jewish American fiction, African American literature, and Irish American writing during the postwar period.
Millard, Kenneth. Contemporary American Fiction: An Introduction to American Fiction Since 1970. New York: Oxford University Press, 2000. Informative overview of the subject that also provides detailed analyses of thirty texts by thirty different authors.
O’Brien, Sharon, ed. Write Now: American Literature in the 1980’s and 1990’s. Durham, N.C.: Duke University Press, 1996. Twelve essays cover subjects including the literature of AIDS, Jay McInerney, Lyn Hejinian, and Alice Walker.
Spikes, Michael P. Understanding Contemporary American Literary Theory. Columbia: University of South Carolina Press, 2003. This useful guide offers separate chapters on de Man, Gates, Greenblatt, Said, Fish, and Bordo.
Steven G. Kellman
See also
Beattie, Ann; Beloved; Bonfire of the Vanities, The; Book publishing; Closing of the American Mind, The; Color Purple, The; Erdrich, Louise; Hwang, David Henry; Irving, John; King, Stephen; Literature in Canada; Mamet, David; Minimalist literature; Oates, Joyce Carol; Poetry; Theater; White Noise; Wilson, August.
■ Little Mermaid, The
Identification Animated film
Directors John Musker (1953- ) and Ron Clements (1953- )
Date Released November 17, 1989
The Little Mermaid solidified the beginning of the Disney renaissance in animation, while playing to the hearts and minds of the children of the 1980’s.
The last of five feature-length animated films released by the Walt Disney Company in the 1980’s, The Little Mermaid came on the shoulders of two other Disney works. The success of Who Framed Roger Rabbit (1988), a mixed live-action and animated film, had helped build an audience for a large-scale animated feature. Meanwhile, Oliver & Company (1988) demonstrated that a market also existed for films fitting Disney’s traditional animated-feature format, the musical. Learning from the successes of both films, Disney made some changes to The Little Mermaid during its final year of production, tailoring it for the audiences of the late 1980’s. The company also chose to set the film in a fantasy world, as it had in former classics with great success. The result was a breathtaking glimpse of life under the sea featuring the mermaid Ariel and her friends. The musical elements of the movie included classic Broadway-like songs mixed with such Caribbean-flavored numbers as “Under the Sea,” which received an Academy Award for Best Song. The film also won the Oscar for Best Score. The movie signified the beginning of computer-generated two-dimensional animation at Disney, as the process was tested on the final scenes of the film. The Little Mermaid included something new to Disney animated features that was a rarity even in previous live-action films: a strong heroine. Before Ariel, most of Disney’s leading females relied on their Prince Charmings and were only the source of conflict. Although she is such a source in The Little Mermaid, Ariel is also
an intelligent, inquisitive young woman who makes her own decisions. Ultimately, it is Prince Eric who kills the villain Ursula, but it is Ariel’s courage that drives the story. This is a change from the Disney norm and the beginning of a trend by the company to cast intelligent leading characters of both sexes. Ariel reflected the growing equalization of the sexes in media that occurred during the 1980’s. These elements combined to make Disney’s The Little Mermaid a staple for the company.
Impact
Disney’s version of Hans Christian Andersen’s The Little Mermaid became the strongest-performing full-length animated feature since Walt Disney’s death. Grossing over $110 million in the United States, The Little Mermaid gave Disney animation a resurgence that would allow it to dominate the 1990’s animated feature genre. It also sparked a merchandising spree for Disney and led a generation of consumers that the company had lost to become reacquainted with the Disney name.
Further Reading
Haas, Linda, Elizabeth Bell, and Laura Sells. From Mouse to Mermaid: The Politics of Film, Gender, and Culture. Bloomington: Indiana University Press, 1995.
Johnston, Ollie, and Frank Thomas. The Illusion of Life: Disney Animation. New York: Disney Editions, 1995.
Kurtti, Jeff. The Art of The Little Mermaid. New York: Hyperion Books, 1997.
Daniel R. Vogel
See also Academy Awards; Feminism; Film in the United States.
■ Live Aid
The Event Two synchronized benefit concerts
Date July 13, 1985
Places Wembley Stadium, London, England, and John F. Kennedy Stadium, Philadelphia, Pennsylvania
Live Aid was a fundraising event staged by many of the defining musical icons of the 1980’s. It raised hundreds of millions of dollars for Ethiopian famine relief, and it demonstrated the extent to which musical artists were willing to exercise their financial power for humanitarian causes.
Participants in the Wembley Stadium Live Aid concert join together for the concert finale. From left: George Michael of Wham!, concert promoter Harvey Goldsmith, Bono of U2, Paul McCartney, and Freddie Mercury of Queen. (AP/Wide World Photos)
On October 23, 1984, rock star Bob Geldof watched British Broadcasting Corporation (BBC) correspondent Michael Buerk’s special report about a famine gripping Ethiopia. Geldof was so saddened by the plight of these starving millions that he decided to do something about it. With his friend Midge Ure from the band Ultravox, he wrote “Do They Know It’s Christmas?” and quickly assembled an ad hoc supergroup, Band Aid, to record the song and donate its proceeds to Ethiopian relief. Band Aid consisted of forty-four musicians, including Phil Collins, Bono, Sting, David Bowie, Paul McCartney, Jody Watley, Boy George, and various members of Ultravox, Spandau Ballet, Kool and the Gang, Bananarama, Duran Duran, Status Quo, Big Country, and the Boomtown Rats. “Do They Know It’s Christmas?” was released on December 3, 1984, and instantly hit number one on the British pop charts. About the same time, inspired by Geldof, a similar project came to fruition in America. Harry Belafonte,
Ken Kragen, Michael Jackson, Lionel Richie, and Quincy Jones assembled USA for Africa to record Jackson and Richie’s “We Are the World.” Among the American supergroup’s forty-five musicians were Geldof, Stevie Wonder, Paul Simon, Kenny Rogers, Tina Turner, Billy Joel, Diana Ross, Dionne Warwick, Bruce Springsteen, Cyndi Lauper, Bob Dylan, Ray Charles, Bette Midler, Waylon Jennings, Smokey Robinson, and George Michael. Released on March 7, 1985, “We Are the World” was number one for four weeks in April and May. Determined to push the fundraising as far as possible, Geldof, Ure, and Harvey Goldsmith conceived and organized Live Aid, an unprecedentedly huge, transatlantic, all-star musical collaboration. They booked Wembley Stadium in London and John F. Kennedy Stadium in Philadelphia for overlapping and nearly simultaneous concerts. Each stadium erected giant video screens, so its fans could see all the acts in the other city as well as those in their own.
Select Performers at Live Aid
JFK Stadium, Philadelphia: Bryan Adams; Ashford & Simpson; Joan Baez; The Beach Boys; Black Sabbath; The Cars; Eric Clapton; Phil Collins; Crosby, Stills, Nash & Young; David Ruffin; Duran Duran; Bob Dylan; The Four Tops; Hall & Oates; Mick Jagger; Judas Priest; Eddie Kendricks; B. B. King; Kool & the Gang; Patti LaBelle; Led Zeppelin; Kenny Loggins; Madonna; Pat Metheny; Billy Ocean; Tom Petty; Power Station; The Pretenders; REO Speedwagon; Keith Richards; Run-D.M.C.; Santana; Simple Minds; Rick Springfield; Thompson Twins; George Thorogood and the Destroyers; Tina Turner; Ron Wood; Neil Young
Wembley Stadium, London: Adam Ant; Boomtown Rats; David Bowie; Phil Collins; Elvis Costello; Kiki Dee; Dire Straits; Thomas Dolby; Bryan Ferry; David Gilmour; Elton John; Howard Jones; Nik Kershaw; Paul McCartney; Branford Marsalis; Alison Moyet; Queen; Sade; Spandau Ballet; Sting; Style Council; U2; Ultravox; Wham!; The Who; Paul Young
Melbourne, Australia: INXS; Men at Work
Besides the live acts performing in the two major venues, the concerts included real-time video transmissions of live performances around the world. Among them were performances by INXS in
Melbourne, Australia; B. B. King in The Hague, the Netherlands; Yu Rock Mission in Belgrade, Yugoslavia; Autograph in Moscow; Udo Lindenberg in Cologne, West Germany; and Cliff Richard at the BBC studio in London.
London
Prince Charles and Princess Diana introduced the Wembley portion of the concert at noon, Greenwich mean time (GMT). Various British deejays served as masters of ceremonies. Seventy-two thousand people attended. Status Quo opened, followed by Style Council, the Boomtown Rats (Geldof’s own band), Adam Ant, Ultravox, and Spandau Ballet. That much of the card took two hours, with some sets as short as four minutes. Thereafter, London and Philadelphia alternated, so fans experienced a continuous stream of music. Subsequent acts on the British stage included Elvis Costello, Nik Kershaw, Sade, Sting, Phil Collins, Howard Jones, Bryan Ferry, Paul Young, Alison Moyet, U2, Dire Straits, Queen, David Bowie, the Who, Elton John, Kiki Dee, George Michael, Andrew Ridgeley, and Paul McCartney. The London show ended at 10:00 p.m., GMT.
Philadelphia
Comedians Chevy Chase and Joe Piscopo, actor Jack Nicholson, and impresario Bill Graham introduced the acts for ninety thousand fans in Philadelphia. The show began at 8:00 a.m., Eastern Daylight Time (EDT; 2:00 p.m. GMT), with Bernard Watson, Joan Baez, and the Hooters. Additional performers included the Four Tops, Billy Ocean, Black Sabbath, Run-D.M.C., Rick Springfield, REO Speedwagon, Judas Priest, Bryan Adams, the Beach Boys, George Thorogood and the Destroyers, Bo Diddley, Albert Collins, Simple Minds, the Pretenders, Santana, Pat Metheny, Ashford and Simpson, Teddy Pendergrass, Kool and the Gang, Madonna, Tom Petty and the Heartbreakers, Kenny Loggins, the Cars, Power Station, Thompson Twins, Eric Clapton, Phil Collins (the only star to perform live in both stadiums), Duran Duran, Patti LaBelle, Hall and Oates, Eddie Kendricks, David Ruffin, Mick Jagger, Tina Turner, Bob Dylan, Ron Wood, and Keith Richards. Highlights included sets by Crosby, Stills, and Nash; Neil Young; and all four together. Jimmy Page, Robert Plant, and John Paul Jones reunited as Led Zeppelin, with Phil Collins and Tony Thompson substituting on drums for the late John Bonham. The Philadelphia show ended at 10:00 p.m., EDT. The combined length of both shows was sixteen hours.
Impact
Besides the 162,000 people who attended the live concerts, about 1.5 billion people in about one hundred countries either watched on television or heard on radio the live feeds from the BBC, the American Broadcasting Company (ABC), MTV, or their affiliates. Through sales of tickets, albums, CDs, videos, DVDs, and promotional merchandise, as well as donations, Live Aid raised over $245 million. Musically, only Queen’s, U2’s, and a few other performances were outstanding. Live Aid provided the exposure that made superstars of U2 and its lead singer, Bono. Ireland, the home country of both Geldof and U2, contributed the highest per capita rate of donations. Geldof continued organizing gigantic benefit concerts, although Live Aid remained the model and set the standard for all his subsequent efforts.
Further Reading
Blundy, David, and Paul Vallely. With Geldof in Africa: Confronting the Famine Crisis. London: Times Books, 1985. Minimally informative fundraising material by two London Times reporters.
D’Acierno, John David. “Saving Africa: Press Issues and Media Coverage of Live Aid, Band Aid, and USA for Africa from 1984-1986.” M.A. thesis, University of Texas at Austin, 1989. Looks at the charity concerts from a sociological perspective.
Geldof, Bob. www.bobgeldof.info/Charity/liveaid.html. The Live Aid organizer’s own Web site provides detailed and authoritative information, both directly and through proven links.
Geldof, Bob, with Paul Vallely. Is That It? New York: Weidenfeld and Nicolson, 1986. Sensationalist autobiography of the then-thirty-five-year-old Geldof, telling how his self-destructive adolescent rebellion culminated in his spiritual rebirth as a man with a vision to help the world.
Hillmore, Peter. Live Aid: World Wide Concert Book. Parsippany, N.J.: Unicorn, 1985. Contains interviews with performers, an introduction by Geldof, and many illustrations.
Eric v. d. Luft
See also
Adams, Bryan; Africa and the United States; Comic Relief; Duran Duran; Farm Aid; Hands Across America; Hip-hop and rap; Jackson, Michael; Madonna; Michael, George; MTV; Music videos; Natural disasters; Nicholson, Jack; Osbourne, Ozzy; Pop music; Run-D.M.C.; Sting; Turner, Tina; USA for Africa; U2; Women in rock music; World music.
■ Loma Prieta earthquake
The Event A 6.9 magnitude earthquake
Date October 17, 1989
Place The Santa Cruz Mountains, near Loma Prieta Peak, southeast of San Francisco
After decades of relative seismic quiet, the Loma Prieta earthquake was a wake-up call for Californians, warning them to prepare for potentially stronger future earthquakes.
At 5:04 p.m. on the evening of October 17, 1989, an earthquake occurred in the Santa Cruz Mountains southeast of San Francisco, California. The epicenter was near Loma Prieta Peak, in Nisene Marks State Park, at a distance of about ten miles from the town of Santa Cruz. Sometimes called the World Series earthquake because it happened just before the third game of the 1989 World Series between the Oakland A’s and the San Francisco Giants, the Loma Prieta earthquake was the first major earthquake to shake the area since the Great San Francisco Earthquake of 1906. Although its epicenter was located in the mountains about sixty miles away, the earthquake caused extensive damage and loss of life in the densely populated areas of San Francisco, Monterey, and Oakland. This quake, which was felt by millions of people scattered over an area of about 400,000 square miles, occurred along the San Andreas Fault, which marks the boundary between the North American and Pacific tectonic plates, but geologists believe that the rupture actually occurred ten or eleven miles below the fault itself.
Casualties and Destruction
Although it lasted only fifteen seconds, the earthquake caused 65 deaths, 3,757 injuries, and $5.9 billion of property damage. Approximately 90 percent of the reported injuries were in Alameda, San Francisco, Santa Cruz, Santa Clara, and Monterey Counties, the five counties closest to the epicenter. Many human-made structures collapsed, particularly old, unreinforced masonry buildings. In downtown San Francisco, the exterior walls of old buildings fell on cars and caused five deaths. In Watsonville, flying debris from an outside wall struck and killed a passerby, and in Santa Cruz, three people were killed when a roof collapsed along Pacific Avenue, where twenty-nine buildings were destroyed. Many other deaths were caused by the damage
done to transportation arteries. The Bay Bridge, which links San Francisco to Oakland, sustained a steel truss failure, causing two fifty-foot spans of the upper deck to collapse onto the lower deck. Two interstate highways, Highway 280 and Highway 880 (also known as the Nimitz Freeway), suffered major failures in their reinforced concrete structure. The upper deck of the freeway collapsed at the Cypress Street viaduct, in the western suburbs of Oakland. Forty-two people died in the collapse, crushed by tons of concrete, burned by gas tank explosions, or killed when their vehicles were thrown from the pitching roadway. Casualties continued for some time after the initial event. Thirty minutes after the collapse, a twenty-three-year-old woman died when she failed to notice the gap in the upper deck of the Bay Bridge and drove over the precipice. More than seven hours after the earthquake, a driver was killed when he hit three horses running loose on the Santa Cruz freeway, and a civilian was gunned down as he was directing traffic in San Francisco.
A section of the San Francisco-Oakland Bay Bridge that failed during the Loma Prieta earthquake. The bridge had been scheduled to be reinforced the following week. (NOAA)
Liquefaction
Most of the loss of life caused by the earthquake was associated with the failure of human-made structures, much of which was the result of a geologic process called liquefaction. Underneath the area where the earthquake struck sit loosely compacted alluvial deposits in riverbeds and soft mud around the bay. There is also a considerable amount of artificial fill, used by builders to extend the developable portion of the Bay Area. Although the 1906 San Francisco earthquake had already demonstrated that these soils are not suited to withstand ground motion triggered by an earthquake, the practice of using fill continued. Moreover, the Bay Area has a very shallow water table, which saturates the underground materials. Soft mud, alluvial deposits, and loosely compacted fill amplify the shaking of an earthquake, and, when combined with a shallow water table, they can produce severe ground failure. During liquefaction, the strength of the soils decreases and the ability of these materials to support bridge and building foundations is so diminished that buildings tilt, freeway overpasses collapse, and gas pipelines and water mains break.
Geologic studies were conducted after the Loma Prieta earthquake to examine prehistoric earthquakes, not only on the San Andreas Fault but also on the Hayward and Calaveras faults. Integrating the information gained in these studies with its recent observations, the U.S. Geological Survey (USGS) reported in 1990 that there was a 60 percent probability that one or more destructive earthquakes (magnitude 7.0 or larger) would occur in the San Francisco Bay Area between 1990 and 2020. Other studies undertaken by the California Division of Mines and Geology and the USGS have led to significant changes in building codes related to the design and construction of bridges, highways, and buildings. At the federal level, the Federal Response Plan was created to better organize the activation, mobilization, and deployment of personnel and resources and the assessment of damages.
Impact
The highly televised earthquake was an instant reminder of the danger of living along a tectonic plate boundary, and it stimulated much research to assess both the probability and the possible impact of more major earthquakes striking the San Francisco Bay Area. Scientists thus attempted to provide people with better tools to build a more secure life on unstable land.
Further Reading
Hough, Susan Elizabeth. Earthshaking Science: What We Know (and Don’t Know) About Earthquakes. Portland, Oreg.: Book News, 2004. Overview of seismology designed to provide a basic introduction for a lay audience.
_______. Finding Fault in California: An Earthquake Tourist’s Guide. Missoula, Mont.: Mountain Press, 2004. Details the locations and history of significant faultlines within California.
Reti, Irene, ed. The Loma Prieta Earthquake of October 17, 1989: A UCSC Student Oral History Documentary Project. Santa Cruz, Calif.: University of California, Santa Cruz, University Library, 2006. Extensive oral history of the earthquake and its effects, assembled by students at the university nearest its epicenter.
Yeats, Robert S. Living with Earthquakes in California: A Survivor’s Guide. Corvallis: Oregon State University Press, 2001. Guide to the steps one should take in advance of an earthquake in order to increase the likelihood of survival.
Denyse Lemaire and David Kasserman
See also
Baseball; Natural disasters; Sports.
■ Louganis, Greg
Identification Olympic diving champion
Born November 29, 1960; El Cajon, California
Louganis, the best springboard and platform diver of the 1980’s, is widely regarded as the world’s greatest diver.
During his career, Greg Louganis won forty-eight national diving titles, ten gold medals in the Pan American Games (1979, 1983, 1987), and the 1986 Jesse Owens Trophy. He also received the 1984 James E. Sullivan Award for the best amateur athlete and was elected to the Olympic Hall of Fame in 1985. Among all of his victories and honors, it was Louganis’s performances at the Olympic Games that best reflected his ability. After finishing second at the 1976 Olympics, he was the favorite to win at the Olympic Games in Moscow in 1980, but the United States’ decision to boycott those games forced him to wait until the 1984 Los Angeles Olympics to demonstrate his prowess. In Los Angeles, Louganis won both the springboard and platform diving events, a feat he would repeat in the 1988 Olympic Games in Seoul, Korea. At the Seoul Olympics, he had a comfortable lead going into the last few dives
Greg Louganis dives from the 10-meter platform during the preliminary round of the competition at the 1988 Summer Olympics. (AP/ Wide World Photos)
when he attempted a two-and-a-half somersault pike and hit his head on the board. He completed the dive, but he required four stitches to stop the bleeding. Despite the injury, he made his last dive successfully and won the event. However, since he was gay and HIV-positive, he was concerned about possibly infecting other divers; later, that possibility was dismissed as being extremely remote.
Impact
Because no other American aquatic athlete, other than perhaps the swimmer Mark Spitz, could rival his Olympic achievements and because he was so physically attractive, Louganis was a media hero, with his face on the cover of Life magazine (October, 1988) and Time magazine (July 28, 1986) and his nude body in Playgirl (1987). He also gained several endorsement contracts, making him a wealthy young man.
Subsequent Events
Louganis revealed his homosexuality to the general public in 1994 and published his autobiography, Breaking the Surface, in 1995. He lost his corporate sponsorships, except those of the Children’s Television Network and Speedo. At the time that he revealed his homosexuality and infection with human immunodeficiency virus (HIV), he was the most famous athlete to do so. Undoubtedly, his disclosure made it easier for other gay athletes to reveal their sexuality. Louganis’s popularity resulted in his autobiography being adapted for television in 1996.
Further Reading
Anderson, Eric. In the Game: Gay Athletes and the Cult of Masculinity. Albany: State University of New York Press, 2005.
Louganis, Greg, with Eric Marcus. Breaking the Surface. New York: Random House, 1995.
Milton, Joyce. Diving for Gold. New York: Random House, 1989.
Woog, Dan. Jocks: True Stories of America’s Gay Athletes. Los Angeles: Alyson Books, 1998.
Thomas L. Erskine
See also AIDS epidemic; Homosexuality and gay rights; Olympic Games of 1984; Olympic Games of 1988; Sports.
■ Louisiana World Exposition
The Event Cultural and trade exhibition
Date May 12 to November 11, 1984
Place New Orleans, Louisiana
The 1984 Louisiana World Exposition lost money when attendance failed to meet expectations. However, the infrastructure improvements and urban rejuvenation undertaken to prepare for the fair later paid off in increased tourism for the city.
Like world’s fairs of preceding decades, the Louisiana World Exposition was intended to facilitate cultural exchange among the participating nations. It was also designed to help the city of New Orleans renovate its aging infrastructure and increase its inventory of hotels, restaurants, and other businesses required to promote tourism. Tourism was replacing oil and gas production as the city’s largest source of revenue. Exposition organizers formed a private association to sponsor and promote the event several years before it opened, choosing as their theme “A World of Rivers: Water as a Source of Life.” They scheduled the fair to coincide with the one hundredth anniversary of Louisiana’s first world exposition, the 1884 World’s Industrial and Cotton Centennial Exposition. A rundown warehouse section of the city adjacent to Canal Street, the main commercial thoroughfare, was renovated, and a new convention center and shopping pavilion were built along the Mississippi River. Ten new hotels were constructed to accommodate visitors. Promoters became excited about the fair’s prospects when the 1982 World’s Fair in Knoxville, Tennessee, exceeded attendance projections and made a profit. Unfortunately, planners in New Orleans did a very poor job of publicizing their exposition, and the press, including local newspapers and television stations, was frequently critical. To make matters worse, promoters were successful in obtaining only $10 million in federal grants, far less than Knoxville had received. Nevertheless, the exposition opened on May 12 to much local fanfare. The expected crowds never materialized, largely because of competition from other events. Both political parties held national conventions that year in other cities, and the 1984 Olympic Games were held in Los Angeles. Many people expended their vacation time and money for the year on those events. As a result, fewer out-of-state tourists attended the
fair than might otherwise have done so; instead of eleven million visitors, the exposition drew only a little more than seven million, two-thirds of them from Louisiana. Consequently, the organizers had to declare bankruptcy even before the fair closed on November 11. They lost approximately $50 million, and the State of Louisiana, which had backed the project heavily, lost $25 million.
Impact
Although in the short run the Louisiana World Exposition was a financial failure and had little immediate effect on tourism in New Orleans, the event’s long-term impact was positive. The increased inventory of hotel rooms allowed the city to become one of the country’s major destinations for national conventions, and renovations in the warehouse district turned that area into a mecca for artists, small businesses, and museums, natural destinations for the increased number of tourists that began flocking to New Orleans within a decade after the fair closed.
Further Reading
Dimanche, Frédéric. “Special Events Legacy: The 1984 Louisiana World’s Fair in New Orleans.” In Quality Management in Urban Tourism, edited by Peter E. Murphy. New York: Wiley, 1997.
Glazer, Susan Herzfeld. “A World’s Fair to Remember.” New Orleans Magazine 38, no. 2 (November, 2003): D4-5.
Laurence W. Mazzeno
See also
Business and the economy in the United States; Knoxville World’s Fair; Vancouver Expo ’86.
■ Lucas, Henry Lee
Identification Serial killer
Born August 23, 1936; Blacksburg, Virginia
Lucas’s case shed light on both serial murder and the risks of false confessions.
Henry Lee Lucas was arrested in October, 1982, and charged with killing Kate Rich, an elderly woman. Lucas was also questioned about the disappearance of a teen, Frieda “Becky” Powell, with whom he had had an affair. In mid-1983, he confessed to murdering both women, as well as many others. Lucas said that his family had abused him, and it was discovered that he had exhibited early warning signs, such as
killing pets, before he began to commit homicides. He was convicted of the 1960 murder of his mother. Lucas seemed to be a textbook example of a serial killer. Indeed, he ultimately confessed to killing or helping kill three thousand people with Ottis Toole, Powell’s uncle. These crimes were ostensibly committed throughout the United States, in Texas, Florida, California, Georgia, Maryland, and Michigan, among other states. Lucas claimed to be part of a cult, the Hands of Death, that worshiped Satan and sacrificed people to him. In late 1983, the Henry Lee Lucas Task Force invited Texas law officers to question Lucas in order to close unsolved cases. Some of Lucas’s confessions were demonstrably false. The Lucas Report, a 1986 study from the Texas Attorney General’s Office, shows that Lucas often was elsewhere at the time of one of his supposed killings. Some of his statements contradicted other facts as well. Some experts estimated that he had actually committed around 350 murders. In April, 1984, Lucas retracted his confessions, claiming that he had never killed anyone but his mother and that her death had been an accident. Still, Lucas was convicted of eleven homicides and sentenced to death. Probably, the truth lies somewhere between Lucas’s extreme confessions and his most limited statement. The question remains as to why Lucas falsely confessed to so many murders, and the possible answers are manifold. He was mildly mentally retarded and mentally ill, hearing voices in prison. His initial confession may have been coerced, and police officers probably fed Lucas many of the details that “proved” his guilt. The police and jail personnel also offered Lucas better treatment in exchange for his confessions.
Impact
More people saw the news coverage of Lucas’s confessions than saw the follow-ups, adding to a national concern with serial killers that had been kindled by Ted Bundy and David Berkowitz (Son of Sam) in the mid- to late 1970’s. Lucas’s purported crimes supported the view that, because serial killers sometimes committed crimes in several states, local police forces were helpless to catch them and thus the Federal Bureau of Investigation (FBI) was required. These views of serial killers and the FBI would become stronger in the 1990’s. Lucas’s story also demonstrates the problem of false confessions and the ease with which an interviewer can unknowingly lead the interviewee to false testimony, a subject
that also pertains to some of the child-abuse and Satanism accusations of the 1980’s and 1990’s.
Further Reading
Cox, Mike. The Confessions of Henry Lee Lucas. New York: Pocket, 1991.
Norris, Joel. Henry Lee Lucas. New York: Zebra Books, 1991.
Bernadette Lynn Bosky
See also
Atlanta child murders; Crime; McMartin Preschool trials; Missing and runaway children; Night Stalker case.
■ Ludlum, Robert
Identification American novelist
Born May 25, 1927; New York, New York
Died March 12, 2001; Naples, Florida
Ludlum’s best-selling novels revolved around terrorism, international intrigue, and conspiracy, offering readers a glimpse into a dangerous but exciting world.
Robert Ludlum hit upon a successful formula with his first book in 1971, and the five novels he wrote in the 1980’s show little variation from that formula. He created a smart and resourceful male protagonist with whom readers could easily identify and then proceeded to pit him against a powerful, mysterious group—a clandestine government agency, say, or a terrorist organization. Ludlum’s books were long and his plots convoluted, but he wrote in a graphic and readily accessible style. His distinctive three-word titles, each of which posed a mystery that would be resolved by book’s end, became one of his trademarks. Ludlum’s first novel of the decade was probably his most famous. The protagonist of The Bourne Identity (1980) has lost his memory, but he gradually discovers that he may be a secret agent, perhaps even a professional assassin. The situation created a dilemma for Ludlum’s readers, since whatever the protagonist’s real identity, he was likable and generous. Ludlum later plunged the same character into
a Chinese political crisis in The Bourne Supremacy (1986), but not before publishing two other successful novels. In The Parsifal Mosaic (1982), an American agent discovers that a well-meaning secret society is engineering a potentially catastrophic plot to blackmail the governments of the world into peaceful coexistence. With The Aquitaine Progression (1984), Ludlum envisioned another cabal—in this case, an international conspiracy of retired military figures plotting to wrest control of the world from its civilian leaders. In his final novel of the decade, The Icarus Agenda (1988), Ludlum returned to the subject of international terrorism but worked two far-reaching conspiracies into the mix as well: An American politician’s heroic antiterrorist activities bring him to the attention of two clandestine groups—a benevolent organization working behind the scenes for the betterment of the country and an opposing cabal whose continued power depends on fear and political disorder. Motion-picture versions of three Ludlum novels also whetted the public’s appetite for his books. The Osterman Weekend and The Holcroft Covenant appeared in 1983 and 1985 respectively, and The Bourne Identity was filmed for television in 1988.
Impact
Robert Ludlum was widely read throughout the world. Critics dismissed his improbable plots and his melodramatic style, but his many readers found that he offered not only escape but also insight into the sometimes frightening international events described in the morning’s headlines.
Further Reading
Ludlum, Robert, and Martin H. Greenberg. The Robert Ludlum Companion. New York: Bantam Books, 1993.
Macdonald, Gina. Robert Ludlum: A Critical Companion. Westport, Conn.: Greenwood Press, 1997.
“Robert Ludlum.” The Economist, March 31, 2001, p. 103.
Grove Koger
See also
Action films; Book publishing; Clancy, Tom; Cold War; Literature in the United States; Sequels; Terrorism.
M

■ McEnroe, John

Identification American tennis player
Born February 16, 1959; Wiesbaden, West Germany (now in Germany)

McEnroe became known not only for his extraordinary skill as a tennis player but also for his fiery outbursts of temper in the midst of matches.

As a freshman at Stanford University in 1978, John McEnroe won the National Collegiate Athletic Association (NCAA) Tennis Championship. With this victory and other successes as an amateur tennis player, McEnroe decided to leave Stanford and become a professional tennis player. In 1979, he won his first Grand Slam title, the U.S. Open singles title, defeating his close friend and fellow American tennis player Vitas Gerulaitis. At the time, McEnroe was the second-youngest person to win the title. During the year, he won a total of twenty-seven titles, including ten in singles and seventeen in doubles. McEnroe was quickly establishing himself as one of the best tennis players in the world. He was also gaining a reputation for his violent outbursts on the court.

By March of 1980, McEnroe was ranked the number one player in the world. In addition to his skill as a singles player, he and his doubles partner Peter Fleming constituted one of the most potent doubles teams in the history of tennis. McEnroe was one of the few top singles players to compete in doubles, believing that his overall game was improved by playing both. He was noted for the unique corkscrew motion of his serve and for his brilliant volleying skills, honed in doubles play. As a left-handed player, he took inspiration from a great Australian player of the 1960’s, Rod Laver.

McEnroe was the dominant figure in men’s tennis during the early 1980’s. Some of his most memorable matches were against such fierce competitors as Jimmy Connors, Björn Borg, and Ivan Lendl. McEnroe won the U.S. Open three years in a row, from 1979 to 1981. In 1981, he also won Wimbledon. Because of his court behavior that year at Wimbledon, though, he was fined $1,500; McEnroe went so far as to call the umpire, Ted James, “the pits of the world.” He would use the phrase “you cannot be serious” whenever he disagreed with a call, and the British press gave him the name “SuperBrat.” Although he lost to Connors in the 1982 Wimbledon final, McEnroe came back to win the title in 1983 and 1984. The year 1984 was a banner year for McEnroe: He won not only Wimbledon but also the U.S. Open. However, he lost to Lendl in a marathon match at the French Open. During 1984, McEnroe compiled a record of eighty-two wins to only three losses, and he won thirteen singles titles. By the late 1980’s, McEnroe was no longer the force he had been earlier in the decade.

John McEnroe celebrates his victory over Björn Borg to win the 1981 men’s singles championship at Wimbledon. (AP/Wide World Photos)

Impact John McEnroe will be remembered as a fiery champion who demanded the best from himself and those around him. He was a student of the game of tennis and was very vocal about how the game should be played and evolve. While his antics on the court alienated many, his forthrightness pushed tennis to become a more professional sport. During his career, he captured seventy-seven singles titles and seventy-eight doubles titles. In 1999, he was inducted into the International Tennis Hall of Fame.

Further Reading
Adams, Tim. On Being John McEnroe. New York: Crown, 2003.
Evans, Richard. McEnroe: Taming the Talent. 2d rev. ed. Lexington, Mass.: S. Greene Press, 1990.
McEnroe, John, with James Kaplan. You Cannot Be Serious. New York: G. P. Putnam’s Sons, 2002.
Jeffry Jensen

See also
Navratilova, Martina; Sports; Tennis.
■ McKinney Homeless Assistance Act of 1987

Identification Federal legislation
Date Signed on July 22, 1987
This federal legislation addressed homelessness, a major public concern in the 1980’s, by providing shelter, food, and health care.

Historically, Americans have addressed homelessness at the grassroots level. The presidential administration of Ronald Reagan sought to continue this pattern with the argument that states and localities were best equipped to solve their own homeless problems. Nevertheless, Stewart B. McKinney (1931-1987), a Republican from Connecticut who had served in the U.S. House of Representatives since
1971, had a long-standing interest in housing problems. As homelessness worsened in the 1980’s, McKinney sought federal aid to address the problem.

McKinney began to lobby in 1986 for passage of the Urgent Relief for the Homeless Act to provide emergency provisions for shelter, food, health care, and transitional housing. A study prepared in September, 1986, by the Department of Health and Human Services had cited “eviction by landlord” as the prime cause of homelessness, while another major cause was the release of mentally ill people from institutions into communities that lacked services for them. McKinney’s bill sought to address these problems. Large bipartisan majorities in both houses of Congress passed the legislation in 1987. McKinney died in office that May, and in tribute the legislation was renamed in his honor. President Reagan reluctantly signed the bill into law on July 22, 1987.

The McKinney Homeless Assistance Act provided $550 million nationwide for dealing with problems related to homelessness. It supported affordable housing, emergency shelters, rent subsidies, food distribution, health and mental health care, job training, and treatment of drug and alcohol abuse. The legislation represented a large infusion of federal funds to address homelessness as well as an attempt to deal with its root causes. It also required that homeless children be provided transportation to school and other educational opportunities.

Problems with the legislation appeared fairly quickly. No one knew exactly how many Americans were homeless. Many who lived on the streets did not register for government services, and no clear definition of “homeless” existed. While the National Bureau of Economic Research suggested that, at any given time, between 250,000 and 400,000 Americans had no home, the National Coalition for the Homeless claimed that, over the course of a year, three million people were homeless.

Impact The Interagency Council on the Homeless, the federal government’s chief coordinating agency for homeless assistance programs, subsequently came under heavy attack for being inadequate and ineffective. The council, under the Housing and Urban Development umbrella, had been established as a clearinghouse for information about federal government programs. It was faulted as an apologist for the Reagan administration’s failure to implement the McKinney Act. The McKinney legislation
required that excess federal property be offered as temporary, emergency sanctuaries for homeless people, but seventeen months after Reagan signed the law, only two federal sites had been so used. Despite difficulties in implementing it, the McKinney Act, amended several times in subsequent years, remained in effect as of 2008.

Further Reading
Alker, Joan. Unfinished Business: The Stewart B. McKinney Homeless Assistance Act After Two Years. New York: National Coalition for the Homeless, 1989.
Liff, Sharon R. No Place Else to Go: Homeless Mothers and Their Children Living in an Urban Shelter. New York: Routledge, 1996.
Caryn E. Neumann

See also Homelessness; Income and wages in the United States; Reagan, Ronald; Reaganomics; Recessions; Unemployment in the United States; Welfare.
■ McMartin Preschool trials

The Event Seven preschool workers are tried for sexually abusing their charges
Date 1987-1990
Place Manhattan Beach, California

The McMartin Preschool case set off a wave of hysteria in the 1980’s, as parents—who were working longer hours than ever to make ends meet—worried about the dangers of leaving their children in the hands of strangers. When it was later demonstrated that the allegations against the preschool’s staff were unfounded, the case raised a new set of concerns over political manipulation of the justice system, media sensationalism, and the appropriate level of credence to give to young children’s testimony.

When Judy Johnson’s two-year-old son Matthew returned home one day in 1983 from the McMartin Preschool in Manhattan Beach, California, his bottom appeared red, so she took him to a hospital for an examination. Although hospital personnel found no conclusive evidence of sexual molestation, she filed a police report, charging Raymond Buckey, grandson of McMartin Preschool owner Virginia McMartin, with sexual molestation, including bondage and rape. Police, after referring Johnson to
Children’s Institute International (CII) at the University of California, Los Angeles, began an investigation of Buckey. They sent letters to the parents of the other McMartin Preschool students to inquire whether they had heard of any misconduct. Hysteria ensued, and many parents, certain that horrible things had occurred at the school, went to CII for counseling.

Politics and False Testimony Eager for a rallying issue to focus his reelection campaign, Los Angeles district attorney Robert Philibosian assigned the McMartin case to Assistant District Attorney Joan Matusinka, whose specialty was sexual abuse of preschoolers, and to another assistant district attorney, Lael Rubin, who handled the case in court. Matusinka urged parents to see her friend at CII, social work grantwriter Kee MacFarlane, who in turn intimidated children into admitting falsely to abusive conduct by members of the preschool staff.

The case ultimately involved three full-time assistant district attorneys, fourteen investigators from the District Attorney’s Office, twenty-two task force officers, two full-time and twenty part-time social workers, and one full-time and four part-time detectives. Investigators searched twenty-one residences, seven businesses, three churches, two airports, thirty-seven cars, and a farm. They found no concrete evidence to corroborate the wild allegations made by 450 children, who accused teachers of slaughtering horses, forcing children into hidden tunnels, attending satanic rituals in locked churches, and taking naked photographs. The media and the prosecution, however, focused on these fantastic charges without making any effort to verify them.

Nevertheless, the coerced confessions were referred to a grand jury, which indicted the entire preschool staff of seven on more than 150 counts. The district attorney added more counts, bringing the total to 208. After their arrest in 1984, bail was denied to all the suspects except elderly Virginia McMartin, when Rubin falsely alleged that the defendants threatened the parents and children with harm. Jailed pending trial, they were abused by prison guards and prisoners.

Before the trial, evidence was presented at a nineteen-month preliminary hearing, the longest on record. The defendants lost all of their assets paying their defense lawyers’ fees. During the hearing, the defense identified inconsistencies in the children’s
stories. One child even identified actor Chuck Norris, a city attorney, and four nuns whose photographs had been taken in the 1940’s as among those who had molested them at the preschool. One expert found that photographs of the children’s anuses were inconclusive. Nevertheless, in January, 1986, after the $4 million preliminary hearing was concluded, the judge charged Ray Buckey with eighty-one felony counts, his mother Peggy Buckey with twenty-seven, and the others with various subsidiary crimes. Soon, two members of the prosecution team resigned, claiming that the evidence against the defendants was phony; one testified that Rubin had intentionally lied to deny bail to the McMartin Seven. Ira Reiner, who had defeated Philibosian to become the new district attorney in 1984, then reviewed the evidence and dropped charges against all but Ray and Peggy Buckey.

Jury selection began in April, 1987. At trial, of 360 alleged victims of child abuse, only 11 children testified. The defense was denied the opportunity to view all the videotaped interviews with the children. A prison snitch testified that Buckey had admitted guilt in prison, but then admitted that his testimony was not credible. A professor of medicine testified that there was anal and vaginal penetration, but his testimony was discredited because he had no formal training in diagnosis. Social worker MacFarlane’s testimony in August, 1988, was refuted by a psychiatrist, who reported that the children were saying whatever MacFarlane wanted them to say. However, the judge refused to allow other crucial rebuttal testimony.

The $15 million trial concluded in January, 1990, becoming the longest criminal trial in American history. Peggy Buckey was acquitted, and the jury hung on thirteen of the counts against Ray. Ray underwent a second trial in June, 1990, which resulted in another hung jury. The defense then moved that all charges against Buckey be dismissed. The motion was granted.

Impact In 1985, as the case received extensive television coverage, there was widespread hysteria around the country about child abuse in preschools. Approximately one million people were falsely accused of child abuse. Teachers who made physical contact with their students were sometimes fired and even imprisoned. Conferences on child abuse focused on satanic ritual abuse. Several persons were arrested
and convicted of participating in child abuse rings. Another celebrated trial in 1986, involving a day-care center in Massachusetts, resulted in three convictions after prosecutors consulted with the McMartin prosecutors for ideas.

Subsequent Events Some of those falsely accused across the country were able to rebuild their lives, though Peggy Buckey developed agoraphobia. Among thirty who were imprisoned for allegedly molesting children in Bakersfield, one died in prison, seven served their time (a twenty-year sentence in one case), and twenty-two others had their convictions reversed on grounds of judicial or prosecutorial misconduct. The latter sued and won several million dollars in compensation. Once the falsity of the charges in the McMartin case became more widely known, the hysteria over abuse subsided somewhat and was replaced by concern over the ease with which young children’s trial testimonies can become tainted.

Further Reading
Eberle, Paul, and Shirley Eberle. The Abuse of Innocence: The McMartin Preschool Trial. New York: Prometheus, 1993. Detailed examination of the evidence in the case.
Nathan, Debbie, and Michael Snedecker. Satan’s Silence: Ritual Abuse and the Making of a Modern American Witch Hunt. New York: Basic Books, 1995. Analysis of the children’s fabricated claims in the McMartin case and how ordinary citizens vilified the McMartin Seven outside court.
Pride, Mary. The Child Abuse Industry. Westchester, Ill.: Crossway, 1986. Identifies how the hysteria over child abuse adversely affected nearly one million persons around the country.
Showalter, Elaine. Hystories: Hysterical Epidemics and Modern Media. New York: Columbia University Press, 1997. Points out that anyone is susceptible to hysteria, especially when the media identify a phenomenon that enables individuals to imagine secret conspiracies and to fabricate claims, which Showalter calls “hystories.”
Michael Haas

See also
Atlanta child murders; Brawley, Tawana; Central Park jogger case; Crime; Journalism; Rape; Sexual harassment; Tabloid television; Television.
■ Madonna

Identification American singer, songwriter, producer, actor, and entertainer
Born August 16, 1958; Bay City, Michigan

Madonna was a leading singer and entertainer who entered onto the world stage in the 1980’s and exerted a strong influence on popular culture, including music, videos, and fashion.

Coming from humble roots, a determined Madonna formed a band called Madonna with her boyfriend Steve Bray in 1981. She had previously performed with other groups as a singer and drummer. With the formation of the group Madonna, she began to play the guitar and write music. In 1982, her funky music, which also showed rhythm and blues influences, won her a contract with Sire Records, a subsidiary of Warner Bros.

Madonna’s first record with Sire, “Everybody,” was released in 1982 and did not make the Billboard Hot 100 chart. Two other singles, “Burning Up” and “Physical Attraction,” followed. These early singles established a fan base that prompted Sire Records to offer Madonna the opportunity of recording an album. In 1983, her first album, Madonna, was released to critical acclaim; early sales totaled over three million albums. Eventually, the album sold over six million copies, with “Holiday,” “Borderline,” and “Lucky Star” becoming key singles.

Madonna developed a fan base among teenage girls, who also followed her fashion style, which included wearing lingerie outside her clothing, in addition to other overtly sexual outfits. This style was considered by many to be too provocative, and the singer was criticized as a bad role model for young girls. The videos that accompanied the release of her albums often presented sexuality and other themes that many parents thought inappropriate for their children.

Madonna was followed in late 1984 by Like a Virgin. With the release of this second album, Madonna became a household name around the world, as millions of fans and critics followed her music and her cultural influence. Like a Virgin was Madonna’s first number one album in the United States; twelve million copies were sold upon its release, and eventually over seventeen million copies were sold. Key singles included the songs “Like a Virgin” and “Material Girl.” Madonna wrote five songs for this album, as her strength as a music writer began to be
recognized. Fashioning a moniker from Madonna’s hit song, the media began to refer to Madonna as the Material Girl.

Madonna’s Career Broadens In 1985, Madonna married an up-and-coming actor named Sean Penn. The couple was often featured in fan and tabloid magazines, but the fast-moving singer and the actor soon separated and divorced in 1989. Early nude pictures of Madonna caused a public stir when they appeared in Playboy and Penthouse in 1985. The black-and-white art photographs had been taken in the late 1970’s, before Madonna had achieved stardom. She tried to block publication of the photographs but was unsuccessful.

Madonna’s album True Blue was released in 1986, continuing her streak of successful releases in the 1980’s.

Madonna at the 1984 MTV Video Music Awards. (AP/Wide World Photos)

In addition to writing many of the songs on
True Blue, Madonna served for the first time as the album’s producer. The album spawned three number one hit singles: “Live to Tell,” “Papa Don’t Preach,” and “Open Your Heart.” The initial sales of True Blue topped eleven million albums.

Madonna made her film acting debut in 1985, with a brief appearance as a singer in Vision Quest (1985) and playing the title role in Desperately Seeking Susan (1985). A low-budget but high-quality film directed by Susan Seidelman, Desperately Seeking Susan was a commercial success and extended Madonna’s career into the film industry. She followed it up, however, with a major flop, Shanghai Surprise (1986), costarring Penn. The film cost $17 million and grossed only $2.3 million at the box office. Madonna also starred in Who’s That Girl (1987), which was named after one of her hit singles. By the end of the decade, she was recognized as a capable actress.

In 1987, Madonna conducted her Who’s That Girl World Tour, which included stops in Japan, France, Canada, the Netherlands, Italy, Germany, the United Kingdom, and the United States. The thirty-eight-show tour set a record as the highest-grossing world tour to that date, earning about $20 million. Controversy also followed the tour, as program elements included comments on religion that were aimed at President Ronald Reagan and Pope John Paul II. As a result, the pope instructed Catholics to boycott the tour’s Italian performances.

Madonna’s final album of the 1980’s was Like a Prayer. This album included the hit songs “Express Yourself,” “Like a Prayer,” “Cherish,” and “Keep It Together.” Risqué videos accompanied many of the singles, as Madonna continued to stir up controversy with her music and visuals. The video accompanying “Like a Prayer” used religious symbols in controversial ways that were condemned by the Catholic Church. However, the controversy increased sales, demonstrating that Madonna in the 1980’s had also become a very shrewd businesswoman.

By the end of the 1980’s, Madonna had established herself as the leading woman solo performer in the United States and possibly the world. During the decade, she released nine number one songs. She changed the music industry, and she began what would prove to be a continuing process of self-reinvention, as she strove to keep her music and her persona fresh and relevant in a rapidly changing culture. Her influence on popular culture during the decade was second only to that of Michael Jackson, as young
women mimicked her hair, sexuality, clothing, and dance styles.

Impact Trends started by Madonna in the 1980’s influenced music, fashion, videos, children’s books, and popular culture into the early twenty-first century. Her continuing effect on popular culture could be measured as each new style or persona she adopted was reflected in the larger American culture in which she participated. Madonna would go on to sell more records than any other woman in history.

Further Reading
Guilbert, Georges-Claude. Madonna as Postmodern Myth: How One Star’s Self-Construction Rewrites Sex, Gender, Hollywood, and the American Dream. Jefferson, N.C.: McFarland, 2002. In-depth analysis of Madonna, her strategically planned rise to stardom, and its effects upon American culture and American identity.
Morton, Andrew. Madonna. New York: St. Martin’s Paperbacks, 2002. Popular biography of Madonna.
St. Michael, Mick. Madonna Talking: Madonna in Her Own Words. London: Omnibus Press, 2004. A collection of quotations by Madonna.
Sexton, Adam, ed. Desperately Seeking Madonna: In Search of the Meaning of the World’s Most Famous Woman. New York: Delta, 1993. Collection of writings, cartoons, and other works reflecting on Madonna and her influence.
Douglas A. Phillips

See also Film in the United States; Live Aid; MTV; Music; Music videos; Pop music; Women in rock music.
■ Magnet schools

Definition Public schools that offer specialized curricula or facilities in order to attract students from throughout their districts
Magnet schools were developed in the early 1980’s in an attempt to encourage voluntary desegregation. They were based on the theory that students of all races would be attracted by the schools’ exceptional curricula and would therefore be motivated to attend, contributing to the schools’ diversity.
Magnet schools have several distinguishing characteristics: an enrollment policy that opens the school to students beyond the normal, limited geographic area; a student body that enrolls by choice; and a curriculum based on a specific instructional method or theme (for example, science or art). Magnet schools in the 1980’s were assisted by federal funding under the Elementary and Secondary Education Act (1965). This funding gave the federal government leverage in determining how the schools would function. Although they were conceived as a means to desegregate the American public school population, the schools were instructed by the U.S. Department of Education not to set racial quotas to determine enrollment. The department believed that, although racial segregation had been harmful, race should be used only as a last resort in determining admission to the schools.

Some Notable Magnet Schools Many school districts, especially in urban areas, sought to limit or reverse the racial segregation of their schools. Therefore, once funding was made available for magnet schools, many districts availed themselves of these funds to create pioneer magnet programs. For example, in the mid-1980’s, McMillan Junior High School, in Omaha, Nebraska, became Omaha Public Schools’ first magnet junior high school, featuring special courses in computers and mathematics. Communication arts were added to the magnet curriculum in the early 1990’s, and the name of the school was changed to McMillan Magnet Center.

In the late 1960’s, George Richmond started the MicroSociety program in Brooklyn, New York, with fifth graders, for the purpose of providing them with more educational motivation. This program was tested in Lowell, Massachusetts, when its project director developed a plan to begin two city-wide magnet schools. One was a school of the arts, and the other would use the MicroSociety curriculum. Both opened in 1980. It was hoped that students from all over the city would choose these schools. Admissions were on a first-come, first-served basis. When a school was filled, waiting lists were developed, one for minority and one for majority students, so the racial balance could be maintained. The schools represented the beginning of a voluntary desegregation program in Lowell called “schools of choice.”

In the 1980’s, the number of individual schools offering magnet programs nearly doubled, and the
number of students enrolled in these programs nearly tripled. Some magnet programs were part of an existing school; they were known as magnet “school within a school” programs. Other magnet schools operated in completely separate facilities within a school district. Magnet schools remained mainly an urban phenomenon, as more than half of large, urban school districts would develop magnet programs, as compared to only 10 percent of suburban districts. By the end of the 1980’s, more than 1.2 million students were enrolled in magnet schools in 230 school districts.

Impact In the late 1980’s, the Supreme Court adopted a fundamentally different approach to civil rights than it had employed in previous decades. The William H. Rehnquist court adopted the assumption that the history of discrimination had been successfully addressed and that the Court’s previous orders mandating school desegregation should therefore be rescinded. In three decisions in the 1990’s, the Court defined desegregation as a “temporary” remedy and found that school boards released from their orders could reinstate segregated schools. The Rehnquist court held that policies taking race into account for the purpose of creating integration were suspect. Such policies, to be considered legal, had to both demonstrate a compelling motive and prove that this goal could not be realized without considering race. The Court’s decisions led some lower courts to forbid even voluntary action for desegregation, such as magnet schools in which desegregation guided admissions policies. Such orders were handed down, for example, in Virginia, Maryland, and Boston.

Significant federal aid aimed at helping interracial schools succeed ended early in the 1980’s. Many states then abandoned the offices, agencies, and policies they had set up to produce and support interracial education. This led to a significant decrease in the creation of new magnet schools, although preexisting schools continued to flourish. Magnet schools, however, proved the inspiration for charter schools. Like magnets, charter schools offered alternatives to “traditional” public education and sometimes allowed students from a wider geographic area to attend.

Further Reading
Brooks, Robert G., et al., eds. Definitive Studies of Magnet Schools: Voices of Public School Choice. Washington, D.C.: Magnet Schools of America, 1999. The senior author has been executive director of Magnet Schools of America, so he has a positive view of the magnet school movement; however, this study is one of the most comprehensive looks at magnet schools throughout the United States.
Henig, Jeffrey R. Rethinking School Choice: The Limits of the Market Metaphor. Princeton, N.J.: Princeton University Press, 1994. Carefully researched book looking at the historical background of many school choice plans, including magnet schools. Includes criticisms of choice plans, as well as information to support them.
Metz, Mary H. Different by Design: The Context and Character of Three Magnet Schools. New York: Teacher’s College Press, 2003. This historical study analyzes the organizational and political pressures that helped make three magnet schools distinctive social environments. Discusses school choice, curricular reform, and school equity and looks at the effects of the programs over two decades.
Mary C. Ware

See also Education in Canada; Education in the United States; Multiculturalism in education; National Education Summit of 1989; Standards and accountability in education.
■ Magnum, P.I.

Identification Television series
Creators Donald P. Bellisario (1935-    ) and Glen A. Larson (1937-    )
Date Aired from December 11, 1980, to May 1, 1988

Magnum, P.I. helped revise the detective show genre, as well as Americans’ perceptions of Vietnam veterans. The show made Tom Selleck a major television star, although his efforts to parlay his fame into a film career met with only modest success.

One of the initial reasons for setting Magnum, P.I. in Hawaii was so that the Columbia Broadcasting System (CBS) could continue using the sets created for Hawaii Five-O. As the series evolved over the next eight years and 157 episodes, the executive producers, Donald P. Bellisario and Glen A. Larson, achieved something unique in the detective genre, creating a series that transcended the detective story
and that became a cultural metaphor for Americans’ attempt to understand the Vietnam War.

In some respects, the narrative construction of the series, a detective drama featuring significant comedic elements, continuity between episodes, and recurrent characters, was similar to other detective shows of the 1980’s, such as Simon and Simon; Murder, She Wrote; and Matlock. The show’s title character, Thomas Sullivan Magnum IV, played by Tom Selleck, is a former U.S. Navy SEAL, Naval Intelligence officer, and prisoner of war. Magnum’s military background, combined with his apparent lack of direction in life and his ambivalence toward the service, enables him easily to transition into the role of private investigator. He also works as a security expert for a very successful mystery writer, Robin Masters. In return for his security advice, Magnum is allowed to live in the guest house of Masters’s Hawaiian estate, Robin’s Nest—a luxurious, beachfront complex on Oahu—and to drive the estate’s Ferrari 308 GTS.

Most episodes revolve around Magnum’s efforts to resolve clients’ problems by conducting investigations in which he is often aided by his friends, T. C. (Roger E. Mosley) and Rick (Larry Manetti). Both are former U.S. Marines who served with Magnum in an elite unit in Vietnam. Magnum is also both aided and thwarted by the majordomo of Robin’s Nest, Jonathan Quayle Higgins III (John Hillerman), an Englishman and retired sergeant major in the British army. Humorous conflicts often erupt between Magnum and Higgins, sometimes involving Higgins’s Doberman pinschers, Zeus and Apollo, or stemming from superficial disputes about Magnum’s privileges on the estate. Despite the nearly episodic conflicts between the two, Magnum and Higgins develop a deep friendship over the course of the series.

Magnum, P.I. surpassed viewers’ expectations, distinguishing itself from other detective shows of the 1980’s, not only because it was both more dramatic and more humorous than the average such show, but also because it featured complex characters whose present lives were haunted by their pasts. Many episodes featured flashbacks to Magnum’s past, particularly to his Vietnam War experiences. These flashbacks, triggered by relevant events in the present, both expanded the scope of the narrative and reconstructed the characters’ past, fleshing out their motivations and psyches. According to critic Rodney Buxton, although
past actions might not have an immediate impact on any individual weekly narrative, the overall effect was to expand the range of traits which characters might invoke in any given situation . . . the cumulative strategy offered a richness of narrative, moving beyond the simpler “who-done-it.”

Impact Magnum, P.I. introduced viewers to a new kind of Vietnam veteran, someone unlike the Rambo vigilante, someone scarred by Vietnam but not lost. Magnum’s heroic appeal was enhanced by his humanity and imperfections, and his investigations provided viewers with diverting mysteries to solve. The series captured Americans’ struggle to understand the past and the legacy of the Vietnam conflict, by insistently making reference to that past in order to make sense of the characters’ present.

Further Reading
Brooks, Tim, and Earle Marsh. “Magnum P.I.” The Complete Directory to Prime Time Network and Cable TV Shows: 1946-Present. 8th rev. ed. New York: Ballantine Books, 2003.
Haines, Harry W. “The Pride Is Back: Rambo, Magnum P.I., and the Return Trip to Vietnam.” In Cultural Legacies of Vietnam: Uses of the Past in the Present, edited by Richard Morris and Peter Ehrenhaus. Norwood, N.J.: Ablex, 1990.
Renée Love

See also
Full Metal Jacket; Miniseries; Platoon; Rambo; Television; Vietnam Veterans Memorial.
■ Mainstreaming in education

Definition Instructional practice in which all students of the same age learn together, regardless of capability

Mainstreaming was developed to ensure students with disabilities equal access to public education. Making it work required significant resources and reform of existing educational strategies.

From the 1920’s to the 1970’s, students with disabilities were taught separately from other students. That practice changed when Congress passed laws to ensure that disabled students were not discriminated against. In 1973, Congress passed the Rehabilitation Act. By 1975, that law was joined by the Education for All Handicapped Children Act. These laws
aimed to allow disabled children to benefit from social associations with their peers (and vice versa). The strategy was called “mainstreaming.” Children with disabilities were allowed in regular classrooms for all or part of the school day. Activists began to explore the feasible boundaries of such inclusiveness. They took their cues from the civil rights era, hoping that no child would be sidelined into unacceptable “separate but equal” learning. They wanted to lessen the stigma for such children, advocating that each student receive an individualized education plan (IEP) tailored to his or her needs. These plans engaged all the stakeholders in a disabled child’s education: the child, the relatives, the school, and medical experts.

The Courts Weigh In In 1982, the Supreme Court issued a decision in Board of Education v. Rowley. The Court determined that schools are not obliged to provide services to maximize a child’s potential and that schools, rather than external arbiters, should decide what is appropriate educationally. During the 1980’s, the states further defined inclusion in such U.S. Court of Appeals cases as Roncker v. Walter (1983), Devries v. Fairfax County School Board (1989), and Briggs v. Board of Education (1989). By 1989, the federal courts began to order schools to institute mainstreaming. The U.S. Court of Appeals for the Fifth Circuit in Daniel R. R. v. State Board of Education (1989) clearly expected schools to do more than make a token gesture. Instead, it said that the schools must be prepared to teach children with disabilities, by providing themselves in advance with the necessary aids and services.

Educators Face Problems in Mainstreaming
While the courts were deliberating, educators saw three problem areas. First, boys were being classified as disabled much more often than were girls. Minority students, such as African Americans, Latinos, and non-native English speakers, were also classified more often. Educators wondered how poverty, gender, and race factored into the situation: Were these students really disabled, or were the schools’ assessment tools skewed?

Second, there were limits to the feasibility of mainstreaming. Perhaps a student with a behavioral disorder, despite much assistance, remained too disruptive to the rest of the class. Schools had to figure out what to do in such cases. Many schools tried “resource rooms,” where mainstreamed students could
go for part of the day for individualized tutoring or small-group instruction. Some schools brought another teacher into the classroom to give extra assistance to students who needed it. Throughout the decade, educators and parents sought to define the legal phrase “least restrictive environment,” which specified one of the requirements for disabled students. For each child, they tried to find the type of setting that gave the greatest exposure to other students but that still provided that child with the best instruction.

Third, most American schools were simply not yet prepared to institute mainstreaming in practice, even once they figured out how to do so in theory. As a result, there could be no quick fix to the problem. Even those who were convinced that mainstreaming was the right thing to do did not yet understand what it entailed in terms of necessary resources and expenditures. Thus, schools instituting mainstreaming had to take careful stock of its impact on the classroom. They had to make sure that their teachers were adequately trained, that the classrooms were equipped with necessary physical aids, and that appropriate teaching assistance was provided. Teachers had to begin to learn how to teach in more dynamic, sensory ways. Schools started to realize that there were other issues to work on, such as engaging parents in the process, arranging transportation for students, funding programs in urban and rural areas, hiring qualified staff, and providing home tutoring. There was no consensus as to what kind of diploma a graduating mainstreamed student should receive. Several institutes were set up to address all these questions. The Badger School, in Madison, Wisconsin, was created as a school for the severely handicapped, while the Juniper Garden Project at the University of Kansas conducted research focused on disabled students of color.

Congress passed more laws to aid the adoption of mainstreaming. In 1983, Public Law 98-199 included funds to prepare disabled students for the transition from school to the workplace. In 1986, Public Law 99-457 expanded governmental intervention to aid disabled children, instituting programs designed to help such children from birth.

Impact Mainstreaming became the preferred method for educating students with disabilities. The courts defined the parameters of such students’ inclusion in mainstream classrooms and began to order that such inclusion take place. Schools started to offer services from birth through the transition to employment, helping students with disabilities to become productive members of society without segregating them from the general student population.

Further Reading
Allen, K. Eileen, and Glynnis Edwards Cowdery. The Exceptional Child. 5th ed. Clifton Park, N.Y.: Thomson Delmar Learning, 2004. Practical guidebook for parents and teachers; geared toward understanding and providing developmentally appropriate educational strategies.
Burns, Edward. The Special Education Consultant Teacher. Springfield, Ill.: Charles C Thomas, 2004. Explains how consultants help disabled children learn with their peers.
Grenot-Scheyer, Marquita, Mary Fisher, and Debbie Staub. At the End of the Day. Baltimore: Brookes, 2000. Case studies of mainstreamed students with diverse disabilities, from preschoolers to high schoolers.
Rawson, M. Jean. A Manual of Special Education Law for Educators and Parents. Naples, Fla.: Morgen, 2000. Explains the legal requirements governing education of disabled children.
Rief, Sandra F. M. A., and Julie A. Heimburge. How to Reach and Teach All Children in the Inclusive Classroom. 2d ed. San Francisco: Jossey-Bass, 2006. Practical guide for mainstreaming, focused on addressing the needs of both students with and students without disabilities.
Sands, Deanna J., Elizabeth Kozleski, and Nancy French. Inclusive Education for the Twenty-First Century. Belmont, Calif.: Wadsworth, 2000. Introduces the ecological approach to mainstreaming in education.
Ysseldyke, James E., and Bob Algozzine. Working with Families and Community Agencies to Support Students with Special Needs. Thousand Oaks, Calif.: Corwin, 2006. Practical advice for teachers that focuses on extracurricular resources that they can bring to bear.
Jan Hall

See also Disability rights movement; Education in Canada; Education in the United States; Multiculturalism in education; Nation at Risk, A; National Education Summit of 1989; Racial discrimination; Standards and accountability in education; Supreme Court decisions; White, Ryan.
■ Malathion spraying

The Event An insecticide is widely used on crops and released over populated areas
Date July, 1981

Though used since the 1950’s and considered safe, malathion’s application by aerial spraying over a large populated area of California to kill Mediterranean fruit flies created public furor.
A neurotoxin, malathion is an organophosphate insecticide. Because mammalian liver enzymes neutralize malathion and because it degrades quickly in the environment, it is considered to be the least harmful organophosphate. Used in both ground and aerial spraying, it controls crop pests such as the Mediterranean fruit fly (Medfly), aphid, and cotton boll weevil, as well as home garden pests. Before the 1980’s, malathion was widely used, notably on citrus crops in Florida.

Medflies infest more than two hundred different crops and have the potential to cause great economic damage. In June, 1980, a male Medfly was found in a trap in California. By July, a commission had quarantined large areas in Los Angeles and Santa Clara counties. When the release of sterile male Medflies failed to curb the infestation, the U.S. Department of Agriculture proposed spraying with malathion. Fearing adverse effects on public health, some county officials voted to ban the spraying. This action spurred the California State Department of Health Services to survey literature on the effects of malathion on human health. When the department found little evidence of harm to humans, ground spraying commenced. However, in June, 1981, Medfly larvae, which eat the fruit, were discovered. Sensitive to the demands of environmentalists, Governor Jerry Brown opposed the recommendation for aerial spraying of both agricultural and inhabited areas. When the federal government threatened to quarantine California produce, a threat to the $14 billion industry and to U.S. food prices in general, the governor changed his mind.

Officials argued that the economic benefits far outweighed any possible risks to humans. They pointed out that because malathion was mixed with sugar and molasses in the aerial spray, it was difficult to inhale, and the director of the California Conservation Corps publicly drank a small quantity of very dilute malathion to demonstrate its safety. Aerial spraying of 1,400 square miles began in mid-July. During the spraying, residents followed simple rules, such as remaining indoors during the actual spraying, closing windows, removing toys and household articles from yards, and covering cars with sheeting. Few went to Red Cross shelters established as refuges. In the summer of 1982, Medflies were discovered in the fruit-growing regions, and malathion spraying commenced immediately.

The Mediterranean fruit fly, or Medfly, is a worldwide agricultural pest. In 1981, California governor Jerry Brown ordered the spraying of malathion insecticide to combat Medfly infestation. (USDA/Scott Bauer)

Impact In September, 1982, officials announced that the Medfly had been eradicated. Spraying with malathion continued off and on in California and Florida for the rest of the decade. Studies of children born to women pregnant during the spraying found
no additional birth defects or premature deliveries. However, it was revealed that some of the male Medflies released were actually fertile. Another study suggested that aerial spraying might not have been necessary, because the belief that ground spraying had failed resulted from miscalculations of the extent of infested areas rather than from actual failure.

Further Reading
Marco, Gino, Robert Hollingworth, and William Durham, eds. Silent Spring Revisited. Washington, D.C.: American Chemical Society, 1987.
Pimentel, D., and H. Lehman, eds. The Pesticide Question: Environment, Economics, and Ethics. New York: Chapman & Hall, 1993.
Kristen L. Zacharias

See also
Agriculture in the United States; Biopesticides; Environmental movement.
■ Mamet, David

Identification American playwright, film director, and author
Born November 30, 1947; Chicago, Illinois

Mamet changed the style and substance of American drama, introducing a new, distinctively stylized idiom of speech to the stage and emphasizing the perspectives of ordinary working people. During the 1980’s, Mamet continued to write plays, as he also began to branch out into other media.

By 1980, David Mamet had written more than a dozen plays. During the 1980’s, he became even more prolific: He more than doubled the number of plays to his credit, while he also began to write screenplays, nonfiction, and even children’s literature. The early 1980’s was a highly productive period for him. Mamet’s Glengarry Glen Ross (pr., pb. 1983) earned both a New York Drama Critics Award for best American play and a Pulitzer Prize. His second screenplay, for Sidney Lumet’s The Verdict (1982), was nominated for an Academy Award. Mamet also began to write children’s picture books with his first wife, actress Lindsey Crouse, and published several essay collections treating such diverse topics as American theater and culture, film directing, religion, politics, and friendship. For all his phenomenal productivity in varied
genres, Mamet will be best remembered for his great 1980’s stage dramas, particularly Glengarry Glen Ross. He established himself as a “language playwright,” an artist with a finely tuned ear for American working-class speech, especially the inflections and rhythms of his hometown, Chicago, where he perfected his craft. He possessed the ability to both capture and transform that speech, raising it to the level of art. Mamet was relatively indifferent to the commercial allure of Broadway, perhaps in part because his hard-edged, often profane dialogue limited his mass appeal, as did his sharp, intellectual critiques of American society and mores.

Even Mamet’s films tended to have a wicked satiric edge, beginning most notably with the devious House of Games (1987). This film, written by Mamet and starring Crouse, was also the playwright’s directorial debut. Mamet’s first exploration of confidence games and criminal manipulations, the film follows Crouse as an overconfident psychoanalyst involved with a con man; she learns that she too is being bilked and, worse, that she enjoys these mind games and betrayals. Ultimately, the audience too is conned, rooting for characters who are far different from how they seem.

At the time a seeming departure into new territory, House of Games was nevertheless attuned to the playwright’s central interests: the sound and sense of American working-class speech (delivered brilliantly by a Mamet favorite, Joe Mantegna, as the con man), ruthless economic exploitation (as epitomized in Glengarry Glen Ross), and the rhetorical manipulation inherent in competitive social situations. (Mamet famously said that people may or may not tell the truth, but they always say things to advance their interests.) As he moved into screenwriting and film direction, Mamet, ever the enfant terrible, satirized Hollywood in Speed-the-Plow (pr., pb. 1988), which ran on Broadway featuring pop superstar Madonna in the central role.

Impact Although Mamet became a prolific writer in a variety of genres and an energetic, inventive filmmaker, his greatest influence has been on the language and style of live American theater. He brought new speech patterns to the stage: frank, often rude, but real and familiar. Like England’s Harold Pinter, he also changed what was done on stage, focusing on the emotional wrestling matches behind seemingly clichéd, inane speech.
Further Reading
Bigsby, Christopher W., ed. The Cambridge Companion to David Mamet. New York: Cambridge University Press, 2004.
Brewer, Gay. David Mamet and Film: Illusion/Disillusion in a Wounded Land. Jefferson, N.C.: McFarland, 1993.
Kane, Leslie. David Mamet’s “Glengarry Glen Ross”: Text and Performance. New York: Garland, 1996.
Andrew Macdonald

See also Literature in the United States; Madonna; Theater.
■ Marathon of Hope

The Event Canadian Terry Fox runs more than 2,300 miles in 143 days to raise money for cancer research
Date April 12-September 1, 1980
Place St. John’s, Newfoundland, to Thunder Bay, Ontario, Canada

Despite falling short of his goal to run across Canada, Terry Fox raised millions of dollars while becoming a national icon and a pioneer of charity running.

A multisport athlete in high school, Terry Fox was diagnosed with bone cancer in 1977. After doctors amputated his right leg above the knee, Fox resolved to accomplish an unprecedented athletic feat, a solo run across Canada, to raise funds for the development of cancer treatments. On April 12, 1980, Fox began his run by dipping his prosthetic leg into the Atlantic Ocean on the Newfoundland coast and embarked on a circuitous route that was to cover approximately 5,300 miles through some of the country’s largest urban areas, accompanied by a small entourage of volunteers who provided support and collected money from donors. His initial goal was to raise $1 million for cancer research, but the early success of the run prompted Fox to amend his fundraising goal to $25 million—roughly one dollar for each Canadian citizen. As news of the run spread across Canada, Fox was met with increasingly large crowds at the towns and cities along his route.

For the duration of his run, which he dubbed the “Marathon of Hope,” Fox proposed to run the approximate distance of a marathon (about 42 kilometers, or 26.2 miles) each day, stopping in populated areas to address spectators and collect donations. By the end of August, Fox had averaged more than twenty-three miles a day with very few rest days and was nearly halfway to his destination. However, on September 1, day 143 of his run, Fox began experiencing severe chest pains and was forced to stop running. Tests revealed that Fox’s cancer had returned and spread to his lungs. Fox was hospitalized, succumbing to pneumonia on June 28, 1981.

Terry Fox runs in his Marathon of Hope in 1980. (AP/Wide World Photos)

Impact In addition to the $24.17 million that Fox raised along his Marathon of Hope route, millions more continued to pour into the coffers of his Terry
Fox Foundation following his death. Grants from the foundation to cancer researchers have been cited as critical in the development of several innovations in the detection and treatment of cancers. The foundation continued to grow into the twenty-first century, receiving annual funding from “Terry Fox Runs” conducted in numerous localities across Canada. Charity road races inspired by the Terry Fox Runs would become a staple of the running boom of the 1980’s in the United States and Canada. Terry Fox became a legend in his native Canada, the subject of a plethora of books, films, and documentaries and a role model for cancer patients and survivors.

Further Reading
Coupland, Douglas. Terry: Terry Fox and His Marathon of Hope. Vancouver, B.C.: Douglas & McIntyre, 2005.
Scrivener, Leslie. Terry Fox: His Story. Toronto: McClelland & Stewart, 2000.
Michael H. Burchett

See also
Cancer research; Medicine; Sports.
■ Mariel boatlift

The Event Massive influx of Cuban immigrants to the United States
Date April 1-September 26, 1980

The Cuban government opened the port of Mariel, 119 miles from Key West, Florida, to massive migration from the island. During the next six months, some 125,000 Cubans left for the United States, including an estimated 5,000 forcibly deported former convicts, jailed criminals, and those formerly confined to mental health facilities. The undesirables were confined in American institutions for up to twenty-five years.
The Eighties in America
paid by the U.S. government, brought 260,561 Cubans to America before ending on April 6, 1973. On April 1, 1980, six Cubans seeking asylum crashed a bus into the Peruvian embassy in Havana. Cuban gendarmes outside the embassy opened fire on the vehicle and one guard was killed by a ricochet bullet. Castro responded by publicly announcing the removal of the sentries. Within twenty-four hours, 10,800 Cubans had crowded into the embassy grounds. Castro then invited the exile community abroad to pick up their relatives at the port of Mariel. A huge makeshift flotilla sailed from Florida to Mariel in late April. Those seeking their relatives were forced by Cuban authorities to overload their boats with strangers and were told that their family members would later depart in other vessels. Dozens of unseaworthy boats capsized on the return trip, with scores of people drowning, and the U.S. Coast Guard had to be enlisted to perform an average of twenty rescues a day. Castro soon authorized the forced deportation of former convicts, jailed criminals, known homosexuals, prostitutes, and those formerly confined to mental institutions. U.S. president Jimmy Carter, who nine days earlier had welcomed the refugees to the United States with “open heart and open arms,” ordered a halt to the flotilla to exclude undesirables and offered a government-run sealift or airlift if Cuba agreed. Nearly two hundred boats were seized by the U.S. Coast Guard, but Castro scoffed at the cutoff proposal. Some forcibly expelled refugees hijacked commercial planes to Cuba while the boatlift was still in progress. There were thirty-nine successful skyjackings during the next three years. Castro closed the port of Mariel on September 26, 1980, out of concern that the exodus had damaged Carter’s bid for reelection against Ronald Reagan. The refugees were accommodated in U.S. military bases until they could be resettled. In spite of the knowledge that criminals and the mentally disturbed were being sent along with families, minors, and unaccompanied males, no effort was made to segregate those groups. This population mix created disturbances within the camps. Eventually, more than 62,500 refugees were interned in Eglin, Florida (10,025), Fort Chaffee, Arkansas (19,060), Fort Indiantown Gap, Pennsylvania (19,094), and Fort McCoy, Wisconsin (14,362). Approximately 71 percent of the exiles were blue-
The Eighties in America
Mariel boatlift
■
619
A boatload of Cuban refugees departs from Mariel, Cuba, as a soldier watches them wave good-bye on April 28, 1980. (AP/Wide World Photos)
collar workers, and another 8.7 percent were at the professional-managerial level. Males made up a lopsided majority of 70.2 percent. Of the total refugee population, 68.5 percent were less than thirty-six years old. Their average education level was the ninth grade. The majority of the refugees, 28.5 percent of whom had relatives in the United States, eventually settled in Miami. Legal Status, Repatriation, and Deportation
The resettled Mariel refugees received legal status in February, 1984, under the Cuban Adjustment Act of 1966. To prevent "a second Mariel," the Reagan administration signed an immigration agreement with Cuba on December 14, 1984, for the repatriation of 2,746 Mariel undesirables and agreed to provide Cubans with twenty thousand immigrant visas annually. Castro suspended the agreement on May 20, 1985, in retaliation for the launch of the U.S. government's Radio Martí broadcasts to Cuba. Meanwhile, Mariel refugees convicted of crimes in the United States were held
for deportation after completing their sentences. The day after the immigration agreement was renewed on November 20, 1987, about one thousand Cuban inmates, outraged at the prospect of being deported, seized the federal detention center in Oakdale, Louisiana, and took twenty-eight employees hostage. Three days later, another one thousand Cuban prisoners in the Atlanta federal penitentiary also rioted and held 102 hostages. Three days after that, the Reagan administration issued a deportation moratorium for the seventy-six hundred Mariel detainees and agreed to review each case individually. Half of them had completed their sentences and were being held in indefinite detention. Within two years, 3,200 detainees were set free, another 2,000 remained incarcerated, and 122 were deported to Cuba. The repatriations would continue at a trickle until January 12, 2005, when the U.S. Supreme Court ruled against the indefinite detention of the 747 Mariel undesirables imprisoned
since 1980 and ordered them released. By then, another 1,700 had been returned to Cuba.

Impact
The Mariel crisis reflected deficiencies in U.S. immigration and foreign policy and was partly responsible for President Jimmy Carter's failed reelection bid. For the second time in fifteen years, Castro, in utter defiance of U.S. laws, took advantage of the American government's vacillating policy to decree who could come to the United States. It was the largest wave of Cuban refugees to arrive in America, at a cost of $2 billion to the U.S. government.

Further Reading
Doss, Joe Morris. Let the Bastards Go: From Cuba to Freedom on God's Mercy. Baton Rouge: Louisiana State University Press, 2003. Memoir about two Episcopal priests who helped rescue more than four hundred Cuban immigrants during the Mariel boatlift.
Engstrom, David Wells. Presidential Decision-Making Adrift: The Carter Administration and the Mariel Boatlift. Lanham, Md.: Rowman & Littlefield, 1997. Analysis of the Carter administration's mishandling of the Mariel crisis.
Hamm, Mark S. The Abandoned Ones: The Imprisonment and Uprising of the Mariel Boat People. Boston: Northeastern University Press, 1995. Examines the 1987 Oakdale and Atlanta prison riots.
Larzelere, Alex. Castro's Ploy—America's Dilemma: The 1980 Cuban Boatlift. Washington, D.C.: National Defense University Press, 1988. Analysis of the crisis by a U.S. Coast Guard captain.
Antonio Rafael de la Cova
See also
Cold War; Crime; Elections in the United States, 1980; Foreign policy of the United States; Immigration to the United States; Latinos; Reagan, Ronald.
■ Marriage and divorce Definition
Social institution under which two people become legally united, and the legal dissolution thereof
During the 1980's, many couples chose to delay—or seek alternatives to—traditional marriage, and single parenthood became a significant aspect both of American demographics and of popular debates about marriage. Countervailing forces generated by these debates rendered marriage both more and less conventional than it had been in earlier eras.

In the 1980's, there was a reaction against many of the cultural changes of the 1960's and 1970's. Ronald Reagan's election as president of the United States signaled a shift from the "anything goes" attitude that seemed to characterize those decades to one that sought to turn back the clock to what were portrayed as more traditional values in everything from politics to family. Nancy Reagan's Just Say No campaign against drugs quickly generalized to include premarital sexual permissiveness as well. Jerry Falwell and his Moral Majority sought to reinstitute conservative values across most of social life. Nevertheless, change continued, as women kept moving into the workforce and Geraldine Ferraro became the first female vice presidential candidate nominated by a major party. Such popular movies as Fatal Attraction (1987) and sex, lies, and videotape (1989) portrayed the dangers of non-marital relationships, while others, such as When Harry Met Sally . . . (1989), demonstrated that even those relationships begun non-traditionally could work. Meanwhile, television shows like Dallas, Dynasty, and thirtysomething portrayed the marital woes of otherwise successful people.

Marriage
The premarital permissiveness that characterized prior decades continued in the 1980’s, albeit with uneven acceleration. While births out of wedlock increased from 650,000 in 1980 to over 1 million by the decade’s end, premarital cohabitation increased by only 80 percent—a far slower rate than the 300 percent growth of the 1970’s. Increased cohabitation contributed to a significant increase in age at first marriage during the decade, from 24.7 to 26.2 for men and from 22.0 to 23.8 for women. Partly as a result of these factors, the proportion of married Americans decreased from 66 percent in 1980 to 62 percent in 1989. These figures varied by gender and race. In 1989, 64 percent of men were married, while only 60 percent of women were married. During the decade, the percentage of married whites declined from 67 percent to 64 percent, while the percentage of married African Americans fell from 51 percent to 46 percent. Overall, marriage rates declined during the first half of the decade and rebounded a bit during the second half.
Divorce
Marital dissolution had risen for the two decades preceding the 1980's. However, the divorce rate stabilized at the beginning of the 1980's, and it went on to experience a slight decline. In 1980, the divorce rate was 22.6 divorces per 1,000 married women. By 1989, the rate had fallen to 20.9. This decrease reflected more conservative social mores, as well as increased cohabitation weeding out some of those most likely to divorce. Again, a gender and racial disparity was evident, with 7 percent of men and 9 percent of women being divorced in 1989. The number of divorced persons per 1,000 married persons was 92 for whites and 203 for African Americans at the beginning of the decade and 133 and 282, respectively, at its end. Canadian divorce rates, meanwhile, continued to climb until the latter part of the decade before they stabilized.
Marital Status, by Race and Hispanic Origin, for People Eighteen and Older, 1980 and 1990

Marital Status         1980*      % of Total    1990*      % of Total
All races              159,528    100.0         181,849    100.0
  Married              104,564     65.5         112,552     61.9
  Unmarried:            54,964     34.5          69,297     38.1
    Never married       32,342     20.3          40,361     22.2
    Widowed             12,734      8.0          13,810      7.6
    Divorced             9,886      6.2          15,125      8.3
White                  139,480    100.0         155,454    100.0
  Married               93,800     67.2          99,450     64.0
  Unmarried:            45,681     32.8          56,004     36.0
    Never married       26,405     18.9          31,633     20.3
    Widowed             10,938      7.8          11,730      7.5
    Divorced             8,338      6.0          12,640      8.1
Black                   16,638    100.0          20,320    100.0
  Married                8,545     51.4           9,302     45.8
  Unmarried:             8,093     48.6          11,018     54.2
    Never married        5,070     30.5           7,141     35.1
    Widowed              1,627      9.8           1,730      8.5
    Divorced             1,396      8.4           2,146     10.6
Hispanic origin**        7,888    100.0          13,560    100.0
  Married                5,176     65.6           8,365     61.7
  Unmarried:             2,711     34.4           5,195     38.3
    Never married        1,901     24.1           3,694     27.2
    Widowed                350      4.4             548      4.0
    Divorced               460      5.8             952      7.0

* Numbers in thousands
** Persons of Hispanic origin may be of any race.
Source: U.S. Census Bureau.
Remarriage
While increased cohabitation and stable but high divorce rates seemed to indicate disenchantment with marriage, this disenchantment was mostly with particular marriages rather than the institution itself, as indicated by the fairly high rate of remarriage. At the end of the 1980's, approximately half of all marriages were remarriages for at least one of the participants. This figure was a bit lower than it had been at the beginning of the decade, but it seemed to indicate that most Americans—even those without that personal experience—believed that marriages could work.

Impact
American ambivalence about traditional marriage and family status, as well as about emerging alternatives, was accentuated during the 1980's. This ambivalence was focused in large part on working women, single mothers, and other people living "nontraditional" lifestyles. It therefore both reflected and played a part in the growing "culture wars" in U.S. society.

Further Reading
Amato, Paul, et al. "Continuity and Change in Marital Quality Between 1980 and 2000." Journal of Marriage and the Family 65 (2003): 1-22. A longitudinal survey study by some prominent family researchers on the factors that influence marital quality and how these changed during the 1980's.
Binstock, Georgina, and Arland Thornton. "Separations, Reconciliations, and Living Apart in Cohabiting and Marital Unions." Journal of Marriage and the Family 65 (2003): 432-443. Panel study that tracks the timing of, and contributions to, the formation and dissolution of marriage and like arrangements.
Coontz, Stephanie. The Way We Never Were: American Families and the Nostalgia Trap. New York: Basic Books, 1992. A prominent historian debunks popular misconceptions of families of the past.
_______. The Way We Really Are: Coming to Terms with America's Changing Families. New York: Basic Books, 1997. In this follow-up to her initial offering, Coontz clarifies the contemporary circumstances of U.S. families.
Porter, Eduardo, and Michelle O'Donnell. "More Singles, and Mostly Men." Star Tribune, Paper of the Twin Cities, August 6, 2006, p. A4. Two New York Times journalists use census data from 1980 and twenty-first century interviews to document increases in singlehood.
Saluter, Arlene F. "Marital Status and Living Arrangements, March, 1992." Current Population Reports. Washington, D.C.: U.S. Bureau of the Census, 1992. Comparative census statistics on U.S. family and two-unrelated-adult households from 1980 to 1992.
Schoen, Robert, and Vladimir Canudas-Romo. "Timing Effects on Divorce: Twentieth Century Experience in the United States." Journal of Marriage and the Family 68 (2006): 749-758. Critique of conventional methods for calculating the probability of divorce and use of divorce data from throughout the twentieth century to provide more accurate estimates.
Thornton, Arland, and Linda Young-Demarco. "Four Decades of Trends in Attitudes Toward Family Issues in the United States." Journal of Marriage and the Family 63 (2001): 1009-1037. Historical analysis of opinion research relevant to family issues such as marriage and divorce.
Wahdhera, Surinder, and Jill Strachan. "Demographic Trends of Marriages in Canada, 1921-1990." Health Reports 4, no. 4 (March, 1992): 403-421. Charts changes in the propensity to marry, the age at first marriage, and the tendency toward remarriage of previously divorced persons in Canada.
Zinn, Maxine Baca, and D. Stanley Eitzen. Diversity in Families. Boston: Allyn and Bacon, 2002. Two prominent social-conflict theorists detail changes in function and structure that characterize racial and ethnic minority families in the United States.
Scott Magnuson-Martinson
See also Dallas; Demographics of Canada; Demographics of the United States; Dynasty; Falwell, Jerry; Family Ties; Fatal Attraction; Ferraro, Geraldine; Homosexuality and gay rights; Just Say No campaign; Reagan, Ronald; sex, lies, and videotape; thirtysomething; When Harry Met Sally . . . ; Women in the workforce.
■ Married . . . with Children Identification Television comedy series Date Aired from April 5, 1987, to June 9, 1997
As the first FOX network prime-time sitcom, Married . . . with Children marketed itself as an anti-sitcom. Instead of featuring a wholesome, likable family and gentle humor, the show employed dark, raunchy humor to depict a less sanitized version of the American family. The working name for the sitcom Married . . . with Children was appropriately Not the Cosbys, because it
was conceived as the antithesis to the idealized portrayal of a middle-class family in most 1980's sitcoms, especially The Cosby Show. Married . . . with Children differentiated itself from such other sitcoms by pushing the limits of what was desirable or permissible on television. It employed cruder humor with a pointed undercurrent of satire. As the first FOX sitcom, Married . . . with Children made FOX a competitor to the Big Three television networks, and its tone became a major aspect of the fledgling network's attempt to develop a coherent brand identity. Thus, the edgy rejection of idealizing American institutions evident in the sitcom became a trademark of the FOX network itself. The show became FOX's longest-running live-action sitcom, running for a total of eleven seasons.

Married . . . with Children was conventional in one respect: It focused on the home life of a single Chicago family. The father, Al Bundy (Ed O'Neill), outwardly displayed discontent with his tragically dissatisfying life as a shoe salesman; his wife, Peggy (Katey Sagal), refused the role of the typical housewife but also refused to work; his daughter, Kelly (Christina Applegate), was portrayed as stupid and promiscuous; and his son, Bud (David Faustino), was defined largely by his inexperience with women, as well as his propensity for exploiting his sister's lack of intelligence. The Bundys' dog, Buck, was also a significant character, whose thoughts were heard in voice-over. As was typical of many of the sitcoms that Married . . . with Children skewered, the Bundys' next-door neighbors were recurring characters. Marcy (Amanda Bearse) was the breadwinner for her household and was frequently Al's nemesis. Her first husband, Steve (David Garrison), hatched get-rich-quick schemes. When Garrison left the show, his character was replaced by a second husband, Jefferson (Ted McGinley), who was portrayed as a male bimbo and trophy husband.

Married . . . with Children made near caricatures of its central characters and poked fun at familial expectations and social roles; it defied the family ideal by treating family as a curse. Indeed, most episodes focused on the "Bundy Curse," the endless stream of bad luck that thwarted Al at every turn and prevented him from ever living a satisfying life. The humor of the show rested on Al's inability to succeed, and Al was often forced to be
content with his family and his dismal yet comfortable life. Al humorously avoided sex with Peggy, overused the toilet, attended strip clubs (Peggy did the same), sent his son to strip clubs, and was known by the trademark move of putting his hand in the waistband of his pants as he sat in front of his television. Still, there were moments of redemption for the character, when he convinced his family to work together (often for one pessimistic cause), when he would grudgingly admit to loving his wife, or when he defended his daughter by beating up her boyfriends. Even with some traditional sitcom characteristics, the show was successful primarily because of its explicit attack on the saccharine idealization of the family perpetrated by other sitcoms, as well as its embrace of vulgar humor that other sitcoms avoided. Both the coarse humor and the gleeful embrace of ugliness in its portrayal of family values connected to a different side of viewers from that addressed by the competing networks. The show's exaggerated stereotypes and crude yet honest characters set it—and FOX—apart.

Impact
Married . . . with Children put the FOX network into the running with other prime-time television networks by providing a new type of sitcom that focused on the humor of pessimism. Indeed, perhaps its greatest function was to give voice to American pessimism at a time when the other networks were largely in agreement with President Ronald Reagan that it was “Morning in America.” The show made dysfunction acceptable, precisely because the dysfunction it portrayed was recognizable to a generation that could not see itself in the sitcoms of the Big Three networks. The series thereby opened new doors for sitcoms that strayed from traditional familial roles.
Further Reading
Jacik, Anton. The Official "Married . . . with Children" Trivia Book. Charleston, S.C.: BookSurge, 2004.
Lasswell, Mark. TV Guide: 50 Years of Television. New York: Crown, 2002.
Jean Prokott
See also Cheers; Cosby Show, The; Designing Women; Facts of Life, The; Family Ties; FOX network; Golden Girls, The; Sitcoms; Television; Wonder Years, The.
■ Martial arts Definition
Specialized forms of hand-to-hand combat and self-defense that are also practiced as sports
During the 1980's, a wide variety of international martial arts became popular in the United States—as disciplines in which to train, as spectator sports, and as the subjects of movies and television programs.

There are dozens of martial arts schools and forms. These various schools originated in a number of regions, including Europe, Asia, Africa, and the Americas. Most often, they were originally developed as forms of hand-to-hand combat, or as combined combat styles and philosophical disciplines. Martial arts are often associated with unarmed, bodily combat, but many also employ weapons, including staffs, swords, clubs, and bows and arrows. Skilled martial artists may also be trained to employ random objects to augment the offensive and defensive capabilities of their hands and feet. The major forms, such as boxing (Europe), karate (Japan), kung-fu (China), and tae kwon do (Korea), have been developed over a period of centuries. Each of the major types of martial arts has given rise to many variations institutionalized in hundreds of schools practicing derivative forms around the world.

Immigrant populations brought martial arts into the United States from the country's beginnings, but most were practiced in military and private schools and did not enter the public limelight. With the advent and spread of various media, the martial arts were developed in dojos, or schools, across the country. They were popularized in print, film, and television. As they became popular, martial arts provided new cultural input and exchange and increased the diversity of fighting arts and awareness of foreign ideologies in the United States. As the process of globalization accelerated during the 1980's, hundreds of derivative schools were developed to satisfy the interests of those wanting to learn either simple self-defense methods or an entire way of life based upon the philosophical systems underlying each of the forms practiced.

Impact
The martial arts expanded enormously in the 1980's as a result of their popularization in the media. Actors such as Bruce Lee in the 1970's created both a movie and a television platform, originating primarily in the staged operas of Beijing and transferred via Hong Kong to Hollywood. In the 1980's, the next generation of performers took center stage. These included Jackie Chan in Hong Kong; Chuck Norris—who originated the American style of karate and became an American icon in the television series Walker, Texas Ranger—and other popular action movie stars such as Jean-Claude Van Damme and Steven Seagal. Each of the many stars and practitioners during the 1980's had his or her own unique background, school, and style of martial arts. Together, they exposed the public to ancient elements of cultures from around the world. Many action films and television shows were produced in the 1980's that, combined with the emergence of thousands of schools across the country, completed the integration of martial arts into mainstream American society.

Further Reading
Borkowski, Cezar. The Complete Idiot's Guide to Martial Arts. Royersford, Pa.: Alpha Press, 1998.
West, David. Chasing Dragons: An Introduction to the Martial Arts Film. New York: I. B. Tauris, 2006.
Michael W. Simpson
See also Action films; Asian Americans; Boxing; Film in the United States; Globalization.
■ Martin, Steve Identification
American comedian, writer, producer, actor, and entertainer
Born August 14, 1945; Waco, Texas

Martin, a leading comedian, writer, producer, and actor, had a strong influence on 1980's popular culture through his comedic movies, recordings, and television performances.

Steve Martin greatly extended his reach in entertainment in the 1980's. As a youth, he had sold magic gadgets at the magic shop on Disneyland's Main Street, where he learned to do magic and balloon tricks. Martin became a comedy writer for the Smothers Brothers in the late 1960's and rose to fame as a "wild and crazy guy" and a zany comedian in the 1970's with his trademark arrow-through-the-head prop. Martin was a frequent guest on Saturday Night Live and gained wide recognition in 1979 by
starring in the movie The Jerk, which he also co-wrote. By the 1980's, Martin had a following comparable to those of rock stars of the era. His entertainment activities also broadened greatly, as he added to his work as a stand-up comedian, writer, producer, and television personality. Mainstream movies followed in the 1980's, as he played starring roles in Pennies from Heaven (1981), Dead Men Don't Wear Plaid (1982), The Man with Two Brains (1983), The Lonely Guy (1984), All of Me (1984), Movers and Shakers (1985), ¡Three Amigos! (1986), Little Shop of Horrors (1986), Roxanne (1987), Planes, Trains, and Automobiles (1987), Dirty Rotten Scoundrels (1988), and Parenthood (1989). Martin also was a writer on many of these films and served as the executive producer of Roxanne and ¡Three Amigos! His acting career also branched out from comedy with All of Me and Roxanne, and he won a Writers Guild of America award for his contribution to the screenplay of Roxanne.

Martin's stand-up comedy also continued to attract a strong following, as he was equally popular as a guest on television and as a live stage performer. He appeared as a frequent guest on shows such as Saturday Night Live, The Tonight Show Starring Johnny Carson, Late Night with David Letterman, and a wide range of other television programs. He also released an album in 1981 called The Steve Martin Brothers, in which he performed comedy routines and demonstrated his significant skill as a banjo player.

Impact
Steve Martin's films and comedy reached people around the United States and the world. His rise from stand-up comedian to film star in the 1980's carried him to recognition by his peers, who in 2005 cited him as one of the top fifteen comedy acts in history. During the 1980's, he began to expand beyond simple comedy, building a platform for more serious and complex endeavors as an actor and writer in later decades.
Steve Martin performs a parody of Michael Jackson’s “Billy Jean” music video in January, 1984. (AP/Wide World Photos)
Further Reading
Martin, Steve. Born Standing Up: A Comic's Life. New York: Scribner, 2007.
Walker, Morris Wayne. Steve Martin: The Magic Years. New York: S.P.I. Books, 2001.
Douglas A. Phillips
See also
Comedians; Film in the United States;
Television.
■ Martin Luther King Day Identification U.S. federal holiday Date Established in 1986; celebrated each year
on the third Monday in January

In 1986, the United States established a federal holiday celebrating the life and achievements of the internationally revered human rights activist Martin Luther King, Jr.

Following the assassination of Martin Luther King, Jr., in April, 1968, Democratic representative John Conyers of Michigan introduced legislation calling for the establishment of a national holiday commemorating the civil rights leader's life and achievements. Congress failed to act on Conyers's bill, despite the lobbying efforts of the Southern Christian
Leadership Conference (SCLC) in 1971, when the organization presented Congress with three million signatures supporting the establishment of a national holiday in King's honor. Civil rights activists continued to campaign for a King holiday throughout the 1970's. Movement toward a King holiday progressed at the state level as well when, in 1973, Illinois became the first state to enact a King holiday law. Massachusetts and Connecticut enacted similar laws the following year, while a 1975 New Jersey State Supreme Court decision ruled that the state provide a paid holiday for state employees in honor of King. By the late 1970's, pressure on Congress to create a federal holiday intensified, with organized marches held in Washington, D.C., as well as lobbying by the King Center. President Jimmy Carter called on Congress to pass King holiday legislation in 1979, and King's widow, Coretta Scott King, testified before joint hearings of Congress in support of the legislation. In 1980, Stevie Wonder released a hit song called "Happy Birthday" that celebrated King and promoted the King holiday movement. Two years later, Wonder and other activists presented more than six million signatures in support of a national holiday to Tip O'Neill of Massachusetts, the Democratic Speaker of the House of Representatives, in an effort to push for legislative action. The King holiday bill, which proposed designating the third Monday of every January as a celebration of King's January 15 birthday, passed the House of Representatives on a bipartisan vote of 338 to 90 in August, 1983. In October of that year, Democratic senator Ted Kennedy of Massachusetts sponsored a corresponding bill in the Senate, which, after a vigorous debate, passed by a vote of 78 to 22. Republican senator Jesse Helms of North Carolina opposed the legislation and had sought unsuccessfully to generate opposition to the proposed holiday by denouncing King's anti-Vietnam War stance and by insisting that King had maintained communist connections. Despite his own misgivings, President Ronald Reagan signed the King holiday bill into law on November 2, 1983, establishing the third Monday of every January, starting in 1986, as the Martin Luther King, Jr., National Holiday.

Impact
Several states, notably New Hampshire, Arizona, and Utah, resisted enacting corresponding King holidays for state employees and institutions. However, public pressure eventually resulted in all
fifty states officially observing Martin Luther King Day. After Congress passed the King Holiday and Service Act in 1994, Martin Luther King Day was additionally designated as a national day of volunteer service for Americans.

Further Reading
Dyson, Michael Eric. I May Not Get There with You: The True Martin Luther King, Jr. New York: The Free Press, 2000.
Hanson, Drew D. The Dream: Martin Luther King, Jr., and the Speech That Inspired a Nation. New York: Ecco, 2003.
Brooke Speer Orr
See also
Affirmative action; African Americans; Conservatism in U.S. politics; Liberalism in U.S. politics; Racial discrimination; Reagan, Ronald.
■ M*A*S*H series finale The Event
The last episode of a long-running dramatic comedy is aired
Date February 28, 1983

The finale of M*A*S*H was seen by over 100 million viewers, making it the most-watched television episode in U.S. history.

M*A*S*H followed the exploits of the staff of the 4077th Mobile Army Surgical Hospital (M*A*S*H), a fictional unit stationed near the front lines of the Korean War from 1950 to 1953. Produced by Twentieth Century-Fox, the thirty-minute weekly series premiered September 17, 1972. M*A*S*H was based in part on a 1968 novel by Richard Hooker (a pseudonym of former Army doctor H. Richard Hornberger) but more directly on Robert Altman's 1970 motion picture adaptation of the novel. Like the film, the initial series coincided with the growing unpopularity of the Vietnam War among Americans, and its Korean setting was interpreted by many as a stand-in for Vietnam. Four original cast members remained with the series for its entire eleven-year run: Alan Alda as chief surgeon Captain Benjamin Franklin "Hawkeye" Pierce; William Christopher as Chaplain Francis John Patrick Mulcahy; Loretta Swit as head nurse Major Margaret Houlihan; and Jamie Farr as company clerk Maxwell Q. Klinger. The ensemble decided to
end the series after ten seasons, but the Columbia Broadcasting System (CBS) and Twentieth Century-Fox persuaded them to participate in an abbreviated eleventh season. The finale followed 250 half-hour episodes, which were produced over a time span almost four times as long as that of the Korean War. As the series progressed, it tempered its early comedic satire with quite serious and dramatic story lines. In particular, star Alan Alda, who directed the finale, used his character to espouse liberal political views and address more serious topics during the latter years of the series. The two-and-a-half-hour finale reflects the series' evolving tone and focus. Titled "Goodbye, Farewell, and Amen," it begins in familiar fashion. A helicopter bringing wounded to the unit is met by dedicated M*A*S*H personnel rushing to the landing pad. (The show's opening theme music for all eleven years was "Suicide Is Painless," which
had been performed in the 1970 movie.) Through a series of flashbacks, viewers learn that Hawkeye is being treated at a psychiatric facility after suffering a breakdown in response to witnessing a terrible event. Through conversations with recurring character Sydney Freedman, an Army psychiatrist, Hawkeye remembers the event in stages: He was on a bus full of people that had to hide from an enemy patrol. One of his fellow passengers killed a chicken she was carrying in order to keep it quiet so they would not be found. Finally, Hawkeye remembers that she killed, not her chicken, but her infant. The emergence of the suppressed memory allows him to regain a modicum of sanity. The entire episode is full of such drama. Klinger futilely searches for his fiancée’s displaced parents. Chaplain Mulcahy suffers hearing loss. A forest fire causes the unit to dismantle its camp and relocate. Major Charles Emerson Winchester III (David
Ogden Stiers) works with a five-piece Chinese orchestra, whose members are subsequently killed in a truck bombing as they leave for an anticipated prisoner exchange.

Members of the cast of M*A*S*H take a break from filming during the show's final season. From left: William Christopher, Harry Morgan, Mike Farrell, Alan Alda, and Jamie Farr. (AP/Wide World Photos)

Impact
M*A*S*H's finale was the most-viewed episode of a regular television series in history. It enjoyed a Nielsen share of 77, meaning that 77 percent of the televisions that were switched on at the time it was broadcast were tuned to the program. The finale was seen by more than 60 percent of American households. Not willing to squander this soapbox on fictional drama alone, the show's producers included statistics on the Korean War as part of the episode. As the cease-fire took effect, the camp's public address system announced that 2 million people were killed or wounded in a war that cost $22 billion. It also listed U.S. military casualties and missing-in-action totals.

Further Reading
Hooker, Richard. M*A*S*H. New York: William Morrow, 1968.
Kalter, Suzy. The Complete Book of M*A*S*H. New York: Harry N. Abrams, 1988.
Randy Hines
See also
Sitcoms; Television.
■ Max Headroom
Identification Futuristic television series
Creators Annabel Jankel (b. 1955) and Rocky Morton (b. 1955)
Date Aired from March 31, 1987, to May 5, 1988
Max Headroom, a biting satire of the state of television, journalism, and popular culture in the 1980’s, was the first American television show to depict postmodern cyberpunk culture. Based on a television movie that aired in England in 1984, Max Headroom debuted in the United States on the American Broadcasting Company (ABC) in 1987. The show was set “twenty minutes into the future,” and it featured television news reporter Edison Carter. Played by the Canadian actor Matt Frewer, who also starred in the earlier British production, Carter works for a major television channel, Network 23, in a future society defined by class disparity, violence, powerful corporations, and total
dependence on credit. Television is pervasive in this culture, as it is illegal to equip a television with an "off" switch, and networks see their ratings in real time, so they know second by second if they are gaining or losing viewers and act immediately in response to such changes. People in this future society are tracked through massive databases that catalog their every purchase and every movement, and those few who have escaped being tracked by the system are called "blanks." The blanks live on the fringes of society, inhabiting a nearly post-apocalyptic inner-city slum and trading for the commodities they need to survive on a low-tech black market. Carter is assisted in his field assignments by his "operator," Theora Jones (Amanda Pays), who generally remains at the network offices, where she uses her computer hacking skills to allow Carter access to restricted areas to obtain news stories.

The first episode of the series set the tone of pointed satire directed against television networks. It deals with "blipverts," split-second advertisements that pack a full commercial's worth of information into the blink of an eye, so viewers have no time to change channels before the commercial ends. The blipverts have the minor side effect of causing some viewers to explode. While investigating these mysterious deaths, Carter has a motorcycle accident and goes into a coma. The last thing he sees before losing consciousness is a warning about low clearance on an exit barrier: "Max Headroom 2.3 Meters." In an attempt to discover what secrets Carter knows about Network 23, Bryce Lynch (Chris Young), the boy genius behind research and development for the network, uploads a copy of Carter's brain into a computer. Thus Max Headroom, a cybernetic version of Carter, is born. Max is sentient and able to roam at will around cyberspace—an extremely important skill in a world completely controlled by databases.

Max Headroom was represented as a stylized, animated version of Frewer's head, which appeared on screens within the show against a background of angled neon lines. His characteristic stuttering speech represented glitches in the computer system. Max became a sort of agitator, questioning the practices of the network and passing along sensitive information he discovered in cyberspace to Jones and the fully recovered Carter. Subsequent episodes challenged television staples such as game shows, advertising, ratings, and televangelism.
Impact
The series was short-lived, lasting only fourteen episodes, but it attracted a cult fan base and garnered critical praise for its postmodern examination of television and society. After Max Headroom was canceled, Coca-Cola picked up the Max Headroom character as a spokesperson for its products, specifically New Coke, apparently unconcerned by the sharp critique of both multinational corporations and advertising present in the original show. Max, still played by Frewer in many layers of makeup, went on to have a short-lived talk show on Cinemax.

Further Reading
Abbott, Rebecca L. "Selling Out Max Headroom." In Video Icons and Values, edited by Alan M. Olson, Christopher Parr, and Debra Parr. Albany: State University of New York Press, 1991.
Bukatman, Scott. Terminal Identity: The Virtual Subject in Postmodern Science Fiction. Durham, N.C.: Duke University Press, 1993.
Lentz, Harris M. Science Fiction, Horror, and Fantasy Film and Television Credits, Supplement 2, Through 1993. Jefferson, N.C.: McFarland, 1994.
Roberts, Steve. Max Headroom: The Picture Book of the Film. New York: Random House, 1986.
Ross, Andrew. "Techno-ethics and Tele-ethics: Three Lives in the Day of Max Headroom." In Logics of Television: Essays in Cultural Criticism, edited by Patricia Mellencamp. London: BFI Books, 1990.
Lacy Schutz
See also
Blade Runner; Cyberpunk literature; Information age; Journalism; MTV; New Coke; Science-fiction films; Television; Tron; Virtual reality.
■ Medicine Definition
Medical discoveries and advances during the decade
The 1980's served as a transition era in which technology developed in previous decades was applied to medical diagnosis and in which innovations such as genetically engineered vaccines emerged. The period also saw the recognition of new diseases such as acquired immunodeficiency syndrome and toxic shock syndrome.

The decades immediately preceding the 1980's saw the development of technologies that, despite their
infancy, showed promise in their application to the diagnosis and treatment of disease. For example, magnetic resonance imaging (MRI) was developed and applied in producing internal images of tissues and organs. Techniques and discoveries in a new field, molecular biology, altered the study of biology from one primarily of observation to one of understanding at the molecular level. The 1980's was arguably an "age of innocence" in medicine, as existing antibiotics were felt to be adequate in treating outbreaks of infectious diseases, a historic problem that scientists and physicians felt had been largely contained.

Recognition of New Diseases
Toxic shock syndrome (TSS), a rare illness of indeterminate cause, was known for much of the twentieth century. In 1980, however, the Centers for Disease Control (CDC) were notified of a sudden outbreak that ultimately affected more than three hundred women; more than two dozen died. The source was ultimately linked to use of a particular brand of tampon: Rely. The disease abated when the manufacturer, Procter & Gamble, removed the tampon from the market. The illness, caused by a toxin produced by two species of bacteria, Staphylococcus aureus and Streptococcus pyogenes, was triggered by the ability of the material of which the tampon was composed to induce production of the toxin.

Of greater significance worldwide was the recognition of what was initially thought to be a new disease, acquired immunodeficiency syndrome (AIDS). Recognition of an immunodeficiency syndrome was first reported in the June 5, 1981, issue of Morbidity and Mortality Weekly Report. The story described an unusual and rare parasitic lung infection, Pneumocystis carinii pneumonia (PCP), in five homosexual men in Los Angeles. An additional outbreak of a rare illness, Kaposi's sarcoma, also appeared that summer among homosexual men in New York. Because until then these diseases had been reported only in homosexuals, they were initially referred to as gay-related immunodeficiency disorder (GRID). The illnesses were found to be associated with the loss of immune function in these victims. Reflecting this relationship, the name of the disease was changed to AIDS the following year. Initially, the disease was thought to be associated with sexual practices, such as the use of "poppers" (amyl nitrite) to enhance
sexual feelings. Others argued for a role of "sperm overload." In 1983 and 1984, a newly discovered virus, eventually named human immunodeficiency virus (HIV), was shown by two independent laboratories to be the etiological agent of AIDS. Tens of thousands of persons in the United States, and larger numbers worldwide, were infected by the end of the decade, and one-third of these persons died. Victims included prominent members of the entertainment community such as Liberace and Rock Hudson. Nevertheless, the administration of President Ronald Reagan largely ignored the outbreak until it had arguably spread out of control. By 1989, the first effective drug for treatment of AIDS, azidothymidine (AZT), had been licensed by the Food and Drug Administration (FDA). The closing of many of the bathhouses linked to spread of the disease in larger cities, as well as education of the homosexual communities, served to slow the spread in this population. However, the increase in numbers of intravenous drug abusers more than offset any reduction in numbers from the gay population.

From Prions to Lifestyle Diseases
In 1982, Stanley B. Prusiner, a neurologist at the University of California, San Francisco, proposed that a "proteinaceous infectious particle," known as a prion, was the etiological agent behind scrapie, a neurodegenerative disease of sheep. The same type of agent was also shown to be the cause of a number of human diseases, most notably Creutzfeldt-Jakob disease (CJD). Ingestion of beef contaminated with the related cattle prion by large numbers of the British population during the latter half of the decade resulted several years later in the diagnosis of more than one hundred cases of a variant form of CJD in this population. The cattle illness became known as "mad cow disease." An unusual aspect of prions was the lack of genetic material in their structure, one composed entirely of protein. Since all forms of replicating biological agents were thought at the time to contain either deoxyribonucleic acid (DNA) or ribonucleic acid (RNA), Prusiner's proposal of a "replicating protein" was highly controversial. He was proven correct and was awarded the Nobel Prize in Physiology or Medicine in 1997.

Gastric ulcers were historically linked to effects of lifestyle, particularly diet and stress. Treatments generally consisted of stress therapy or the use of acid inhibitors such as Zantac or antacids to neutralize gastric acid. In 1983, Australian physician Barry Marshall discovered that a bacterium, Helicobacter pylori, was associated with both ulcer formation and (subsequently) certain forms of stomach cancer. Marshall began prescribing antibiotics for treatment of ulcers, noting a significant reduction in their recurrence. Marshall would be awarded the Nobel Prize in Physiology or Medicine in 2005.

The association between smoking and lung cancer, well known since the publication of Smoking and Health by the office of the U.S. surgeon general in 1964, continued in the news. In 1982, Surgeon General C. Everett Koop referred to smoking as "the chief preventable cause of death." Lung cancer associated with smoking killed more than 100,000 Americans that year, six times the level reported thirty years earlier; deaths from lung cancer in women, most of which were associated with smoking, passed those resulting from breast cancer.

Pharmaceuticals
Antacids historically were based upon neutralization of acids produced in the stomach and generally consisted of forms of bicarbonate salts. The exception was Tagamet, an antiulcer drug introduced by SmithKline in the 1970's that blocked acid production. However, it often produced unwanted side effects. In 1981, Glaxo Pharmaceuticals began the sale of Zantac, a drug similar in action to Tagamet but with fewer side effects. Within five years, the popularity of Zantac would place it at the top of prescription drug sales. Zantac was shown to be both safe and effective, and it would eventually be sold over the counter. Over-the-counter pain medication at the beginning of the decade consisted primarily of aspirin and acetaminophen (such as Tylenol). In 1983, ibuprofen, formerly available only through prescription, became available over the counter. Ironically, Nobel laureate Ulf von Euler, whose work contributed to the development of acetaminophen, died in March of that year at age seventy-eight. Extended use of aspirin and acetaminophen in high doses would later be linked to liver damage, and long-term use of ibuprofen in high doses was later shown to induce ulcers. Several anti-AIDS drugs were introduced during the decade: isoprinosine and ammonium-tungstoantimoniate (HPA-23) in 1985 and AZT in 1987. Only AZT would be shown to exhibit long-term efficacy against AIDS. Antibiotics directed against
bacterial infections also made their appearance. A new class of antibiotics known as fluoroquinolones was introduced in 1986, the most prominent of which was ciprofloxacin (Cipro). The name Cipro would become familiar to the general public as an effective treatment during the anthrax scare some fifteen years later. The first genetically engineered vaccine, directed against hepatitis B virus (HBV), was also introduced in 1986. An improved rabies vaccine was introduced in 1980, replacing the version that required as many as twenty-three abdominal injections and that dated to the time of Louis Pasteur in the 1880's. The first statin drug for lowering cholesterol, lovastatin, was approved in 1987. The antidepressant drug fluoxetine hydrochloride received FDA approval in December of that year. Marketed by Eli Lilly under the name Prozac, the drug would be used by more than forty million people by the end of the century.

Not all news about pharmaceuticals was upbeat during the decade. In Chicago in 1982, someone placed cyanide in capsules of Tylenol, resulting in seven deaths. In October, 1982, the producer of Tylenol, Johnson & Johnson, recalled all samples of the product. The killer was never identified, but in order to prevent a repeat of the episode, Tylenol packages were triple-sealed to prevent random tampering.

Advances in Technology
New technologies developed in the previous decade continued to be improved. MRI, the basis for which was discovered by Nobel laureates Felix Bloch and Edward Mills Purcell in 1946, was applied to imaging of the body. In 1980, obtaining a useful image required an exposure time of five minutes; by 1986, exposure time was reduced to seconds. Though ultrasonic imaging had been practiced for nearly a half century, improvements that allowed for "real-time" imaging were applied in the field of obstetrics. It became possible to determine details ranging from the sex of a fetus to the presence of fetal malformations; the term "fetal sonography" was coined, representing a new aspect of the field. An unexpected result of the procedure was an increase in abortions of female children in China and India, countries in which the birth of males was considered desirable. The first viable artificial heart implantation, performed by Dr. William DeVries,
took place in December, 1982. The patient, Barney Clark, would live nearly four months with the device before his death due to multiple organ failure.

Impact
The greatest impact of the wide range of medical advances would be in the area of cost. By the end of the decade, more than 11 percent of the U.S. gross national product would go to medical programs, more than $2,750 per capita, double the figure at the beginning of the decade (and a number that would double again the following decade). By law, hospitals could not turn away patients who lacked insurance coverage, a cost that had to be absorbed by other agencies. Increased costs associated with health insurance, borne by individuals as well as corporations, represented a significant portion of these costs. Medical care for the poor would remain a political issue for the foreseeable future. The ability to monitor genetic defects also had unexpected consequences. Development of improved methods of amniocentesis meant that a wide range of birth defects could be detected early in a pregnancy. Once it became possible to detect fatal conditions such as Huntington's disease, the question of whether insurance companies would be required to cover such potential patients remained to be addressed.

Further Reading
Allen, Arthur. Vaccine: The Controversial Story of Medicine's Greatest Lifesaver. New York: W. W. Norton, 2007. The story behind the development of the most significant vaccines, beginning with Edward Jenner and smallpox. The hepatitis B vaccine, developed in the 1980's, is among those described. Also addressed is the issue of whether autism is a possible side effect.
Barlett, Donald L., and James B. Steele. Critical Condition: How Health Care in America Became Big Business—and Bad Medicine. New York: Doubleday, 2004. Story behind the development and evolution of the American health care system. The authors discuss the balance between coverage by companies and government, and the costs associated with the business aspects of medicine.
Childs, Barton. Genetic Medicine: A Logic of Disease. Baltimore: The Johns Hopkins University Press, 2003. Addresses the growing understanding of the relationship between genetic makeup and the environment in the development of disease. Much of the writing is historical.
Garrett, Laurie. The Coming Plague: Newly Emerging Diseases in a World out of Balance. New York: Penguin Books, 1995. Chronicles the spread of infectious disease over the past fifty years. Argues that changes in human behaviors have resulted in increased incidence of disease caused by Ebola, HIV, and other organisms.
Shilts, Randy, and William Greider. And the Band Played On: Politics, People, and the AIDS Epidemic. New York: St. Martin's Press, 2000. Updated description of the outbreak of the AIDS epidemic and how the lack of recognition by the government during the 1980's contributed to its spread. Shilts was a newspaper reporter who later succumbed to the illness.
Richard Adler
See also AIDS epidemic; Alternative medicine; Artificial heart; Baby Fae heart transplantation; Cancer research; Fetal medicine; Genetics research; Health care in Canada; Health care in the United States; Health maintenance organizations (HMOs); Homosexuality and gay rights; Hudson, Rock; Koop, C. Everett; Plastic surgery; Transplantation.
■ Meech Lake Accord Identification
A failed attempt to revise the Canadian Constitution
Date Put forward June 3, 1987; ratification failed June, 1990

In the Meech Lake Accord, Prime Minister Brian Mulroney attempted to complete the process of establishing an independent government for Canada by gaining the province of Quebec's acceptance of the 1982 Canadian constitution.

In 1982, the constitution of Canada was patriated, granting the nation full independence from the United Kingdom. The province of Quebec never formally accepted the new constitution, however. In 1987, Prime Minister Brian Mulroney, who had taken office in 1984, sought to gain Quebec's formal acceptance. Mulroney was bolstered in this effort both by his own political success and by a change of government in Quebec. The sovereignist Parti Québécois, whose former leader René Lévesque had led opposition to the constitution, had been defeated shortly after Lévesque's retirement, and
Liberal Party leader Robert Bourassa had regained the province's premiership.

At the end of April, 1987, Mulroney gathered the premiers of the ten Canadian provinces for a meeting at Meech Lake in Quebec. The meeting resulted in a set of terms designed to gain Quebec's acceptance of the constitution. Among the chief provisions were the recognition of Quebec as a "distinct society" within Canada; veto power for Quebec on future constitutional amendments; increased provincial powers, especially for Quebec, to regulate immigration; compensation for provinces deciding to opt out of federal programs; and continued discussion of reforms relating to the upper house (Senate) of the federal parliament. The accord required the approval of the ten provincial legislatures within a three-year period, by early June of 1990.

Although the accord had widespread support in the early stages of debate, opposition to it grew as the process continued. Among the prominent national figures opposing it was former prime minister Pierre Trudeau, who believed that the accord weakened the federal government by giving too much power to the provinces. The use of the phrase "distinct society" to give special status to Quebec was also unpopular, as was the manner in which the accord itself was reached, by "eleven men in suits" (the premiers and the prime minister) meeting behind closed doors. Opposition in two provinces—Newfoundland and Manitoba—proved decisive. In the former, a new government reversed the province's earlier legislative support. Meanwhile, a First Nations member of Manitoba's legislature, Elijah Harper, worked to block ratification of the accord, because he believed that it overlooked the rights and needs of aboriginal peoples. In the end, the accord remained unratified when the three-year time limit expired.

Impact
The most significant impact of the Meech Lake Accord occurred in Quebec, where its defeat led to renewed support for sovereignism. Quebec leader Lucien Bouchard resigned his cabinet post in the Mulroney government and played a key role in the founding of the new sovereignist group Bloc Québécois. In the broadest sense, the failure of the accord continued to leave the long-term question of Quebec's place within the Canadian federal system uncertain.
Further Reading
Monahan, Patrick J. Meech Lake: The Inside Story. Toronto: University of Toronto, 1991.
Waller, Harold M. "How Not to Govern: Canada's Meech Lake Mistake." The New Leader 73, no. 9 (July 9, 1990): 8-10.
Scott Wright
See also Aboriginal rights in Canada; Bourassa, Robert; Canada Act of 1982; Canada and the British Commonwealth; Canadian Charter of Rights and Freedoms; Lévesque, René; Minorities in Canada; Mulroney, Brian; Quebec referendum of 1980; Trudeau, Pierre.
■ Meese, Edwin, III Identification
Attorney general of the United States from 1985 to 1988
Born December 2, 1931; Oakland, California

Meese served in several capacities as an adviser to President Ronald Reagan, both before and during Reagan's presidency, including serving as U.S. attorney general from February, 1985, to August, 1988.

Edwin Meese III was educated at Yale University and the University of California Law School. In 1966, Meese became an adviser to Ronald Reagan's gubernatorial campaign, and he later joined the governor's staff as legal affairs secretary, eventually rising to become Governor Reagan's chief of staff. Meese also served as chief of staff and senior issues adviser to Reagan during his 1980 presidential campaign and headed Reagan's transition team after the election. Meese, along with White House Chief of Staff James Baker and Deputy Chief of Staff Michael Deaver, made up what was sometimes called the "White House Troika," the innermost circle of Reagan's advisers. Meese served in numerous positions under Reagan before becoming attorney general in February, 1985. As attorney general, Meese strongly advocated the doctrine of "original intent" in interpreting the Constitution. Meese oversaw the review of candidates for federal judicial nominations, and critics charged that Meese submitted these candidates to "litmus tests" on certain issues and on the "original intent" principle. Meese, however, denied these claims. In 1985, at President Reagan's direction, Meese created the Attorney General's Commission
on Pornography (also known as the Meese Commission). The commission released a massive report in July, 1986, stressing the harmful effects of pornography and the connections between the pornography industry and organized crime. The report was widely criticized as biased and extremist by free-press advocates and those involved in the industry. In November, 1986, Meese led an in-house investigation of what eventually became known as the Iran-Contra affair, the illegal diversion of money from arms sales to Iran to fund the counterrevolutionary Contras in Nicaragua. Meese concluded that the diversion of funds was the work of a small number of people acting without the knowledge or approval of the president or other high-level officials. The Iran-Contra affair was the subject of numerous investigations, including that of the independent counsel, Lawrence E. Walsh. While Meese was never prosecuted for involvement in the affair, Walsh’s final report charged that Meese had prior knowledge of some of the illegal acts. Meese continually faced criticism and charges of corruption while in office, mostly involving gifts he received and allegations of influence peddling. Despite numerous complaints and investigations, no charges were ever filed. Nevertheless, Meese tired of the continual scrutiny and resigned as attorney general in August, 1988.
From left: White House Chief of Staff James Baker, Counselor to the President Edwin Meese III, and Deputy Chief of Staff Michael Deaver in December, 1981. The three men were known as President Reagan’s “Troika.” (Courtesy, Ronald Reagan Library)
Impact Meese has been called one of the leading figures of the conservative revolution of the 1980’s, especially in regard to legal affairs and jurisprudence. His controversial tenure as attorney general reflected the polarization of American politics during the 1980’s, as, like many other members of the Reagan administration, he was simultaneously celebrated as a heroic public servant and vilified as a criminal corrupting the nation’s government. Further Reading
Edwards, Lee. To Preserve and Protect: The Life of Edwin Meese III. Washington, D.C.: Heritage Foundation, 2005.
Meese, Edwin. With Reagan: The Inside Story. Washington, D.C.: Regnery/Gateway, 1992.
Mark S. Joy
See also
Conservatism in U.S. politics; Iran-Contra affair; Pornography; Reagan, Ronald; Reagan Revolution; Tower Commission.
■ Mellencamp, John Cougar
Identification American singer and songwriter
Born October 7, 1951; Seymour, Indiana
Mellencamp emerged on the popular music scene in the 1980’s. His popularity and success spread from his midwestern roots and the sense of place at the center of his music, as he wrote and performed songs treating teen and twentysomething relationships, small-town Americana, and the fate of workers and farmers across the United States. During the 1980’s, John Cougar Mellencamp and his band produced six albums and several hit singles. Nothin’ Matters and What If It Did (1980) sold nearly 300,000 copies and included “Cry Baby,” “Tonight,” and “This Time.” American Fool (1982) sold almost three million copies in its first year, and two hit singles from the album, “Hurts So Good” and “Jack and Diane,” ranked in the top ten of the Billboard Hot 100 chart at the same time. It was the first time two songs from the same album had gained simultaneous top-ten rankings since the Beatles accomplished the same feat in the 1960’s. While the album Uh-Huh (1983) produced three hits—“Authority Song,” “Pink Houses,” and “Crumblin’ Down”—it was with Scarecrow (1985) and The Lonesome Jubilee (1987) that Mellencamp succeeded
in bringing populist themes and social realism to the forefront of his music. Dedicating Scarecrow to the memory of his beloved grandfather Speck, Mellencamp captured what he saw as the deleterious effects of Reaganomics on farmers in the hit single “Rain on the Scarecrow.” The album also paid homage to his hometown of Seymour, Indiana, with the popular single “Small Town.” Considered by many to be his most artistically accomplished album, Lonesome Jubilee was certified platinum and incorporated such traditional folk instruments as the fiddle, accordion, dobro, mandolin, and hammered dulcimer to produce “Down and Out in Paradise” and “The Real Life.” Mellencamp concluded a very successful decade with Big Daddy (1989), which featured the haunting ballad “Jackie Brown.” An artist who reached maturity as a singer and songwriter in the 1980’s, Mellencamp wrote lyrics that drew on his own life experiences and on the people he knew and loved.
Impact By grounding his electronically and acoustically generated music in America’s heartland, John Cougar Mellencamp added a distinctive voice and vision to American popular music. Often compared to New Jersey-born rocker Bruce Springsteen, Mellencamp became one of popular music’s most outspoken activists. In tandem with country-western singer Willie Nelson, Mellencamp helped organize three Farm Aid benefit concerts to raise money for farmers in financial trouble. Indianapolis Monthly voted Mellencamp one of Indiana’s “Favorite Sons,” and he was included on the state’s list of “all-time exemplary men.”
Further Reading
Elteren, Mel van. “Populist Rock in Postmodern Society: John Cougar Mellencamp in Perspective.” Journal of Popular Culture 28, no. 3 (1994): 95-123.
Harshfield, David. Manchild for Real: The Life and Lyrics of John Cougar Mellencamp. New York: Vantage Press, 1986.
Torgoff, Martin. American Fool: The Roots and Improbable Rise of John Cougar Mellencamp. New York: St. Martin’s Press, 1986.
West, Evan. “Favorite Sons.” Indianapolis Monthly 28, no. 1 (September, 2004): 138-151.
Kevin Eyster
See also Farm Aid; Farm crisis; Music; Pop music; Reaganomics; Springsteen, Bruce.
■ Meritor Savings Bank v. Vinson
Identification U.S. Supreme Court decision
Date June 19, 1986
The Supreme Court’s unanimous ruling prohibited an employer from subjecting an employee to a sexually hostile work environment. It confirmed the legality of the EEOC’s sexual harassment guidelines and prompted a significant change in U.S. workplace policies and corporate culture. At the time Meritor Savings Bank v. Vinson was brought to court, the Equal Employment Opportunity Commission (EEOC) defined sexual harassment as unwelcome sexual advances, requests for sexual favors, and other verbal or physical conduct of a sexual nature that met any of three specific criteria. First, harassment would result if submission to such conduct was made a term or condition of an individual’s employment. This was called “quid pro quo” sexual harassment. Second, if submission to or rejection of such conduct was used as a basis for employment decisions affecting the employee in question, that would also constitute quid pro quo harassment. Third, sexual harassment would also result from conduct that had the effect of unreasonably interfering with an individual’s work performance or creating a hostile, intimidating, or offensive working environment. This was known as “hostile work environment” sexual harassment. The case resulted from Mechelle Vinson’s lawsuit against her employer, Meritor Savings Bank. Vinson claimed that on numerous occasions she had submitted to the sexual advances of her supervisor in order to keep her job. Evidence at trial revealed that Vinson was hired as a teller-trainee and subsequently promoted to teller, head teller, and assistant branch manager. She was discharged for excessive use of sick leave. She said that she engaged in nonconsensual sex with her supervisor forty to fifty times in a four-year period in order to keep her job and that he fondled her in front of other employees, followed her into the restroom, exposed himself to her, and forcibly raped her on several occasions. The supervisor and the bank denied these allegations. The EEOC had defined sexual harassment in order to enforce Title VII of the Civil Rights Act of 1964. The Supreme Court was thus faced with the task of determining whether the EEOC guidelines were legitimately authorized by that law. The defendants argued that the law applied only to tangible or
economic discrimination, not something as intangible as the workplace environment. The Court determined that the creation of a hostile work environment through sexual harassment was indeed outlawed by Title VII, whose prohibitions included unwelcome verbal or physical sexual behavior that is so extreme or widespread as to subject the employee to psychological strain and create a hostile working environment, altering the conditions of employment. The Court rejected the idea that there could be no sexual harassment just because the sexual relations between the parties were voluntary. The core of a sexual harassment claim, under the Court’s decision, was that the sexual advances were unwelcome. So long as harassing behavior offended the sensibilities of an individual employee, it could constitute sexual harassment. Therefore, each case was to be analyzed based on its own specific facts. Impact After the decision in Meritor Savings Bank v. Vinson, sexual harassment law evolved, and employees grew increasingly aware of their right to be free from sexual harassment in the workplace. Employers instituted policies against discrimination and sexual harassment and conducted training sessions for human resource officers and managers. The Court’s ruling had far-reaching implications in the workplace and also influenced such later high-profile controversies as the Anita Hill-Clarence Thomas hearings, the Tailhook scandal, and the Bill Clinton impeachment. Further Reading
Cochran, Augustus B. Sexual Harassment and the Law: The Mechelle Vinson Case. Lawrence: University Press of Kansas, 2004.
Goldstein, Leslie Friedman. The Constitutional Rights of Women. 2d ed. Madison: University of Wisconsin Press, 1988.
Hoff, Joan. Law, Gender, and Injustice: A Legal History of U.S. Women. New York: New York University Press, 1991.
Mezey, Susan Gluck. Elusive Equality: Women’s Rights, Public Policy, and the Law. Boulder: University of Colorado Press, 2003.
Marcia J. Weiss
See also Feminism; Sexual harassment; Supreme Court decisions; Women in the workforce; Women’s rights.
■ Mexico and the United States
Definition Diplomatic relations between neighboring countries
Despite President Jimmy Carter’s efforts to improve U.S. relations with Mexico in the 1970’s, domestic and international political issues divided the two countries. Much of the ruling Institutional Revolutionary Party in Mexico distrusted the United States for controlling too much of the Mexican economy and extracting too much of Mexico’s wealth. U.S. president Ronald Reagan was eager to improve these relations but had to overcome political tensions that divided the two nations.
In the 1980’s, U.S. and Mexican policies conflicted over several issues. One was foreign policy. The United States government opposed left-leaning Central American governments in an effort to contain communism. While Mexico opposed communism, it chose to support de facto rulers in Central America. Domestic political issues in both the United States and Mexico became international issues when the Mexican economy collapsed in the early 1980’s and both countries suffered increased unemployment. Further, the growing drug trade meant that the United States would need more control over its southern border.
Bilateral Relations During the 1980’s, the Institutional Revolutionary Party (PRI) controlled the Mexican presidency. This stability allowed the government to implement long-term policies. In an effort to reform the Mexican economy, the administrations of José López Portillo and Miguel de la Madrid Hurtado reformed fiscal policies and private-ownership laws and initiated different forms of structural adjustment to conform to prevailing economic theories. To liberalize the economy, the Mexican government discontinued most government subsidies to farmers. Along with economic restructuring shocks, Mexico suffered from oil price fluctuations in the early 1980’s that drastically affected its economic planning. In 1976, Mexico had discovered vast oil reserves and began negotiations to build a U.S.-Mexico oil pipeline. Mexico was to finance this pipeline, but it suffered economic hardships from a drop in oil revenue, owing to a decrease in world oil prices. In August of 1982, the Mexican government announced that it had depleted its foreign-exchange reserves, and foreign investment disappeared from Mexico.
Rural and urban Mexicans suffered from this economic downturn, and many fled to the United States. At the same time, increased drug use and violence in the United States led some to draw connections between illegal drugs and illegal immigrants. State and federal officials, particularly from border states, demanded that the U.S. and Mexican governments stop the flow of illegal drugs and immigrants. Two of the most outspoken U.S. officials were Senator Jesse Helms and Ambassador to Mexico John Gavin. While Helms and Gavin chastised the PRI for Mexico’s flagging economy, other Republicans met with the main opposition to the PRI, the National Action Party (PAN).
Both governments responded to the increase in immigrants and drugs. President de la Madrid responded to U.S. pressure over the drug trade and in 1985 devoted twenty-five thousand regular army soldiers to disrupt narcotics networks. However, the military was not immune to corruption, and the drug trade continued. In 1985, U.S. Drug Enforcement Administration agent Enrique Camarena was abducted, tortured, and killed by drug lords. The event was widely covered by the U.S. media, and evidence suggested that Mexican law enforcement officers had collaborated with the killers and were present at Camarena’s torture, further alienating the American public. In 1986, Helms held hearings in the Senate over Mexican government corruption. That same year, the Immigration Reform and Control Act was signed into law in the United States to help deter illegal immigration by making it illegal to hire undocumented workers.
Foreign Relations In the early and mid-1980’s, the United States and Mexico also clashed over foreign policy. The Ronald Reagan administration opposed leftist guerrilla movements and governments in Central America, while the Mexican government supported governments in power, whatever their political ideology. In particular, the U.S. and Mexican governments disagreed over Guatemala, Cuba, El Salvador, and Nicaragua. In 1983, Mexico, along with Venezuela, Colombia, and Panama, formed the Contadora Group, an organization working to stabilize the region. The United States opposed the Contadora Group because the organization opposed unilateral action by the United States and because it recognized the government of Nicaragua. After heightened tensions through the mid-1980’s, U.S. attention shifted away from Central
America. Ambassador Gavin, who had lost favor among many moderate Republicans, was replaced in 1986 by Charles Pilliod. The Iran-Contra affair consumed public attention, and the relations between the United States and Mexico eased. In the 1988 Mexican presidential election, the PAN claimed that PRI candidate Carlos Salinas de Gortari had been fraudulently elected. However, the Reagan administration did not protest. In 1988, George H. W. Bush was elected U.S. president and appointed several other Republicans with ties to Texas and Mexico, further easing U.S.-Mexican relations. Impact Mexico and the United States in the 1980’s set some important precedents for future relations but also continued many of the same patterns from previous decades. The 1982 financial crisis, Mexico’s worst to that date, required a financial bailout that would be replicated twelve years later, when the Mexican economy would falter again. U.S. and Mexican governments actively attacked the drug trade, and the Mexican government worked to undercut domestic corruption. The 1980’s saw the continued trends of increasing Mexican immigration into the United States and lawmakers’ attempts to curtail illegal immigration. Further Reading
Green, Rosario, and Peter H. Smith, eds. Foreign Policy in U.S.-Mexican Relations: Papers Prepared for the Bilateral Commission on the Future of United States-Mexican Relations. San Diego: Center for U.S.-Mexican Studies, University of California, San Diego, 1989. The foreign relations installment of a five-book series discussing Mexican migration, drugs, foreign policy, economics, and stereotypes.
Langley, Lester D. Mexico and the United States: The Fragile Relationship. Boston: Twayne, 1991. Historical description of the relations between Mexico and the United States from Mexico’s independence to the early 1990’s.
Mazza, Jacqueline. Don’t Disturb the Neighbors: The U.S. and Democracy in Mexico, 1980-1995. New York: Routledge, 2001. Comprehensive look at American foreign policy in Mexico.
Velasco, José Luis. Insurgency, Authoritarianism, and Drug Trafficking in Mexico’s “Democratization.” New York: Routledge, 2005. Concentrates on the effects of economic and political changes on social issues from the 1970’s through the 1990’s.
Ryan Gibb
See also
Conservatism in U.S. politics; Crack epidemic; Demographics of the United States; Foreign policy of the United States; Immigration Reform and Control Act of 1986; Immigration to the United States; Iran-Contra affair; Latin America; Latinos; Reagan, Ronald; Unemployment in the United States.
■ MGM Grand Hotel fire
The Event Tragic Las Vegas conflagration
Date November 21, 1980
The MGM Grand Hotel fire in Las Vegas killed eighty-five people, making it the deadliest U.S. hotel blaze since 1946. The disaster led to tighter fire regulations and broader deployment of water sprinklers in hotel rooms and common areas.
In 1980, the MGM Grand Hotel consisted of a casino, restaurants, nightclubs, and convention rooms in a low-rise section, as well as 2,076 hotel rooms housed in a twenty-six-story tower. Shortly after 7:00 a.m. on November 21, two employees of a coffee shop spotted signs of a fire in the hotel’s empty delicatessen. Within six minutes, fire engulfed much of the casino. The pressure of the hot gases released by the fire forced open the main doors of the casino, and the fire quickly consumed the carport outside. Flames destroyed a plywood covering at the bottom of a stairwell, and smoke shot up the stairs. Heat, smoke, and flames passed into elevator shafts through unsealed doors. Sprinklers put out fires in a hallway, and firefighters doused the blaze on the main floor. However, smoke passed through the twelve-inch-wide seismic joints running from the casino level to the top of the tower. It billowed into open shafts, particularly those that were not properly sealed. Adding to the problem were smoke dampers that were bolted in such a manner as to make them inoperable. Meanwhile, helicopters from government agencies and private businesses flew to the scene to rescue about 250 survivors stranded on the roof or perched on balconies.
Seven hundred people were injured by the blaze. Most of the victims were killed not by the flames but by the smoke that billowed throughout the two-million-square-foot building. Only eighteen of the victims died on the casino floor, while sixty-one people died on the sixteenth through the twenty-sixth floors. Six other victims died on unspecified floors. Fire investigators discovered that the blaze began
in a serving station in the delicatessen. The improper installation of an electrical cable and its exposure to warm moisture over a period of years deteriorated the wiring’s insulation. A short circuit created the first flames. Clark County, Nevada, building inspectors found hundreds of building code violations that contributed to the tragedy, including air shafts that should have had at least a two-hour fire rating, inadequate exit signs and emergency lighting, improperly fire-rated stairways and corridors, poorly vented elevator shafts, and numerous holes in corridor fire walls. The MGM Grand reopened on July 30, 1981, after the installation of new safety systems, including a computer that monitored sprinkler heads, smoke detectors, and security doors from a command center, as well as smoke-exhaust fans.
Impact About thirteen hundred victims of the fire divided $140 million in what was then the largest compensatory damage settlement in U.S. history. Extensive changes in the fire code were designed in the hope that there would never be another hotel fire on the scale of that at the MGM Grand.
Further Reading
Coakley, Deirdre, et al. The Day the MGM Grand Hotel Burned. Secaucus, N.J.: Lyle Stuart, 1982.
Frieman, Fran Locher, and Neil Schlager. Failed Technology: True Stories of Technological Disasters. New York: ITP, 1995.
Caryn E. Neumann
See also
Natural disasters.
Firefighters survey the remains of the MGM Grand Hotel’s casino after the 1980 fire that claimed eighty-five victims. (AP/Wide World Photos)
■ Miami Riot of 1980
The Event Race riot
Date May 17-19, 1980
Place African American sections of Miami, Florida
The Miami Riot of 1980 resulted in eighteen deaths and over $80 million worth of property damage, making it the most destructive U.S. race riot to occur between 1968 and 1992. In contrast to the rioters of the 1960’s, who had primarily attacked buildings and looted shops, rioters in 1980 randomly attacked whites and Hispanics.
In 1980, about 37 percent of the African Americans of Miami, Florida, lived below the poverty level. The city’s African American unemployment rate was 17 percent, more than double that for whites. For two decades, Latino immigrants (mostly Cubans) had been taking over many of the jobs traditionally held by African Americans in Miami. The influx of additional Cubans from the Mariel boatlift further increased resentments. As was the case in many other large U.S. cities, the relations between the African American community and the police were frequently confrontational, with most African Americans convinced that the police practiced racial profiling and abused African American citizens.
On the morning of December 17, 1979, a thirty-three-year-old African American named Arthur McDuffie was driving in Miami on his Kawasaki motorcycle. Although he was a successful insurance salesman and a former Marine, his license had been suspended because of traffic violations. After running a red light, he was pursued by several police officers in a high-speed chase that lasted eight minutes. According to the official report, he crashed his vehicle, tried to escape, and then forcibly resisted arrest. McDuffie died as a result of the ensuing struggle. Based on the nature of his injuries, however, investigators concluded that his death was the probable result of a brutal beating. Three of the officers, in exchange for immunity, agreed to testify that McDuffie had been noncombative and that a police car intentionally drove over his motorcycle in order to make his death appear accidental. Four of the arresting officers were charged with manslaughter, and one officer’s charge was later elevated to second-degree murder. In addition, the four officers, as well as two others, were charged with fabricating physical evidence.
Because of the emotionally charged atmosphere in Miami, the trial was moved to the more sympathetic venue of Tampa, Florida. The lead prosecutor was Dade County state attorney Janet Reno. Following a month-long trial, an all-white jury on May 17, 1980, deliberated for less than three hours and then acquitted the officers on all counts.
The Uprising When news of the verdict reached Miami, furious African Americans poured into the streets. Within three hours, rocks and bottles were flying. The National Association for the Advancement of Colored People (NAACP) announced a peaceful protest assembly, but when no speaker showed up, numerous persons left the assembly to join the unruly crowds. By that evening, roving groups of young men were indiscriminately torching buildings and looting stores. The worst of the rioting took place in the African American enclave of Liberty City, but there was also considerable property damage in the Overtown and Black Grove sections of the city. Dozens of rioters attacked whites and Hispanics who happened to be driving through affected neighborhoods. An uncertain number were dragged out of their automobiles and beaten with bricks and pieces of concrete. Several white motorists shot at African Americans who approached their cars. Some residents living near the riots erected armed barricades. Witnesses reported that an armed white man in a pickup randomly shot at the rioters, killing at least one person. Florida governor Bob Graham declared a curfew and eventually ordered thirty-six hundred National Guard troops into the city. By the time calm and order were restored on May 19, about 120 persons had been hospitalized and 18 were dead, including 8 white motorists and 10 black rioters. Of the 10, 7 rioters were killed either by the police or the National Guard, while 3 others were killed by white civilians. A total of 855 persons were arrested, including 777 blacks, 43 whites, and 25 Hispanics.
Later Court Actions Prosecutors and courts generally showed leniency toward the rioters. About 85 percent of the defendants had their cases dismissed. Only 135 of the defendants received criminal sentences. Of these, 89 were given probation without any felony appearing on their records, and 27 were sentenced to time already served in jail (usually one night). The only persons to serve additional time in prison were the three found guilty of murder.
Although one white man was arrested on suspicion of murdering a black rioter, the evidence was insufficient to result in a formal charge. A federal grand jury indicted one of the officers for violating McDuffie’s civil rights, but the trial resulted in an acquittal. McDuffie’s family filed a civil suit against the city for $25 million but agreed to accept a settlement of $1.1 million, of which half was paid to the lawyers and the remainder went to McDuffie’s mother and two daughters.
Impact The massive destruction of businesses and property greatly increased the unemployment and poverty rates of the African American community in Miami. The riot underscored the need to improve relations between African Americans and the urban police. Miami and other cities responded by instituting a number of reforms, including stricter guidelines for the police in using force, the training of officers in race relations, and the establishment of citizen oversight committees. Some scholars have suggested that these reforms were one of the reasons that the 1980’s did not see a wave of violent riots similar to those of the 1960’s.
Further Reading
Dunn, Marvin. Black Miami in the Twentieth Century. Tallahassee: University of Florida Press, 1997. Excellent analysis that puts the 1980 incident within its broader social and historical context.
Fyfe, James, and Jerome Skolnick. Above the Law: Police and the Excessive Use of Force. New York: Simon & Schuster, 1992. Historical analysis of police abuses, arguing that the Miami Riot resulted in positive reforms—a view disputed by other scholars.
Porter, Bruce, and Marvin Dunn. Miami Riot of 1980: Crossing the Bounds. New York: Simon & Schuster, 1984. First-rate work of historical sociology, filled with statistics, information about individual persons, and perceptive interpretations. Highly recommended.
Stepick, Alex, and Alejandro Portes. City on the Edge: The Transformation of Miami. Berkeley: University of California Press, 1994. Includes much information about African Americans and Hispanics, with an interesting summary of the riot.
Thomas Tandy Lewis
See also
African Americans; Do the Right Thing; Howard Beach incident; Latinos; Mariel boatlift; Racial discrimination.
■ Miami Vice
Identification Television series
Producers Michael Mann (1943- ) and Anthony Yerkovich (1950- )
Date Aired from September 28, 1984, to June 28, 1989
Miami Vice featured a bold and distinctive look, both in its cinematography and in its costume design. The show had a significant influence on fashion trends of the 1980’s, as well as on popular music, and it was one of several shows of the decade to demonstrate the possibilities of giving prime-time series their own specific look and feel.
For many people, the 1980’s conjure images of pink flamingos, turquoise water, and white sports jackets, graced by an auditory background of drums and synthesizers. Many of these representations are rooted in a highly influential drama that premiered on the National Broadcasting Company (NBC) in the fall of 1984. The show was produced by Anthony Yerkovich and up-and-coming director Michael Mann, both of whom had worked as writers on the 1970’s buddy-cop show Starsky and Hutch, which similarly featured hip undercover cops in a fast car.
Miami Vice broke new ground in television cinematography by using innovative camera angles, evocative lighting, and aggressively fast-paced editing. The show was more carefully directed than much of the fare being offered in cinema theaters at the time. The sound track was equally innovative and carefully designed. It featured heavily synthesized music by Jan Hammer intermixed with pop hits of the day. The influence of music videos upon the show’s style was apparent, lending credence to reports that the series had been nicknamed “MTV Cops” while in development. Many episodes included dance-club scenes that helped facilitate the inclusion of popular music on the sound track.
Beyond the music, Miami Vice showcased the latest in designer clothing, as well as expensive cars, watches, firearms, boats, and even aircraft. The protagonists often posed as wealthy potential clients of upscale drug dealers, necessitating that they wear and drive equipment they could never afford to own on police salaries. Several of the show’s trademark fashions, particularly wearing solid T-shirts under light or pastel suits and intentionally cultivating beard stubble, became popular 1980’s styles. Much of the series was shot on location in Miami, helping
popularize the city’s Art Deco and neomodernist architecture, as well as its general Caribbean and Latino cultures. In retrospect, the series’ casting was also impressive, as guest stars included many future cinema heavyweights, such as Annette Bening, Helena Bonham-Carter, Steve Buscemi, Benicio Del Toro, Nathan Lane, John Leguizamo, Bill Paxton, Julia Roberts, Ben Stiller, John Turturro, and Bruce Willis. The basic series revolved around two undercover cops, James “Sonny” Crockett and Ricardo “Rico” Tubbs, played by Don Johnson and Philip Michael Thomas, respectively. The strongest character of the supporting cast was the stony Lieutenant Martin Castillo, brilliantly underplayed by Edward James Olmos. Episodes’ plots typically involved Crockett and Tubbs going undercover to bring down a colorful drug kingpin, usually in a hail of gunfire. The deeper subplot of the series, however, was the continual struggle of the main characters to keep their identities as cops and not to be lured by the lifestyles they adopted to turn into the criminals they hunted. Impact Miami Vice lasted only five seasons before falling ratings and rising production costs put an end to the series, yet it was somehow fitting that the series did not survive the decade. In this way, the show has remained an iconic artifact of the 1980’s. From aqua T-shirts and sockless shoes to wearing day-old beard growth in the office, the show greatly influenced American fashions and American culture.
Further Reading
Feeney, F. X., and Duncan Paul, eds. Michael Mann. Cologne, Germany: Taschen, 2006.
Janeshutz, Trish. The Making of “Miami Vice.” New York: Ballantine Books, 1986.
Trutnau, John-Paul. A One-Man Show? The Construction and Deconstruction of a Patriarchal Image in the Reagan Era: Reading the Audio-Visual Poetics of “Miami Vice.” Victoria, B.C.: Trafford, 2005.
Roger Pauly
See also Fads; Fashions and clothing; Latinos; MTV; Music; Music videos; New Wave music; Pop music; Synthesizers; Television.
■ Michael, George
Identification British singer and songwriter
Born June 25, 1963; London, England
Within the span of a decade, Michael achieved both popular and critical acclaim by first introducing an optimistic, infectious style of pop music via the musical group Wham! and then following up with a grittier, sexier style as a solo artist.
Because of his ability to evolve in both musical style and image, George Michael, born Georgios Kyriacos Panayiotou, became one of the most well-known singer-songwriters of the 1980’s. While still a teen, he formed the group Wham! UK with his school friend Andrew Ridgeley, releasing their first album, the rap-inspired Fantastic, in 1983. Unhappy with their label and eager to reach a wider audience, the pair shortened their name to Wham!, moved to CBS/Columbia, and cultivated a sunny, clean-cut image for the release of their second album, Make It Big (1984). The album yielded several hit singles in Great Britain and the United States, including the jitterbug-style “Wake Me Up Before You Go-Go” and the earnest “Freedom.” In 1986, Wham! released its third album, Music from the Edge of Heaven, which sold well in Great Britain
George Michael, left, with Wham! partner Andrew Ridgeley in 1984. (PA Photos/Landov)
but not in the United States. Shortly thereafter, they announced the group’s dissolution. Michael, ready to pursue a solo career, again altered his style and image before releasing Faith at the end of 1987. First, however, he recorded the single “I Want Your Sex” for the 1987 film Beverly Hills Cop II. Although the controversial song was banned by many radio stations, it became a hit. Michael also recorded a duet with Aretha Franklin, “I Knew You Were Waiting (For Me),” which garnered a Grammy Award in the rhythm and blues category. With all of the hype surrounding these singles, Faith had much to live up to, and it did not disappoint. The title song’s video introduced Michael’s sexy new image, complete with leather jacket and facial stubble, while “Father Figure,” a slow, sultry tune about forbidden love, was also a hit. Faith won the Grammy for Album of the Year, but Michael’s resulting success and exhausting schedule took their toll, and his subsequent albums did not reach the same heights. In later years, Michael would renounce his image as a sex symbol, due in part to his coming to terms with his homosexuality as well as the loss of a loved partner.
Impact Often described as an autobiographical singer-songwriter, George Michael struck a chord with audiences around the world at various stages in his life and career, including the frothy tunes from his Wham! days and the more soulful songs he later recorded as a solo artist. His talent attracted the attention of influential artists such as Aretha Franklin and Elton John, leading to hugely successful collaborative projects, and Michael also used his fame to highlight worthy causes such as famine relief and AIDS awareness.
Further Reading
Gold, Todd. George Michael. New York: Paperjacks, 1987.
Pond, S. “George Michael, Seriously.” Rolling Stone, January 28, 1988, 28.
Wapshott, Nicholas, and Tim Wapshott. Older: The Unauthorized Biography of George Michael. London: Sidgwick & Jackson, 1998.
Amy Sisson
See also
MTV; Music; Music videos; Pop music.
■ Microsoft
Identification American software company
Date Founded in 1976; incorporated under the name Microsoft in 1981
In 1988, Microsoft passed Lotus to become the largest software company in the world. It achieved its phenomenal growth during the 1980’s by using a business model that revolutionized the computer industry. The company entered into advantageous strategic partnerships, bought smaller companies whose intellectual property was necessary to complete major Microsoft products, and hired the best employees available to develop software that customers wanted.
In 1975 in Albuquerque, New Mexico, Bill Gates and Paul Allen developed a BASIC interpreter for the MITS Altair microcomputer. Later in the 1970’s, they founded Micro-Soft to produce BASIC interpreters for a number of microcomputers. Steve Ballmer joined the team in 1980; a canny executive whose business acumen was important to Microsoft’s success, he would eventually succeed Gates as the company’s chief executive. The founders incorporated the venture as Microsoft in 1981, after moving it to the Seattle, Washington, area.
Gates and Allen decided to expand beyond concentrating solely on computer languages. They purchased the Quick and Dirty Operating System (QDOS), developed in 1980 by Tim Paterson. Microsoft added some features to QDOS and renamed the resulting operating system MS-DOS. In 1981, Microsoft licensed a version of its DOS operating system, PC-DOS, as well as a BASIC interpreter, to International Business Machines (IBM). PC-DOS became the operating system of the IBM PC. The success of the IBM PC platform resulted in many clones being developed by other companies. Microsoft provided MS-DOS for over fifty of these clones by 1982. The total sales of MS-DOS, added to the large PC-DOS sales, launched Microsoft on the road to becoming the largest software company in the world.
Microsoft was interested in marketing a more advanced operating system than MS-DOS, so the company ported to microcomputers a version of UNIX, developed by American Telephone and Telegraph (AT&T). During the early 1980’s, Microsoft licensed this operating system—XENIX—to IBM, Intel, and SCO. Rather than making XENIX its operating system of the future, however, Microsoft opted during the remainder of the 1980’s to develop several other
operating systems, including OS/2, Windows, and Windows NT. Windows
Microsoft was impressed with the XEROX Alto computer, which featured a graphical user interface (GUI) and was developed in 1981. Apple Computer, which was also impressed by Xerox’s innovations, visited the Xerox Palo Alto Research Center (Xerox PARC) and secured permission to utilize some of Xerox’s ideas and technology in its products. Microsoft, in turn, licensed some of the elements of Apple’s GUI (then under development for the Apple Macintosh computer) to use in its own products. Microsoft released the first version of Windows in 1985. A second version followed in 1988. Version 2 of Windows featured overlapping windows, an element of the Apple GUI that Microsoft had not explicitly licensed. This prompted Apple to sue Microsoft. (Apple would eventually lose this lawsuit, based primarily on the fact that it had licensed some of its technology to Microsoft.) However, neither Windows 1.0 nor Windows 2.0 generated much excitement in the marketplace. With lessons learned from developing OS/2’s Presentation Manager, as well as software for the Mac, Microsoft released Windows 3.0 in 1990, finally achieving significant popularity and success with a GUI-based operating system. Microsoft began working with IBM in 1985 to develop OS/2 for IBM to install on the PS2. The two companies publicly announced their joint venture in 1987. The partnership did not work well, however, and Microsoft dropped support for OS/2 in 1990, when Windows 3 proved to be successful. Microsoft began development of Windows NT (new technology) in 1988. Windows up to that time had run on top of DOS, and the company decided it was time to replace the low-level operating system with a more advanced architecture. The shape of this new architecture was defined by Dave Cutler, former architect of the VMS operating system for VAX mainframes. Microsoft hired Cutler in 1988. In 1991, the company hired Richard Rashid to join the project. Rashid had developed the Mach kernel for UNIX, and he added a microkernel to Windows NT, making it very efficient. Windows NT was released in 1993 as the operating system of the future and was an immediate success.
Other Items
Microsoft developed a number of network products during the 1980’s. The NetBIOS net-
Microsoft founder Bill Gates in 1984. (AP/Wide World Photos)
protocol was developed by Sytec in 1983 for IBM and Microsoft. This broadband local area network (LAN) used proprietary Sytec protocols on the IBM PC network. LAN Manager was initially developed in 1987 as a network to use with OS/2. Microsoft, in cooperation with 3COM, continued to develop LAN Manager after the OS/2 project was canceled. In 1990, Microsoft included much of the LAN Manager technology in Windows NT.
Microsoft Office was also developed during the 1980’s. The first version was released for Apple’s Mac in 1989; the first version for Windows followed in 1990. Office combined a series of different applications with different development histories in a single package. Microsoft had developed its first application during the early part of the decade: a spreadsheet called MultiPlan, released in 1983. In the same year, Microsoft put together a team, led by Richard Brodie, that produced Microsoft Word for
XENIX, DOS, and the Mac. During the early 1980’s, Microsoft developed Chart and File for the Mac, and these were later included in Office. Microsoft Works, a junior version of Office, was released in 1986. In 1987, Microsoft purchased Forethought Incorporated, the company that developed PowerPoint, and added this product to its inventory. Also in 1987, Microsoft announced Excel as an upgrade of MultiPlan for Windows. In 1988, Microsoft and Ashton-Tate began work on a relational database, Microsoft SQL Server, based on a relational database management system licensed by Microsoft from Sybase and enhanced by Microsoft and Ashton-Tate. The first version was released in 1989.
Impact The software that Microsoft developed during the 1980’s revolutionized the computer industry by demonstrating that a company could be a success if it specialized in computer software. During the 1980’s, Microsoft became increasingly profitable and powerful, growing to become an international company with facilities in Ireland, Mexico, and elsewhere. As personal computers and microcomputers became increasingly ubiquitous, Microsoft’s operating systems came to define many people’s experience of those computers. Indeed, both Microsoft’s greatest supporters and its greatest detractors agree on this point: For the majority of casual users, Microsoft Windows defines the possibilities and limitations of the human-computer interface. It was the company’s growth and strategies of the 1980’s that brought it to such a dominant position in the industry.
Further Reading
Manes, Stephen, and Paul Andrews. Gates: How Microsoft’s Mogul Reinvented an Industry—and Made Himself the Richest Man in America. Carmichael, Calif.: Touchstone Press, 1994. Biography of Bill Gates, focused on his professional career.
Tsang, Cheryl. Microsoft: First Generation. Somerset, N.J.: John Wiley, 1999. Description of the twenty-five-year development of Microsoft.
Wallace, James, and James Erickson. Hard Drive: Bill Gates and the Making of the Microsoft Empire. Carmichael, Calif.: HarperCollins, 1993. Account of how Microsoft developed as a company.
George M. Whitson III
See also
Apple Computer; Computers; Information age; Science and technology.
■ Middle East and North America
Definition Diplomatic relations between two world regions
An inconsistent and frequently contradictory American policy in the Middle East saw an increase in terrorism while leaving the settlement of the basic Israeli-Palestinian issue unresolved.
Caught by the constraints of the Cold War, the radicalization of the Islamic movement, the Iran-Iraq conflict, and the Israeli-Palestinian issue, the United States ended up pursuing a zigzag and often contradictory course in Middle Eastern diplomacy in the 1980’s, but with a distinct pro-Israeli tilt in contrast to President Jimmy Carter’s earlier attempts to be an honest broker. Accordingly, America’s relations with the Middle East—except for a few moderate Arab states—continued to be indecisive and turbulent.
The replacement of U.S. secretary of state Alexander M. Haig by George P. Shultz in July, 1982, initially prompted President Ronald Reagan to take a more active role in the Arab-Israeli peace initiative, especially in the light of rising tensions because of the increase in Jewish settlements in the Israeli-occupied territories of the West Bank and Gaza, the consequent militancy of Palestinian guerrillas, the heating up and greater involvement of outsiders in the Lebanese Civil War, and the escalation of the Iran-Iraq conflict.
Arab-Israeli Conflict Reagan’s call for a freeze on Jewish settlements, re-echoing Carter’s call, was mingled with the characterization of Israel as America’s “strategic ally,” which tended to mute requests for a slowdown if not halt of such settlements in the face of a determined conservative Israeli Likud Party government. Even the tepid American peace initiative—the 1985 proposal to establish negotiations between a joint Palestinian-Jordanian delegation on one hand and Israel on the other to implement the Reagan Plan of September 1, 1982, for a “Palestinian entity”—was deflected by the worsening situation in Lebanon. On December 9, 1987, the first Palestinian intifada (uprising) broke out in the Israeli-occupied territories. One reason for the intifada was Israeli prime minister Yitzhak Shamir’s intransigence in accepting the American-proposed land-for-peace formula as a basis for settling the forty-year-old conflict.
Washington, not fully appreciating the importance of the issue in the region in general, decided not to press Israel too hard. However, there were a few rough patches in the relationship. In the fall of 1981, the Reagan administration decided to sell airborne warning and control systems (AWACS) to Saudi Arabia, implementing a policy initiated by Carter. Israel opposed the sale, as it did the eventually aborted plan to supply advanced American weapons to Jordan. (Both of these Arab countries were considered to be moderate and thus friendly to the United States.) On the American side, Washington objected to Israel’s air strike against a nuclear power plant under construction on the outskirts of Baghdad, Iraq, in June of 1981, as well as to a number of Israeli air attacks against alleged
guerrilla bases in Lebanon and the full-scale invasion of southern Lebanon by Israel in June, 1982. The United States also rebuked Israel’s annexation of Syria’s Golan Heights in December, 1981. These disputes were interspersed with the intermittent suspension of American arms deliveries to Israel or of U.S.-Israeli strategic cooperation talks. At any rate, on November 15, 1988, the Palestinian National Council proclaimed an independent Palestinian state in the West Bank and Gaza and accepted U.N. Security Council Resolutions 242 and 338 implicitly recognizing Israel. When Palestine Liberation Organization (PLO) chairman Yasir Arafat tried to do the same at the U.N. General Assembly in the fall of 1988, he was denied a U.S. entry visa on the grounds that he represented a terrorist
Palestinian demonstrators attack Israeli troops during an uprising in Nablus, West Bank, on December 13, 1987. (AP/Wide World Photos)
organization. The international body thus had to meet in Geneva, Switzerland, in December, 1988, where Arafat not only confirmed the Palestinians’ position but also renounced terrorism as a political instrument. Thereupon the United States began discussions with the PLO. It was now Israel’s turn to express its disappointment with this substantive change in U.S. foreign policy. Still, because of Israeli objections, an international conference on an Arab-Israeli settlement was not to occur during Reagan’s tenure.
Lebanon This formerly French-mandated territory became independent in 1943. Its postwar history epitomizes how a lack of clear understanding of the region and poorly implemented policies entangled Washington in a Middle Eastern morass. The civil war in Lebanon under way since 1975 involved all the various local sects—Maronite Christians, Orthodox Christians, Sunni Muslims, Shiite Muslims, Druzes, Kurds—as well as Palestinians (mostly refugees and guerrilla groups), Syrians, and Israelis. With the Israeli invasion in June, 1982, the situation had become serious enough to cause the United States to send Marines to Beirut on several occasions. However, various deadly acts of anti-American terrorism by militias in 1983 and 1984, interspersed with the taking of American hostages, led to the ultimate withdrawal of American forces, accompanied by retaliatory bombardments by the U.S. Sixth Fleet. In the meantime, Syria, a Soviet ally, managed to establish a hegemony in Lebanon with the tacit approval of the United States, moderate Arab countries, and possibly even Israel. In fact, Syria was to remain the stabilizing arbiter of Lebanese affairs for many years to come.
Libya There was increasing tension between the Reagan administration and the Libyan regime of Colonel Muammar al-Qaddafi, who, according to Washington, was using some of his country’s oil wealth to finance terrorists in Europe targeting Americans, among others. Consequently, the Libyan embassy in Washington was closed in May, 1981, and there followed a number of incidents involving the two countries. The most notorious was the 1988 bombing of Pan Am Flight 103 over Lockerbie, Scotland, killing 270, most of whom were Americans. Earlier, the hijacking of TWA Flight 847 in 1985 had also been blamed on Libya. Also, there was disagreement between the two countries over whether the
Gulf of Sidra was part of Libya’s territorial waters, as Libya had claimed, or international waters, as the United States did. On several occasions, U.S. Navy vessels entered the contested waters, and combat planes its airspace, to make that point. Incidents involved aerial dogfights, the shooting down of Libyan aircraft, and the sinking of Libyan naval vessels. President Reagan’s executive order of January 7, 1986, severed trade and transportation between the two countries, froze all Libyan assets in the United States, and ordered American oil companies to end their operations in Libya. On April 5, 1986, a bomb explosion in a West Berlin nightclub killed and injured American servicemen. Libya was again held responsible. On April 14-15, 1986, American combat planes attacked Libyan targets in Tripoli and Benghazi, causing casualties and extensive collateral damage. Bilateral relations did not improve in the 1980’s.
Iran and Iraq The Carter Doctrine, spelled out in the president’s state of the union address of January 23, 1980, had been precipitated by the 1979 toppling of the pro-American shah of Iran’s regime and by the Soviet invasion of Afghanistan at the end of that year. It reaffirmed the United States’ interest in the Persian Gulf and its determination to protect its vital interests there. The Reagan administration was to perpetuate this doctrine, especially now that Ayatollah Khomeini, Iran’s spiritual leader and head of its revolution, identified the United States as “the Great Satan.” Reagan’s policy highlighted the role that American forces were to play in safeguarding the Gulf region and ensuring the uninterrupted flow of oil even under the new radical Islamic regime of the fundamentalist clerics. However, the war between Iran and Iraq under way since September 22, 1980, posed additional problems for U.S. foreign policy. At first, Washington viewed the conflict as not necessarily a bad thing, since both hostile contestants were countervailing each other and getting weaker from their reciprocal mayhem. After Iranian militants took American diplomats and citizens hostage in Tehran in late 1979 (they were released on Reagan’s assumption of office in January, 1981), the United States imposed an arms embargo on Iran. In characteristically contradictory manner, the National Security Council, represented by Robert McFarlane and his aide Lieutenant Colonel Oliver North, concocted a scheme that was supposed to achieve two objectives: establish ties with
moderate Iranian elements that could lead to regime change from the existing radical anti-American religious fundamentalists, and obtain the release of American hostages in Lebanon considered to be the captives of Iranian-controlled Shiite militias. By August, 1985, McFarlane, despite the opposition by Secretary of State Shultz and Secretary of Defense Caspar Weinberger, persuaded the president to approve delivery of fifty antitank missiles to Iran. Later, in November, 1985, eighty HAWK missiles were shipped. During this interlude, there were behind-the-scenes U.S.-Iranian negotiations. All of this happened under cover of a presidential directive of January 17, 1986, authorizing direct but covert arms sales to Iran despite the arms embargo against all countries involved in “international terrorism,” earlier defined as being sponsored by Syria, Libya, and Iran. By this time there was a third reason for the Iranian arms sales: The proceeds would be diverted to support the anticommunist Contra movement against Daniel Ortega’s leftist Sandinista government in Nicaragua. Meanwhile, Washington was also trying to play its Iraq card. Accordingly, in November, 1984, the United States renewed its diplomatic relations with Baghdad, a Soviet client. As both Iran and Iraq came to threaten the flow of Persian Gulf oil by attacking tankers at various ports, a U.S. task force began escorting tankers and even reflagging some. Washington also provided limited amounts of intelligence to President Saddam Hussein’s Iraqi regime in its “balanced” approach to the two combatants. These strategies and tactics were largely ineffective. Fewer than five American hostages were ultimately released in Lebanon, Iran’s war with Iraq was fought to an indecisive conclusion after the exhaustion of the adversaries in 1988, and the Sandinista regime was toppled not by the Nicaraguan Contras but as a result of subsequent elections. Canadian Foreign Policy
Canada continued its earlier tradition of making contributions to various peacekeeping forces in the Middle East and elsewhere. Even though it was a member of the Western, American-led alliance, Canada’s conciliatory trend in foreign policy as well as its status as a middle-sized power made it espouse a much less muscular policy than the United States. Indeed, in 1985, Canada’s external affairs minister, Joe Clark, issued a major
official statement regarding his country’s foreign policy striving for international security and economic competitiveness in addition to world peace, political freedom, social justice, human rights, and national unity. Impact By the time President George H. W. Bush replaced Ronald Reagan in the White House in January, 1989, the earlier attempts by the United States and the Soviet Union to improve their respective positions in the Middle East, mostly through surrogates, had been replaced by more cooperative relations, heralding the conclusion of the Cold War. With the Palestinian uprising in the occupied territories and the increasing power of Iran-sponsored Hezbollah in Lebanon, Middle Eastern foreign policy was being transformed into a kind of social movement, often made in the “Arab street” rather than in foreign offices. The fact remained that, as long as Washington continued to consider its Israeli connection more important than Palestinian peace—an infectious issue with region-wide implications—the ambivalence, indirection, and contradictions of the 1980’s could be expected to be unavoidable. Further Reading
Gerges, Fawaz A. America and Political Islam: Clash of Cultures or Clash of Interests? New York: Cambridge University Press, 1999. Explains how the multiplicity of elite thinking in the American foreign policy establishment as well as among opinion makers and pressure groups accounted for its contradictions, witnessed especially under President Reagan.
Hadar, Leon. Sandstorm: Policy Failure in the Middle East. New York: Palgrave Macmillan, 2005. A contrarian view of American foreign policy in the Middle East covering the 1980’s.
Hallowell, Gerald, ed. The Oxford Companion to Canadian History. New York: Oxford University Press, 2004. Norman Hillmer’s essay entitled “peacekeeping” is the most relevant to the 1980’s.
Laham, Nicholas. Crossing the Rubicon: Ronald Reagan and U.S. Policy in the Middle East. Burlington, Vt.: Ashgate, 2004. How and why the Reagan administration veered from the evenhanded foreign policy of President Carter to a decidedly pro-Israeli one in the 1980’s.
Little, Douglas. American Orientalism: The United States and the Middle East Since 1945. Chapel Hill: University of North Carolina Press, 2002. A very
lucid and scholarly account of the policies of Secretaries of State Haig and Shultz. Less critical than most sources.
Taylor, Alan R. The Superpowers and the Middle East. Syracuse, N.Y.: Syracuse University Press, 1991. Focuses on the difficulties of various American administrations in maintaining regional strategic partners while promoting Arab-Israeli peace.
Yetiv, Steve A. Crude Awakenings: Global Oil Security and American Foreign Policy. Ithaca, N.Y.: Cornell University Press, 2004. The importance of oil in American foreign policy in the Middle East. The detailed global energy chronology (1973-2003) is especially useful, as is the table showing the growing dependency of the United States on Middle Eastern oil imports in the 1980’s.
Peter B. Heller
See also Beirut bombings; Cold War; Foreign policy of Canada; Foreign policy of the United States; Haig, Alexander; Iran-Contra affair; Iranian hostage crisis; Israel and the United States; Jewish Americans; Libya bombing; North, Oliver; Pan Am Flight 103 bombing; Reagan Doctrine; Shultz, George P.; USS Vincennes incident; Weinberger, Caspar; West Berlin discotheque bombing.
■ Military ban on homosexuals Definition
Official policy of the United States government prohibiting gay and lesbian Americans from serving in the armed forces
During the 1980’s, the U.S. Department of Defense maintained an official hiring policy of discrimination against homosexuals. Additionally, it prohibited its employees from discussing or practicing homosexuality. Despite the efforts of many gay rights organizations, the Department of Defense maintained an official policy against hiring gay Americans during the 1980’s. Without offering any scientific proof of their premise, the Joint Chiefs of Staff insisted that homosexuality negatively affected the mission of the U.S. armed forces. Consequently, military commanders continued their long-standing tradition of discrimination and persecution against homosexuals. Recognizing that many young people would become fully aware of their sexual orientation only after initiating employment, the military actively prohibited its employees from practicing or discussing homosexuality, even when off duty. During this decade, military employees were barred from attending nightclubs or businesses that catered to gay clienteles, such as the Bonham Exchange dance club in downtown San Antonio, Texas, adjacent to the Alamo. With five large Army and Air Force bases in that city, military police frequently monitored or raided local gay nightclubs seeking military personnel. If they were discovered in the clubs, those personnel could be dishonorably discharged from the service. Fearing official persecution, many gay soldiers discreetly gathered for social occasions in private residences and avoided openly patronizing gay businesses. Military police at many bases engaged in clandestine surveillance of employees suspected of being gay, even eavesdropping on phone calls and inspecting discarded messages for any evidence of “unbecoming” conduct. Despite favorable performance reviews or service commendations, gay soldiers dishonorably discharged for engaging in homosexual behavior forfeited their employment and all future financial benefits. Several landmark court cases involving gay service members who had been discharged from the military gained national attention during the 1980’s. Forced from employment in 1975 by the U.S. Air Force after being awarded a Purple Heart and a Bronze Star for service in Vietnam, openly gay Sergeant Leonard Matlovich was awarded employment and benefits reinstatement by a federal district court in 1980. Fearing persecution from other soldiers, Matlovich retired, opting instead for a financial settlement from the Air Force. Impact Official discrimination against gays and lesbians by the American military aroused considerable public debate during the 1980’s. Gay rights advocates continued active campaigns to pass statutes guaranteeing gay men and lesbians the ability to serve in the military. In addition, legislation was introduced repeatedly to eliminate housing and workplace discrimination on the basis of sexual orientation, but it was never able to win majority support in Congress. By decade’s end, full civil rights for gay Americans was still an elusive goal, as was the ability to serve openly in the military. These goals were largely thwarted by opposition from the American military hierarchy and organizations heavily funded by fundamentalist, evangelical Christians.
Further Reading
Baird, Robert M., and M. Katherine Baird, eds. Homosexuality: Debating the Issues. Amherst, N.Y.: Prometheus Books, 1995. Defense Personnel Security Research and Education Center. Lesbians and Gays in the Military. Monterey, Calif.: Author, 1988-1989. Shawver, Lois. And the Flag Was Still There: Straight People, Gay People, and Sexuality in the U.S. Military. New York: Harrington Park Press, 1995. Hayes K. Galitski See also Affirmative action; Bakker, Jim and Tammy Faye; Conservatism in U.S. politics; Falwell, Jerry; Homosexuality and gay rights; Moral Majority; Murphy, Eddie; Navratilova, Martina; Reagan, Ronald; Sexual harassment; Supreme Court decisions; Swaggart, Jimmy; Weinberger, Caspar.
■ Military spending Definition
Financial expenditures by the U.S. government on national defense and security
The United States vastly increased its military spending in the 1980’s to challenge the Soviet Union for dominance in the Cold War. The massive defense funding altered the strategic balance with the Soviet Union, introducing several significant advances in weapons technology and forcing the Soviets to spend more than they could afford in order to compete. An increase in defense spending symbolized America’s recommitment to the Cold War in the 1980’s after a long period of détente following the Vietnam War. Inaugurated by President Ronald Reagan and continued by President George H. W. Bush, the increased military spending of the 1980’s reflected a new philosophy for the Cold War. Instead of trying to achieve balance and parity with the Soviet Union, the United States sought to reassert its military and technical prowess. The increased military spending assisted in this process by expanding existing military forces, fielding new weapons systems, and funding research for advanced weapons technology. Money and Weapons Beginning with a relatively low budget in 1980, U.S. military spending climbed rapidly through the mid-1980’s before tapering off in the
latter part of the decade. Starting at $303 billion actual dollars in 1980, defense spending reached a peak of $427 billion by 1987. This new spending represented a significant portion of the gross national product (GNP). Military spending consumed about 5.8 percent of the GNP, which was a significant percentage but only about half the percentage of GNP spent on defense during the Vietnam War. The increased funding of the 1980’s did not result in a significantly larger military in terms of personnel or facilities. In 1980, the United States employed 2,050,067 active-duty military personnel. That number increased to a peak of 2,174,217 in 1987, an increase of only 6 percent. The number of military bases around the world also increased only slightly. Indeed, the Pentagon initiated a process of closing or realigning unneeded military bases at the end of the 1980’s in order to channel funding to other projects. The greatest expenditure of defense dollars was on new weapons systems to replace obsolete Vietnamera weaponry. In most cases, the Pentagon acquired weapons that it had developed before the 1980’s but had previously lacked enough money to purchase in large numbers. The Air Force, for example, purchased new F-15 Eagle and F-16 Fighting Falcon fighters, as well as B-1 Lancer bombers, all of which had been in development since the mid-1970’s. The Army acquired new Abrams tanks, Bradley infantry fighting vehicles, and Patriot air defense missiles, all conceived in the 1970’s. The Navy increased its fleet with additional Nimitz-class aircraft carriers, Los Angeles-class attack submarines, and Ohio-class ballistic missile submarines, first introduced in the 1970’s. Other weapons systems that began in the 1970’s but figured prominently in 1980’s defense spending included the F-14 Tomcat fighter, the cruise missile, “smart” bombs, and low-observable aircraft (better known as “stealth” aircraft). In addition, many older but still capable weapons systems received upgrades to maintain their military effectiveness. Some of the B-52 Stratofortress bomber fleet, for example, received upgrades that kept them in service, despite some aircraft being older than the pilots who flew them. The Navy in particular kept elderly ships in service in order to reach the goal of a six-hundred-ship fleet proposed by President Reagan. Beside its new ships, the Navy kept in service the two obsolescent Midway-class aircraft carriers, and it even reactivated the four World
War II-era battleships of the Iowa class to serve as armored cruise-missile platforms. The Pentagon also spent more money on nuclear forces in the 1980’s, with all service branches acquiring new nuclear weapons. The Air Force, in addition to the new B-1 and upgraded B-52 bombers, replaced old Titan and Minuteman intercontinental ballistic missiles (ICBMs) with the new MX missile, also known as the “Peacekeeper.” The Air Force also fielded cruise missiles armed with nuclear warheads. The Navy introduced the Trident submarine-launched ballistic missile (SLBM) on its Ohio-class submarines, vastly increasing the number of nuclear warheads deployed at sea. The Army deployed the new intermediate-range Pershing II missile at bases in Europe, despite vocal opposition from local antinuclear groups. The most controversial nuclear spending, however, was on the Pentagon’s efforts to create a space-based defense system against Soviet nuclear attack. Officially called the Strategic Defense Initiative (SDI), but more commonly referred to as “Star Wars” in the popular press, this proposed high-tech system would feature ground- and space-based defenses capable of shooting down incoming Soviet nuclear weapons. The project ceased, however, with the end of the Cold War. Impact The impact of military spending in the 1980’s is difficult to assess. Some believed that spending was excessive and that the resulting military buildup threatened to turn the Cold War hot. Others believed, however, that the military challenge to the Soviets caused the Soviet Union to go bankrupt in its attempt to match American military spending, thus speeding the end of Soviet communism. Regardless of either effect, military spending in the 1980’s certainly corrected the equipment and manpower deficiencies of the American military after Vietnam, as well as contributing to the massive budget deficits of the 1980’s. The reinvigoration and reequipping of the military also paid dividends at the end of the decade, when Iraq invaded Kuwait. Fully prepared after a decade of restructuring, the American military easily defeated Iraq’s army in the Gulf War of 1991. Further Reading
Conway, Hugh. Defense Economic Issues. Washington, D.C.: National Defense University, 1990. Examines 1980’s defense spending in the context of post-Cold War changes to the U.S. military.
Krc, Miroslav. Military Expenditures During and After the Cold War. Prague: Institute of International Relations, 2000. Comparative study of defense spending on both sides of the Iron Curtain during the Cold War, with several chapters on spending in the 1980’s. Mintz, Alex. Political Economy of Military Spending in the United States. New York: Routledge, 1992. Study of the political decisions governing which weapons are procured and how much to spend on them. Weinberger, Caspar W. Fighting for Peace: Seven Critical Years in the Pentagon. New York: Warner, 1990. As secretary of defense for President Reagan, Weinberger oversaw the massive defense spending of the 1980’s. Steven J. Ramold See also
Bush, George H. W.; Business and the economy in the United States; Cold War; Foreign policy of the United States; Goldwater-Nichols Act of 1986; Grenada invasion; Intermediate-Range Nuclear Forces (INF) Treaty; Panama invasion; Reagan, Ronald; Reagan Doctrine; Stealth fighter; Strategic Defense Initiative (SDI); Weinberger, Caspar.
■ Miller, Sue Identification American author Born November 29, 1943
Through her fiction, Miller explored concerns and attitudes prevalent in the 1980’s. Her first novel and major work of this decade, The Good Mother, reflected a societal recognition that women could now shape their own lives as they moved beyond previously prescribed gender roles and that all people could experience, as a result of self-understanding, more control over their lives. Sue Miller’s fiction reflects her own heritage, as well as concerns of the broader culture in which she wrote. Her father was an ordained minister and taught church history, and both of her grandfathers were Protestant clergymen. Her work often evinces a preoccupation with the moral implications of individual decisions, a concern that shapes the plot of The Good Mother (1986). Miller has attributed this concern to her family background. Writing The Good Mother when she was a divorced, single mother, Miller was aware of the difficulties and promises of
an independent life. Miller described characters in an era preoccupied with self-awareness and a subsequent sense of empowerment. Americans were coming to believe that introspection made it possible to understand one’s motivations, influences, and actions. New possibilities for women indicated that a woman could simply redirect her energies to create a new life based on her personal values and guidelines and perhaps have it all. For women, “having it all” generally meant having a happy family, satisfying work, and sexual fulfillment. For Anna Dunlap in The Good Mother, her recent divorce from Brian, a man she considered cold and repressed, means a new measure of joy and freedom. She looks forward to life alone with her four-year-old, dearly beloved daughter, Molly. She sees Molly and herself as a pair, with Molly as her sidekick. When she falls in love and finds emotional and sexual fulfillment with her new lover, Leo, she sees the possibility of even more happiness. As she and Leo spend more time together, they draw Molly into their relationship in what seems like an idyllic family, one that makes its own rules. Not having anticipated the missteps by her lover, nor the power of moral judgment against their unconventional lifestyle, Anna feels devastated when her ideal life crumbles. Her former husband sues for custody of Molly, and Anna faces the worst possible outcome, losing her custody battle. Inventing the Abbotts (1987), a collection of short stories, reflects the decade’s preoccupation with self. Characters evince awareness of self and of social class differences, and they negotiate the complex family relationships resulting from divorce and remarriage. As Miller explores the conflicts arising from the new possibilities of freedom and independence, her fiction asserts a moral perspective and recognizes the fallacy of control in shaping a life. Impact Miller’s studies of ordinary people represent the ideal life as a surprisingly complex and perhaps elusive goal, despite the freedoms of the 1980’s and the positive effects of conscious self-exploration and good intentions in its pursuit. Rather than critiquing 1980’s American culture directly, then, her fiction portrays the limits of the celebration of personal freedom at the decade’s core. Further Reading
Herbert, Rosemary. “Sue Miller.” Publishers Weekly 229, no. 18 (May 2, 1986): 60-61.
Humphreys, Josephine. “Private Matters.” The Nation 242, no. 18 (May 10, 1986): 648-650. Miller, Sue. The Story of My Father: A Memoir. New York: A. A. Knopf, 2003. Bernadette Flynn Low See also
Literature in the United States; Marriage and divorce; Psychology; Women in the workforce; Women’s rights.
■ Minimalist literature Definition
Literary movement characterized by the paucity and simplicity of its language and its focus on essential meanings and structures
Literary minimalism in the 1980’s was a reaction against what some saw as the excesses of postmodernism, particularly its increasingly tenuous links with everyday reality and its antirealist modes of representation. Minimalism also represented a reaction against the excesses of Ronald Reagan’s economic policies. Even before the 1980’s began, American literature had become increasingly theory-laden, seeking refuge in universities that began to cultivate crops of professional “creative writers.” A bevy of self-referential postmodern theories was in the ascendant, even as Susan Sontag and other critics railed against the encroachment of theories at the expense of representation and engagement. At the center of these interpretive wars, writers created language-centered, self-reflexive self-parodies, cutting truth and reality loose in a maze of fictional and metafictional antinarratives. American minimalist literature of the 1980’s was a response to postmodernism’s aggressively antirealistic and antimimetic approach to subject matter and narrative technique. It is difficult to characterize 1980’s minimalism as a coherent movement, however, because during the decade in which most of its practitioners achieved literary recognition, they never formed a unified school or published a universally adopted manifesto. Even at its peak, therefore, American literary minimalism was never more than a loose alliance of a small group of mostly West Coast writers who reacted against the social malaise of Reaganomics and the spiritual malaise of the American way of life. Dusting off the techniques of realism of which postmodernism was so suspicious, they proceeded to create collections of short but poignant tales of average despondent Americans waiting for their Godot.
Raymond Carver was among the most successful of the literary minimalists. (© Marion Ettlinger)
Minimalist Authors and Influences The 1980’s literary minimalists—sometimes disparaged as “K-mart realists” or “dirty realists” because of their fascination with the humdrum, commercial monotony of everyday lives—wore their populist interest in social lowlifes and outcasts with pride. With an uncompromising grimness of vision redolent of the muckraking era, and with the repetitious, pared-down, haunting cadences of Samuel Beckett, minimalist writers focused their attention on the mundane lives of characters addicted to alcohol, drugs, welfare, trailer park blues, or intellectual malaise. Setting their literary compass on architect Ludwig Mies van der Rohe’s dictum of “less is more,” they crafted prose miniatures (though some attempted to write novels, as well) that baffled with plotless narration and staggered with concentrated imagery. In a style stripped of verbal excess, with plots in abeyance and moral judgments suspended, Marilynne Robinson, Ann Beattie, and Richard Ford aired the kitchen side of America betrayed by the trickle-down economic miracle that never happened. None of them, however, was as successful (in both the aesthetic and the commercial sense) as Raymond Carver, whose disparate short stories were unified and exquisitely filmed in 1993 by Robert Altman in Short Cuts. In another sense, the minimalism of the 1980’s was but a contemporary phase in a much older power struggle between American literature of profusion and literature of restraint—between the poetry of Walt Whitman and that of Emily Dickinson, or between the prose of Henry James and that of Ernest Hemingway. Carver’s writing, as he himself acknowledged, owed much to the latter’s “iceberg” aesthetic, whereby seven-eighths of a narrative may take place beneath the surface of the textual representation. As Hemingway himself noted, the roots of his success in paring down the narrative lay in what he termed a “theory of omission”: one can omit or suppress any part of the narrative as long as this representational ellipsis or synecdoche makes readers feel things they might only vaguely understand. Hence the essence of 1980’s literary minimalism: economy with words, focus on surface description, preponderance of noun and verb over adjective and adverb, unexceptional characters and mundane situations, and the expectation that readers will take an active role in shaping the emotional and moral import of a story, based on oblique hints and innuendo rather than the direct presence of the author. In their short, open-ended narratives, American literary minimalists sought to present a “slice of daily bread and life” during a decade victimized by the fictional Gordon Gekko’s mantra of “greed is good.” Impact Although minimalism failed to stem the tide of self-centered postmodernistic antinarratives, it revitalized the realistic impulse in American literature and provided a keen, if depressing, record of the lowlights of life under Reaganomics. Further Reading
Baym, Nina, et al., eds. The Norton Anthology of American Literature. New York: Norton, 2007. An invaluable resource containing works by most of the eminent authors in American history—including the minimalists.
McDermott, James Dishon. Austere Style in Twentieth-Century Literature: Literary Minimalism. Lewiston, N.Y.: Edwin Mellen Press, 2006. Itself an example of a minimalist approach to language and analysis, this short book traces the minimalist aesthetic in twentieth century literature and philosophy. Strickland, Edward. Minimalism: Origins. Bloomington: Indiana University Press, 1999. A valuable interdisciplinary study of the cultural history as well as technical and formal aspects of minimalism across various art forms. Swirski, Peter, with David Reddall. “American Literature.” In Dictionary of American History, edited by David Hollinger. Rev. ed. Vol. 5. New York: Charles Scribner’s Sons, 2003. A condensed overview of the main phases of development of American literature, from pre-Columbian times to minimalism and other contemporary formations. Peter Swirski See also
Art movements; Beattie, Ann; Deconstructivist architecture; Literature in Canada; Literature in the United States; Reaganomics; Theater.
■ Miniseries
Definition
Television format in which a unified story is told in two or more episodes broadcast in relatively close succession
At the height of its popularity in the 1980’s, the miniseries was a lynchpin of network programming reaching millions of viewers. As special events taking place over successive nights, miniseries were ideal programming for “sweeps” periods, garnering increased viewership for networks when they needed it most. The American miniseries originated in the mid-1970’s with the twelve-part “novel for television” adaptation of Irwin Shaw’s Rich Man, Poor Man on the American Broadcasting Company (ABC). Alex Haley’s Roots followed in 1977, enjoying the largest television viewing audience ever for a dramatic program. One hundred million viewers saw the final episode of Roots. Network Strategy By the beginning of the 1980’s, miniseries had become a vital part of prime-time programming. The broadcast networks apportioned funds in their yearly budgets for long-form development and production, which included two-hour original movies for television as well as miniseries. Because a miniseries required significant broadcast time, executives studied the ideal placement of the miniseries in the networks’ overall prime-time schedule. Anticipating that miniseries would receive higher ratings than regular prime-time series, the networks scheduled most miniseries during the so-called sweeps periods in November, February, and May. These periods were the months in which the networks set their advertising prices based on ratings, so it was important to their bottom line that shows aired during sweeps receive the highest ratings possible. Miniseries were thus expected to attract and keep large enough audiences over the course of their airing to justify the expense of producing them. One key to a successful miniseries was a “high concept” that could be easily promoted, generating a large audience, but one that would also be capable of sustaining audience interest through the conclusion of the show. The material that worked most effectively would therefore include a complex story. It would also feature compelling characters able to generate strong viewer sympathy, or antipathy, as appropriate. Best-selling books were the primary source of material. Such books and their authors brought preconstituted audiences to television.
Source and Subject Matter
The subject matter of miniseries varied from melodramas, such as the National Broadcasting Company (NBC) series Rage of Angels (1983) and its 1986 sequel, to historical biography, such as the Columbia Broadcasting System (CBS) series George Washington (1984) and its 1986 sequel, to true crime dramas such as CBS’s Murder Ordained (1987). NBC’s Little Gloria, Happy at Last (1982) and Poor Little Rich Girl: The Barbara Hutton Story (1987), based on best sellers, explored the unhappy and dramatic lives of wealthy families. Another NBC miniseries, V (1983), was based on an original idea about an alien invasion. In 1980, NBC aired the twelve-hour Shogun, an adaptation of the epic James Clavell novel. The story followed a seventeenth century British sea navigator, or pilot (Richard Chamberlain), stranded in Japan. The pilot adapted so completely to the Japanese culture that he became a samurai for a shogun, or warlord. The producers and director insisted on shooting the miniseries in Japan, and much of the dialogue
Selected 1980’s Television Miniseries

Year   Title                                             Network
1980   Scruples                                          CBS
1980   Shogun                                            NBC
1981   Brideshead Revisited                              PBS (originally aired in United Kingdom)
1982   Little Gloria, Happy at Last                      NBC
1983   Princess Daisy                                    CBS
1983   The Thorn Birds                                   ABC
1983   Chiefs                                            CBS
1983   The Winds of War                                  ABC
1983   V                                                 NBC
1983   Rage of Angels                                    NBC
1984   The Jewel in the Crown                            PBS (originally aired in United Kingdom)
1984   V: The Final Battle                               NBC
1984   George Washington                                 CBS
1985   Hollywood Wives                                   ABC
1985   Anne of Green Gables                              CBC (aired on PBS in 1986)
1985   North and South                                   ABC
1986   North and South: Book Two                         ABC
1986   Rage of Angels: The Story Continues               NBC
1986   George Washington: The Forging of a Nation        CBS
1986   If Tomorrow Comes                                 CBS
1987   Amerika                                           ABC
1987   Murder Ordained                                   CBS
1987   Poor Little Rich Girl: The Barbara Hutton Story   NBC
1987   The Billionaire Boys Club                         NBC
1987   I’ll Take Manhattan                               CBS
1988   The Singing Detective                             PBS (originally aired in United Kingdom in 1986)
1988   The Bourne Identity                               ABC
1988   War and Remembrance                               ABC
1989   Lonesome Dove                                     CBS
1989   Till We Meet Again                                CBS
was in Japanese with English subtitles. Clavell’s book had been a best seller, but the networks took a big risk in adapting the period, non-American subject matter to the small screen. The gamble paid off, with a total of 125 million viewers over five nights, as well as three Emmy Awards. Best-selling authors Sidney Sheldon and Judith Krantz wrote about wealth, power, and sex. The authors’ names were so influential that they were tied contractually to their books and their television adaptations. Five of Sheldon’s novels were adapted as miniseries between 1980 and 1989. In 1985, CBS reportedly offered Sheldon the unheard-of sum of one million dollars for If Tomorrow Comes without executives having read the book. Four of Krantz’s titles were made into miniseries. Krantz and Sheldon frequently wrote novels whose lead characters were beautiful women. The networks preferred stories with female appeal, because advertisers relied on the buying power of women viewers. The networks favored casting regular-series stars in miniseries; Chamberlain was the preeminent miniseries star of the 1980’s, rivaled only by former Charlie’s Angels star Jaclyn Smith. In 1988, the two appeared together in a miniseries based on Robert Ludlum’s The Bourne Identity. Stories based on true crimes increasingly became the subject of miniseries throughout the latter half of the 1980’s. The networks sought out scripts based on contemporary stories and headlines. The Billionaire Boys Club, a fact-based story about murderous wealthy young men, aired on NBC in 1987 amid controversy. Attorneys sued NBC and the production company to stop the miniseries from airing, because an appeal and a jury trial of the young men were scheduled to take place. The courts decided in favor of NBC, however, and the miniseries aired on schedule.
In the mid-1980’s, all three networks changed ownership. The large media corporations that acquired the networks instituted cost-cutting measures and examined the miniseries as a source of overspending, particularly in the light of a recent decline in viewership. Two miniseries in the late 1980’s represented the changing network attitudes about miniseries. Lonesome Dove was based on Larry McMurtry’s Pulitzer Prize-winning novel. Feature film actors Robert Duvall and Tommy Lee Jones played “Gus” McCrae and Woodrow Call. The miniseries aired on CBS over four nights in 1989 to critical acclaim and high ratings. The network developed other Western miniseries, trying to recapture the popularity of Lonesome Dove. Throughout 1988-1989, ABC aired the thirty-four-hour War and Remembrance, the sequel to 1983’s The Winds of War. It took up much of the network’s schedule and was estimated to cost $100 million. The number of viewers fell below the estimate ABC had given to advertisers, and the network lost $30 million to $40 million as a result. Impact
In the wake of the networks’ new corporate ownership, broadcast networks continued to program miniseries as important events, but not as frequently. They also began to limit each miniseries to only four hours in length. Production costs were too high for more frequent and longer miniseries, and the networks were also concerned to attract younger viewers with shorter attention spans. The miniseries began to move to cable television channels that were more hungry for special-event and “quality” programming. Miniseries also affected book-publishing strategies. Publishers recognized that a miniseries based on a book generated renewed interest in a title, and they developed tie-ins, placing paperback editions of a book “soon to be a miniseries” at grocery-store checkout counters. Further Reading
Auletta, Ken. Three Blind Mice: How the TV Networks Lost Their Way. New York: Random House, 1991. Details the factors that led to the precipitous decline of broadcast network viewership. Excellent behind-the-scenes descriptions. Marill, Alvin. Movies Made for Television: The Telefeature and the Mini-series, 1964-1986. New York: New York Zoetrope, 1987. Listing of television movies and miniseries. Roman, James. Love, Light, and a Dream: Television’s Past, Present, and Future. Westport, Conn.: Praeger,
1996. Offers a useful perspective on different eras of television. Vane, Edwin T., and Lynne S. Gross. Programming for TV, Radio, and Cable. Boston: Focal Press, 1994. Excellent information on ratings and networks. Nancy Meyer See also
Advertising; Literature in the United States; Ludlum, Robert; Sequels; Television.
■ Minivans Definition
Compact motor vehicles designed for transport of passengers and light cargo Date Introduced in North America in 1983 Designed with emphasis upon practicality and convenience, minivans were introduced in the early 1980’s and by the end of the decade had become a popular choice of transportation for American families. The origins of the minivan can be traced to the compact vans of the 1950’s and 1960’s, most notably the Volkswagen “microbus,” and to the station wagons popularized by American families of the post-World War II era. Early minivans were distinguished from conventional vans primarily by their use of underpinnings resembling a standard automobile chassis as opposed to a larger, specialized van chassis, but they also differed from conventional vans by the presence of three rows of seats, sliding rear passenger doors, and a single rear door hinged at the top rather than a pair of doors hinged at either side of the tailgate. Many of the characteristics of the minivan, such as rear hatches and abundant seating, rendered it more akin to a station wagon than to a conventional van. The first vehicles to be labeled “minivans,” the Toyota Van and the Dodge Caravan, were introduced in North America in 1983. The Dodge Caravan, with its front wheels located in front of rather than beneath the front doors of the vehicle, set a precedent for conventional minivan design. The Caravan was the brainchild of Chrysler Corporation chief executive officer Lee Iacocca, who proposed it in the late 1970’s as inexpensive family transportation to replace the station wagon, which had become unappealing to mainstream consumers because of its association with blandness and conventionality. In addition to the capacity of the station wagon for transporting people and cargo, the minivan would
provide increased headroom for passengers and seats that could be removed or folded for the transport of cargo. Because of its short drive shaft, the minivan could be designed low to the ground for maximum convenience in loading and unloading passengers (especially small children) and cargo. Although often criticized for its lack of sportiness and maneuverability, the minivan proved appealing to consumers seeking practical transportation while avoiding the relatively high cost of vans and sport-utility vehicles and the social stigma of station wagons. By the end of the 1980’s, the minivan had become the vehicle of choice for many suburban families, and along with the sport-utility vehicle became symbolic of the busy suburban “soccer mom.” Impact The minivan completely eclipsed the station wagon as the stereotypical mode of transportation for middle-class American families during the 1980’s. Minivans remained in production throughout the 1990’s, but their sales slowly decreased in the United States as demand for sport-utility vehicles increased. The introduction of “crossover” vehicles combining the most popular features of minivans and sport-utility vehicles in the early twenty-first century decreased sales of traditional minivans while attesting to their lasting influence on American culture. Further Reading
Hinckley, Jim. The Big Book of Car Culture: The Armchair Guide to Automotive Americana. St. Paul, Minn.: Motorbooks, 2005. Levin, Doron P. Behind the Wheel at Chrysler: The Iacocca Legacy. New York: Harcourt Brace Jovanovich, 1995. Michael H. Burchett See also Chrysler Corporation federal rescue; Iacocca, Lee.
■ Minorities in Canada Definition
Racial, ethnic, cultural, and linguistic segments of the Canadian population
During the 1980’s, Canada strove to safeguard the rights and cultural heritage of its minority populations. The Charter of Rights and Freedoms guaranteed the right of both English and French speakers to be educated in their native language, even if they were in the linguistic minority in their province, while the Official Multiculturalism Act sought to recognize and promote the various cultures and heritages contributing to Canada’s national identity. In 1988, Canada adopted a policy of official multiculturalism. The Official Multiculturalism Act defined a framework by which to provide all Canadians equal access to economic, social, and political institutions. Canada’s evolution into a multicultural society had been gradual: The initial population of the nation was predominantly British, French, and aboriginal. After World War I, Ukrainians and Germans migrated to Canada, and they were followed by Italians and Hungarians after World War II. Later, less restrictive immigration policies opened Canada’s doors to groups from all over the world, creating an even more diverse population. The Official Multiculturalism Act, then, recognized and reinforced the existing diversity of the nation and defined institutional norms designed to preserve and promote that diversity. Earlier in the decade, the French Québécois and aboriginal, or First Nations, minorities were given major recognition with the implementation of the Constitution Act, 1982, including the Canadian Charter of Rights and Freedoms. For the Québécois, the French language was accorded equal status with the English language, making both the official languages of Canada. Furthermore, a section according minority language rights in education guaranteed children in the linguistic minority of their province the right to an education in their primary language. Meanwhile, the 1982 act formally defined aboriginal peoples of Canada as Indian, Inuit, and Métis. As part of these changes, aboriginal leaders were guaranteed participation in political and constitutional discussions affecting their peoples. Canada’s diverse population often renders political underpinnings complex, particularly in terms of issues of equality and fairness. For instance, Canada is multinational, meaning the nation incorporates groups that had previously governed themselves, such as the Québécois and First Nations peoples. It is also polyethnic, meaning that people of various ethnicities have entered the population by immigrating from other countries. Thus, the adoption of multicultural policies was seen in the 1980’s as a matter of necessity in order to ensure the full and proper incorporation of Canada’s diverse citizenry into the
social, political, cultural, and economic life of the nation. Multiculturalists endorsed these policies as embracing a politics of recognition, highlighting racial, linguistic, and other differences to ensure equality for all Canadians. Multicultural Citizenship
Policymakers were particularly concerned to promote multiculturalism, because Canadian society and its institutions were historically inexperienced in dealing equitably with ethnic groups, whether visible minority groups or white ethnic minorities. Initially, as both European and non-European migration brought people to Canada who did not conform to Anglo-Saxon culture and physical appearance, certain groups experienced systemic discrimination that hindered their economic, social, and political progress. Thus, having implemented an immigration policy encouraging diversity, policymakers acknowledged a need to accommodate that diversity. Equally important, the province of Quebec contained a majority of French Canadians who sought recognition of the distinct nature of their society within Canada. Quebec had refused to ratify the new constitution in 1982, and incorporating the majority French province fully within the majority English nation became a high priority in the 1980’s. Canada’s government sought to give both the public and private sectors a model for policies designed to produce multicultural equality. Scholars suggested that equality was an important context within which to advocate for the recognition of difference. Such recognition alone could be detrimental to a vital national culture, but when it was understood to be in the service of equality it took on a constructive role, strengthening Canadian identity and society.
Impact Canada’s multicultural policies were not universally embraced. Indeed, they polarized Canadian politics, as a debate raged over the proper relationships among the nation, the provinces, and their minority populations. Opponents of official multiculturalism argued that nations such as France, the United States, and the United Kingdom were able to achieve equality of treatment without officially sanctioning cultural diversity. They argued that Canadian institutions such as the courts, the police, and the health care system were already equipped to deal equitably with Canada’s populace without instituting official multicultural policies. Furthermore, they claimed that the Canadian system faced a grave danger if it followed ideals of multicultural citizenship, arguing that multiculturalism was divisive rather than unifying. Many suggested that equality could not be guaranteed merely by recognizing culture as a point of reference. Proponents, meanwhile, supported the policy by referring to Canada’s history, recalling a time when race and language were used to hinder equality of progress within society. They feared that such oppression could continue without an official recognition of the value of difference to Canada’s well-being as a nation. Further Reading
Abu-Laban, Yasmeen, and Daiva Stasiulis. “Constructing ‘Ethnic Canadians’: The Implications for Public Policy and Inclusive Citizenship.” Canadian Public Policy 26, no. 4 (December, 2000): 477-487. Bissoondath, Neil. Selling Illusions: The Cult of Multiculturalism in Canada. Toronto: Penguin Books, 1994. Mackey, Eva. The House of Difference: Cultural Politics and National Identity in Canada. Toronto: University of Toronto, 2002. McRoberts, Kenneth. “Quebec: Province, Nation, or ‘Distinct Society?’” In Canadian Politics in the Twenty-first Century, edited by M. Whittington and G. Williams. Scarborough, Ont.: Nelson Thomson Learning, 2000. Esmorie J. Miller See also
Aboriginal rights in Canada; Canada Act of 1982; Canada and the British Commonwealth; Canadian Charter of Rights and Freedoms; Education in Canada; Immigration to Canada; Multiculturalism in education; Trudeau, Pierre.
■ Miracle on Ice The Event
The U.S. ice hockey team defeats the Soviet team in the 1980 Olympics Date February 22, 1980 Place Lake Placid, New York The U.S. ice hockey team was a lightly regarded squad, consisting largely of current and former college players, yet its stellar play against more experienced teams, highlighted by its stunning upset of the highly favored team from the Soviet Union, led to one of Olympic hockey’s most unlikely victories.
The U.S. hockey team celebrates its miraculous victory over the Soviet Union in the 1980 Winter Olympics. (AP/Wide World Photos)
The U.S. hockey team had won only a single Olympic medal since 1960 going into the 1980 Winter Olympics, and the 1980 squad appeared to have little chance of adding to that total. The team had lost to the Soviet Union 10 to 3 in a pre-Olympic exhibition game, and it was placed in a preliminary pool with the highly regarded teams from Sweden and Czechoslovakia. In its opening match, however, it stayed close to the Swedes throughout the contest and scored a goal in the game’s final minute to salvage a 2-2 tie. Two days later, the United States surprised the Czechs in a lopsided 7-3 victory, then went on to complete pool play with relatively easy wins over Norway, Romania, and West Germany. Though the team had played well throughout the tournament under the leadership and inspiration of coach Herb Brooks, its first match in the medal
round, on February 22 against the Soviet Union, was not expected to be close. The Soviets had won four straight gold medals and had beaten their opponents in pool play by a combined score of 51 to 11. However, the first period of the U.S.-Soviet match ended with the score tied at two goals apiece after a last-second goal by the United States. The Soviet coach responded by pulling his star goalie from the game, replacing him with the backup goalie. The change helped shut down the U.S. squad until nearly halfway through the third period, when a U.S. goal tied the score at 3 to 3. Less than two minutes later, another goal gave America its first lead of the game, and the U.S. defense, led by goalie Jim Craig, held off a furious Soviet assault to complete the stunning 4-3 victory, prompting sportscaster Al Michaels’s famous comment, “Do you believe in miracles? Yes!”
Two days later, still needing a victory over Finland to secure the gold medal, the United States once again came from behind in the final period to earn a 4-2 win and achieve its surprising Olympic victory. Impact The year 1980 was a difficult one for the United States. A deep recession had taken a toll on the nation’s economy, and the Iranian hostage crisis and the Soviet invasion of Afghanistan had left the nation feeling helpless and vulnerable in a dangerous world. In what would be the first of several incidents of the Cold War bleeding into the athletic arena during the 1980’s, the U.S. team’s victory restored a large measure of national pride to America’s damaged psyche. Further Reading
Bernstein, Ross. America’s Coach: Life Lessons and Wisdom for Gold Medal Success—A Biographical Journey of the Late Hockey Icon Herb Brooks. Eagan, Minn.: Bernstein, 2006. Coffey, Wayne. The Boys of Winter: The Untold Story of a Coach, a Dream, and the 1980 U.S. Olympic Hockey Team. New York: Crown, 2005. Powers, John, and Arthur C. Kaminsky. One Goal: A Chronicle of the 1980 U.S. Olympic Hockey Team. New York: Harper & Row, 1984. Devon Boan See also
Cold War; Hockey; Olympic Games of 1980; Soviet Union and North America; Sports.
■ Missing and runaway children Definition
Abducted children or children who run away from home
The issue of missing and runaway children became widely identified as a growing problem as print and electronic media covered stories of these children during the 1980’s. Individuals who endured personal and sometimes tragic accounts of missing children became vocal advocates and helped to put a public face on missing children by mounting a campaign to put missing children’s photos on milk cartons at the supermarkets. Public sympathy for victims and families created political pressure. “Missing children” is a term for a societal problem that gained popular attention following Senate hearings on exploited children in 1981 and 1982.
Missing children were categorized as children missing through abduction by family, usually parental abduction during or after divorce proceedings; children missing through abduction by strangers; runaways, children who either elect or are forced through some family situation to leave their home and parental care; throwaways, usually older children who are abandoned by parents or kicked out of a home; and the lost, injured, or otherwise missing. Parental and familial abduction issues gained traction in the media and public eye in the early 1980’s. Georgia K. Hilgeman’s daughter was abducted by her former husband and taken to Mexico. The experience motivated Hilgeman to establish the nonprofit Vanished Children’s Alliance in 1980. In response to the growing number of international parental abduction cases, in the same year the Hague Convention on the Civil Aspects of International Child Abduction established protocols for dealing with custody issues that crossed international boundaries. The U.S. Congress enacted the Parental Kidnapping Prevention Act, which assured that full faith and credit was given to child custody determinations. In 1981, six-year-old Adam Walsh was abducted from a shopping mall in Hollywood, Florida. John Walsh, Adam’s father, mounted a vigorous media campaign to locate his son. Adam was found murdered in another part of Florida two weeks after his disappearance. John and Revé Walsh established the Adam Walsh Child Resource Center in Florida and lobbied to help pass the Missing Children Act of 1982. The media exposure helped transform the issue of missing children into a national cause. John Walsh became a spokesman for victims’ rights and a television personality as host of America’s Most Wanted, which first aired in 1988. The emotional appeal of missing children sold magazines and newspapers and brought millions of viewers to local television news, talk shows, and prime-time programming. The public primarily identified missing children as victims of parental or stranger abduction. News media attention, along with comprehensive statistics on missing children, motivated politicians to pass the Missing Children’s Assistance Act of 1984, which established a national toll-free telephone line for missing children and a national resource center and clearinghouse, the National Center for Missing and Exploited Children (NCMEC) in Washington, D.C.
Impact The missing children campaign fostered growth of nonprofit organizations with protocols for registration and tracking of missing and runaway children, and some children were reunited with parents and guardians. Informational guidelines for child safety were created for children and parents. Sadly, runaway children whose backgrounds reflected complicated social ills such as abuse, neglect, drug use, and family disintegration were a harder story to sell to the public, and society lost the opportunity for a discussion of these issues. Further Reading
Finkelhor, David, Gerald T. Hotaling, and Andrea Sedlak. Missing, Abducted, Runaway, and Thrownaway Children: First Report—Numbers and Characteristics National Incidence Study. Darby, Pa.: Diane, 1990. Forst, Martin L., and Martha-Elin Blomquist. Missing Children: Rhetoric and Reality. New York: Lexington Books, 1991. Kryder-Coe, Julee H., et al. Homeless Children and Youth: A New American Dilemma. New Brunswick, N.J.: Transaction, 1991. Walsh, John, and Susan Schindehette. Tears of Rage: From Grieving Father to Crusader for Justice—The Untold Story of the Adam Walsh Case. New York: Pocket Books, 1997. Nancy Meyer See also AIDS epidemic; America’s Most Wanted; Crime; Lucas, Henry Lee; Tabloid television; Television.
■ Mommy track Definition
Nickname for a slower, lower-paid career path tailored to women who combined motherhood with employment
The mommy track controversy increased awareness in corporate America of the value of women employees and of the need to create innovative strategies to use all workers’ talents by accommodating career and family demands for both men and women. Prior to the 1980’s, the baby boom of the 1950’s had created large entry-level labor pools, and corporations seeking to fill executive positions almost always hired men who graduated from prestigious universities in the top 10 percent of their class. With the post-boom drop in birth rates, the potential male workforce declined as well, leading corporations to recruit and train women for managerial positions. Furthermore, women were obtaining more degrees from top-tier universities and were entering the workforce in greater numbers. Women generally cost more to employ in managerial positions than do men, because, in addition to bearing children, they are more likely to bear the brunt of raising children and taking care of family members than are men. These familial obligations cause women to experience more career interruptions and greater job turnover than do men. Consequently, time and money invested in recruiting and training women produce fewer corporate executives on average than do comparable resources spent on men. To maximize productivity and recognize talented, creative women, 1980’s corporations had to consider how to recruit and retain women while minimizing the costs resulting from women’s biological and social roles. In 1989, Felice N. Schwartz proposed constructing more flexible work environments to balance the personal and professional obligations of women, thus maximizing the chances of success for corporate women. In an article published in the Harvard Business Review intended for a corporate readership, Schwartz asserted that businesses increasingly depended on women workers. She classified these workers into two types: “career primary” women, who focused on professional advancement over family in a fashion similar to most male employees, and “career and family” women, who sacrificed some career growth and compensation in order to spend more time fulfilling family commitments. Many women entering middle management, Schwartz contended, wanted accommodations to help balance their familial and career responsibilities. Schwartz proposed judging and rewarding “career primary” women on the same criteria as men, while allowing flexibility for “career and family” women. For example, employers could offer job-sharing, part-time work, and child-care support to maximize productivity while accommodating these working mothers. Opportunity or Insult?
Although Schwartz never used the term “mommy track” in her article, the media quickly applied the term to the “career and family” path she proposed, and controversy over the concept of a mommy track swept across the United States. One perspective viewed the mommy track as creating an ideal situation, allowing women employment in rewarding professions while at the same time providing them with time for family. The mommy track in this view was an opportunity, permitting women choice in employment. Others believed that the very concept of a mommy track demeaned women’s work, demonstrated a throwback to attitudes of the 1950’s, and served as another term for sex discrimination in the workplace. Some feminists, in particular, sharply criticized the mommy track, claiming it reinforced the stereotype of men as primary familial breadwinners and perpetuated the idea that women could not have both career and family.
Schwartz’s Perspective The criticism her article received surprised Schwartz, and perhaps rightly so. As the founder and president of Catalyst, a firm designed to further career options for women, Schwartz had devoted her life to advocating for women in the corporate world. Schwartz dismissed the term “mommy track” as useless and claimed her work was intended to focus on ways corporations could retain women employees and remove barriers to their productivity. She supported flexibility in the workforce and responsiveness to family demands to maximize women’s potential. She believed that identifying women as “career primary” and “career and family” would allow corporations to protect their investments in outstanding women employees.
Felice N. Schwartz in January, 1990. Schwartz became associated with the term “mommy track,” although she never used it in her article advocating greater flexibility for female professionals. (AP/Wide World Photos)
Impact The popular media debate over the idea of a mommy track represented one episode in a general cultural discussion during the 1980’s over the proper roles of women, both in the workforce and in society at large. The debate demonstrated that women in the 1980’s had become firmly entrenched in the workforce: The question was no longer whether to incorporate them into the public sphere, but how to do so most effectively, both from the standpoint of employers and from the standpoint of women employees. However, the debate also demonstrated the extent to which this question raised issues that were still extremely sensitive in American society, relating to the cultural expectations that all women become mothers, that all mothers be primary caregivers, and that career and motherhood— while combinable—were necessarily in competition with each other in American women’s lives. These expectations remained sources of great contention throughout the decade. Further Reading
Castro, Janice. “Rolling Along the Mommy Track.” Time 133, no. 13 (March, 1989): 72. Addresses reactions to the idea of mothers choosing between career and family through a two-tiered workforce identifying women as mothers or achievers. Ehrlich, E. “Is the Mommy Track a Blessing or Betrayal?” Business Week, no. 3105 (1989): 98-99. Provides reaction to an initial Business Week article proposing two career paths for women. Offers
some examples of how businesses accommodated corporate mothers. “Harvard Business Review” on Work and Life Balance. Boston: Harvard Business School Press, 2000. Examines the balance between personal and professional lives, including the mommy track, telecommuting, and burnout. Quinn, Jane Bryant, et al. “Revisiting the Mommy Track.” Newsweek 136, no. 3 (July, 2000): 44. Discussion of the trend of young women choosing motherhood over career. Raabe, Phyllis Hutton. “The Organizational Effects of Workplace Family Policies: Past Weaknesses and Recent Progress Toward Improved Research.” Journal of Family Issues 11, no. 4 (1990): 477-491. Demonstrates how implementing mommy track policies leads to improved earnings and achievement of women, as well as reducing stress in Western Europe and the United States. Schwartz, Felice. “Management Women and the New Facts of Life.” Harvard Business Review 67 (January/February, 1989): 65-76. Controversial article outlining a two-tier track for women in managerial positions. Schwartz, Tony. “In My Humble Opinion: While the Balance of Power Has Already Begun to Shift, Most Male CEO’s Still Don’t Fully Get It.” Life/work 30 (November, 1999). Written from the perspective of time and distance, Felice Schwartz’s son reexamines his mother’s original proposal, which fueled the mommy track controversy. Skrzycki, Cindy. “‘Mommy Track’ Author Answers Her Many Critics.” The Washington Post, March 19, 1989, p. A1. Reports Schwartz’s response to her criticism. Documents her feminist work in the corporate world. Barbara E. Johnson See also
Affirmative action; Biological clock; Business and the economy in the United States; Feminism; Glass ceiling; Political correctness; Sexual harassment; Women in the workforce.
■ Mondale, Walter Identification
U.S. vice president from 1977 to 1981 and 1984 Democratic presidential nominee Born January 5, 1928; Ceylon, Minnesota Mondale was one of the most visible leaders of the Democratic Party in the first half of the decade. His choice of Geraldine Ferraro as his 1984 vice presidential running mate was both a practical and a symbolic milestone in the rise of American women politicians. Walter Frederick Mondale (nicknamed “Fritz”) was a member of the Democratic-Farmer-Labor Party, a composite party unique to Minnesota. Mondale first became involved in national politics when he helped organize Hubert Humphrey’s successful senatorial campaign in 1948 at the age of twenty. After serving in the Army during the Korean War, Mondale graduated from the University of Minnesota Law School in 1956 and practiced law for four years. In 1960, Minnesota governor Orville Freeman appointed Mondale—who had managed Freeman’s 1960 gubernatorial campaign—as Minnesota’s attorney general. From age thirty-two to age thirty-six, Mondale served two terms as attorney general. In 1964, Mondale was appointed to the U.S. Senate, when Hubert Humphrey became U.S. vice president. Mondale was reelected in 1972. Vice President
When Jimmy Carter became the Democratic presidential nominee in 1976, he selected Mondale as his vice presidential running mate. Elected in November, 1976, Mondale became the first vice president to live in the official vice presidential residence, a converted house on the grounds of the old Naval Observatory. With Carter's support, Mondale became the most active vice president to that point in American history, troubleshooting executive offices and functions and advising the president. He helped change the vice presidency from a figurehead office into a full-fledged participant in the presidential administration, making it possible for subsequent vice presidents to play a much larger political role as well. Not only was Mondale the first vice president to live in a formal vice presidential residence, but he also had an office in the West Wing of the White House, indicating his substantive role in the government. Mondale became a major proponent of Carter's
foreign and domestic policies, traveling across the United States and the globe in this role. Carter and Mondale won the 1980 Democratic presidential and vice presidential nominations, albeit with more difficulty than was usual for incumbents. They were defeated by Ronald Reagan and George H. W. Bush in a three-way race, in which independent presidential candidate and former Republican congressman John Anderson may have siphoned off enough votes from the Carter-Mondale ticket to give the election to the Republicans.
Presidential Nominee Mondale practiced law for the next few years and positioned himself for a presidential run in 1984. The early front-runner for the Democratic nomination, he gained a majority of the delegates before the Democratic National Convention, defeating both the Reverend Jesse Jackson and Senator Gary Hart of Colorado convincingly after a spirited campaign. The highlight of the convention was Mondale's selection of a New York congresswoman, Geraldine Ferraro, as his running mate. Mondale seemed determined to set a precedent with his choice, and he succeeded, since Ferraro was the first woman to receive a major party vice presidential nomination. Presumably, this decision should have increased the percentage of women who voted for the ticket. Unfortunately, Ferraro's liabilities outweighed the benefits. Although the ticket gained the support of many women, it did not receive a majority of the 1984 women's vote. As a Roman Catholic, Ferraro was attacked by the Church for her pro-choice stance. She also damaged her credibility when she waffled on a promise to release her husband's tax returns. Mondale took liberal positions in the campaign, endorsing a nuclear weapons freeze and passage of the Equal Rights Amendment. Attempting to gain strength from candor, Mondale said he would raise taxes and asserted that incumbent President Ronald Reagan would be forced to do likewise. Such a liberal stance provided no way to divert strength from a popular incumbent who appeared strong on national defense and responsible for economic prosperity. Mondale was a skilled debater and managed to do very well in the first debate, planting doubts about Reagan's ability to govern, since Reagan was the oldest person ever to serve as president. Reagan regained momentum in the following debate with his memorable line, "I will not make age an issue of
this campaign. I am not going to exploit, for political purposes, my opponent's youth and inexperience." In the election, Mondale was defeated in a landslide, winning less than 41 percent of the popular vote and carrying only the District of Columbia (which has always voted Democratic) and his home state of Minnesota (by a few thousand votes), thereby garnering only 13 electoral votes to Reagan's 525. The Democrats sustained the worst electoral result in their history. No major-party candidate had fared worse since Republican Alf Landon lost to Democrat Franklin D. Roosevelt in 1936.
Impact Mondale's lasting legacy was the result of his choosing a female running mate. Subsequently, there was an upsurge in women candidates and elected officials in the United States and a general
Presidential candidate Walter Mondale, left, and running mate Geraldine Ferraro wave to their welcoming party as they arrive in South Lake Tahoe, California, in mid-July, 1984. (AP/Wide World Photos)
tendency for the Democratic Party to benefit from the "gender gap," with women being more likely to vote for Democrats than Republicans.
Further Reading
Forest, John. Warriors of the Political Arena: The Presidential Election of 1984. New York: Vantage Press, 1986. Useful account of the 1984 presidential campaign. Gillon, Steven M. The Democrats' Dilemma: Walter F. Mondale and the Liberal Legacy. New York: Columbia University Press, 1992. Important scholarly analysis of the problems the Democratic Party had in the 1980's competing against the highly successful national political agenda set by Reagan and Bush. This book may have helped Bill Clinton adjust the Democrats' agenda in order to win in 1992. Mondale, Walter. The Accountability of Power: Toward a Responsible Presidency. New York: David McKay, 1975. Written for his 1976 presidential campaign, this work by Mondale sets out his stand on national issues, which later helped set the Democratic agenda of the mid-1980's. Ranney, Austin. The American Elections of 1984. Durham, N.C.: Duke University Press, 1985. Leading political scientist analyzes the 1984 election, considering the impact of the Mondale candidacy on the congressional and gubernatorial contests. Witt, Linda, Karen M. Paget, and Glenna Matthews. Running as a Woman: Gender and Power in American Politics. New York: Free Press, 1993. Important scholarly work, including material on Ferraro's vice presidential bid.
Richard L. Wilson
See also
Cold War; Elections in the United States, 1980; Elections in the United States, 1984; Ferraro, Geraldine; Hart, Gary; Liberalism in U.S. politics; Reagan, Ronald; Reagan Democrats.
■ Montana, Joe Identification American football player Born June 11, 1956; New Eagle, Pennsylvania
Montana was the most successful quarterback in the NFL during the 1980’s, leading the San Francisco 49ers to seven Western Division titles, five NFC championships, and four Super Bowl titles. Montana is the only player in NFL history to win three Super Bowl most valuable player awards.
San Francisco 49er Joe Montana runs out of the pocket while he looks downfield for a receiver. (Hulton Archive/Getty Images)
A third-round pick in the 1979 National Football League (NFL) draft, Joe Montana became the San Francisco 49ers’ starting quarterback ten games into the 1980 season. In a late-season game against the New Orleans Saints, Montana rallied the 49ers from a 35-7 deficit to a 38-35 overtime victory. It was the first of many late-game rallies Montana would engineer for the 49ers, feats that earned him the nickname “Joe Cool.” In 1981, Montana led the 49ers to a Super Bowl victory. More impressive even than his Super Bowl performance—for which he won the most valuable player (MVP) award—was Montana’s showing in the National Football Conference (NFC) Championship game. With the 49ers trailing the Dallas Cowboys 26 to 21 with less than five minutes remaining,
Montana directed an eighty-nine-yard drive capped by the most famous play in 49ers history—"the Catch"—a six-yard touchdown pass from Montana to Dwight Clark. Following a dismal 1982 season and a disappointing 1983 season that ended in a loss to the Washington Redskins in the NFC Championship game, Montana and the 49ers returned to the Super Bowl in 1984, defeating the Miami Dolphins 38 to 16 on the strength of another Montana MVP performance: He completed twenty-four of thirty-five attempted passes, throwing for a Super Bowl-record total 331 yards. Following the 1988 and 1989 regular seasons, the 49ers won two more Super Bowl titles. The game capping the 1988 season (played January 22, 1989) saw more Montana magic, as Joe Cool orchestrated the most dramatic drive in Super Bowl history. Trailing the Cincinnati Bengals 16 to 13, Montana and the 49ers had just over three minutes to move the ball from the San Francisco eight yard line to within field goal range. The team could then kick a field goal, tie the game, and force it to go into overtime. Eleven plays later, the 49ers were celebrating not a game-tying field goal but a game-winning touchdown pass, completed from Montana to John Taylor with thirty-four seconds remaining. The Super Bowl following the 1989 season (played January 28, 1990) brought Montana a third MVP trophy. (Jerry Rice was Super Bowl MVP in 1988.) Completing twenty-two of twenty-nine attempted passes, including five for touchdowns (a Super Bowl record), Montana led the 49ers to a 55-10 victory over the Denver Broncos.
Impact The 1980-1989 San Francisco 49ers were more than just Joe Montana's team. The team had a great coach in Bill Walsh, a gifted running back in Roger Craig, and star receivers in Jerry Rice and Dwight Clark. Montana, nevertheless, was the embodiment of NFL success in the 1980's. Not only did he win numerous awards (Offensive Player of the Year in 1989; Comeback Player of the Year in 1986; MVP of the league in 1989), but he also helped make the 49ers only the second team in NFL history (after the Pittsburgh Steelers of the 1970's) to win four Super Bowls.
Further Reading
Barber, Phil, and John Fawaz. NFL’s Greatest: Pro Football’s Best Players, Teams, and Games. New York: DK, 2000.
Italia, Bob. Joe Montana. Edina, Minn.: Abdo & Daughters, 1992. Montana, Joe. Joe Montana's Art and Magic of Quarterbacking. New York: Henry Holt, 1997.
Matt Brillinger
See also
Elway, John; Football; Rice, Jerry; Sports; Taylor, Lawrence.
■ Moonlighting Identification Television comedic drama series Creator Glenn Gordon Caron (c. 1954) Date Aired from March 3, 1985, to May 14, 1989
Combining the silliness of sitcom comedy, the suspense of a detective series, and sexual tension between the two main characters, Moonlighting was an innovative and critically praised television series. The show, with its mix of comedy and drama, produced a new television genre: the "dramedy."
Moonlighting was created and written by Glenn Gordon Caron for former high-fashion model turned actress Cybill Shepherd. The premise of the show was that Shepherd's character, Maddie Hayes, a successful high-fashion model, had retired from modeling, only to discover she had been cheated out of all her assets except her home and the Blue Moon Detective Agency. The unsuccessful agency employed only a secretary and one detective, the wisecracking, street-smart David Addison. Addison was played by Bruce Willis, an unknown selected from more than three thousand actors who had tried out for the part. The initial conflict between the strait-laced, by-the-book management style of Hayes, determined to make the agency a success, and the laid-back, instinctive style of Addison quickly developed into sexual tension. Introduced by the American Broadcasting Company (ABC) as a midseason replacement in March, 1985, the series tied for twentieth place in the Nielsen ratings that year. In the 1986-1987 season, it rose to ninth place and was nominated for a number of awards for both comedy and drama. Willis won an Emmy for Outstanding Lead Actor in a Drama, as well as a Golden Globe for Best Actor in a Comedy/Musical, both in 1987. Shepherd won Golden Globes in 1986 and 1987 for Best Actress in a Comedy/Musical.
Critics praised the series for its innovations. Characters, particularly Addison, would often "break the fourth wall" by speaking directly to the audience. One episode was filmed in black and white, adopting a film noir style; another was a feminist take on Shakespeare's The Taming of the Shrew, in which the characters dressed in Elizabethan costumes and spoke in iambic pentameter. Playing to a media-conscious audience, the content of the episodes was full of references to 1980's pop culture. Although many television viewers loved the variety of visual and sound techniques employed in the series, they were more attuned to Maddie and David's relationship, wondering when they would, to use Addison's words, "get horizontal." The March 31, 1987, episode, titled "The Big Bang," was announced by a half-page ad in TV Guide, stating "No more between the lines. Tonight's between the sheets." An estimated sixty million viewers tuned in, surpassing the numbers for the Academy Awards, but the episode marked the beginning of a downward spiral for the show. Caused in part by the dissipation of the sexual tension between its stars, the show's decline also resulted from scheduling problems caused by Shepherd's pregnancy, as well as the departure of series creator Caron.
Impact Unlike most other quality television shows of the 1980's, Moonlighting did not employ a large ensemble cast. The show centered on the relationship of the two major characters, whose fast-talking, wisecracking repartee enchanted its audience. As both an hour-long comedy and a drama, its fun and serious nature paved the way for "dramedies" of the future.
Further Reading
Joyrich, Lynne. Re-viewing Reception: Television, Gender, and Postmodern Culture. Bloomington: Indiana University Press, 1996. Thompson, Robert J. Television's Second Golden Age: From "Hill Street Blues" to "ER." Syracuse, N.Y.: Syracuse University Press, 1997. Williams, J. P. "The Mystique of Moonlighting." Journal of Popular Film and Television 16, no. 3 (1988): 90-99.
Marcia B. Dinneen
See also
Cagney and Lacey; Feminism; Glass ceiling; Television.
■ Moral Majority Identification
Christian conservative political organization
Date 1979-1989
The Moral Majority was one of the first political organizations formed for the purpose of campaigning for the election of political candidates who espoused Christian conservatives' social values.
Founded in 1979 by the Reverend Jerry Falwell, the Moral Majority was dedicated to promoting Christian conservative concepts of morality and social responsibility. The Moral Majority served as a national headquarters, providing direct mailing lists and other information services to conservative organizations and individuals throughout the United States. In 1980, it distributed the Family Issues Voting Index, rating candidates on their support for what it called "family values." The organization was credited with helping ensure the election of Republican candidate Ronald Reagan to the presidency in 1980 and 1984. Falwell, founder of Thomas Road Baptist Church and Liberty University, in Lynchburg, Virginia, served as public spokesman for the Moral Majority. He used his nationally syndicated weekly television program, The Old-Time Gospel Hour, to solicit support for conservative candidates from its twenty-five million viewers. By 1981, Thomas Road Baptist Church's weekly services were broadcast on 392 television stations and 600 radio stations. Another strong voice for the Moral Majority was the Reverend Pat Robertson, founder of the Christian Broadcasting Network (CBN) and the American Center for Law and Justice. Robertson and fellow televangelist James Robison joined in the efforts to get Christian conservatives elected to office. Robertson mounted a campaign for the presidency himself in 1988, in which he won several state caucuses. Other Moral Majority supporters included brewery magnate Joseph Coors; Ed McAteer and Bob Billings of the Religious Roundtable; Reed Larson of the Right to Work lobby; Connie Marschner, leader of the National Pro-Family Coalition; Phyllis Schlafly, president of Eagle Forum and STOP-ERA; and Congressman Larry McDonald of the John Birch Society. The Council for National Policy, founded in 1980 by Baptist minister Tim LaHaye, cooperated with the Moral Majority. Contrary to some critics who characterized the Moral Majority as a fundamentalist organization,
the movement was ecumenical in scope. Among its adherents were Catholics, Jews, Mormons, Evangelicals, and mainline Protestants. Issues The Moral Majority campaigned primarily in support of a “pro-family,” pro-life, anti-big government, and anticommunist agenda. Members lobbied to overturn the U.S. Supreme Court’s decision in Roe v. Wade (1973) and to outlaw abortion. The organization promoted its vision of the proper American family, advocating two-parent homes for children and deploring the frequency of divorce and the increasing rate of cohabitation without marriage. Members criticized the federal welfare system, claiming that it encouraged promiscuity and the parental abandonment of family responsibilities. They blamed the high rate of taxation for forcing women out of the home and into the workplace. They lobbied against the Equal Rights Amendment (ERA) to the U.S. Constitution, and the failure of the states to ratify the amendment was attributed directly to the Moral Majority’s opposition. Moral Majority adherents accused the federal government of using welfare programs to convert public schools from educational into socialization institutions and blamed big government for the decline in the quality of public education. However, they advocated the teaching of creationism along with the theory of evolution in public schools. They opposed the involvement of homosexuals in teaching children and the recognition of homosexual unions as marriage. They campaigned for the inclusion of abstinence and moral values in all sex education classes taught in public schools. The Moral Majority attacked court decisions that limited prayer in public schools and ordered the removal of Christian religious symbols from public facilities. It lobbied for the appointment of more conservative judges to all federal courts, especially the Supreme Court. In foreign affairs, the organization adopted pro-Israel and anticommunist positions and favored a strong national defense policy. In 1980, it lobbied against the U.S.-Soviet Strategic Arms Limitation Treaties (SALT), aimed at limiting nuclear weapons and partial disarmament. The Moral Majority encountered strong opposition when it proposed censorship of print and electronic media that purveyed pornography and what it labeled an “anti-family” agenda. Purveyors of pornography reacted against the Christian movement
by filing lawsuits and attacking the character of Christian leaders. For example, Larry Flynt, publisher of Hustler Magazine, attacked Jerry Falwell and the Moral Majority with ads in his magazines and filed a lawsuit against Falwell. In turn, Falwell sued Flynt for defamation of character. The case, Hustler Magazine v. Falwell (1988), was appealed all the way to the Supreme Court, where Flynt was victorious, as the Court declared that parodies of public figures were protected by the First Amendment.
Impact Before its dissolution in 1989, the Moral Majority became the largest conservative lobby in the United States. At its peak, it was supported by more than 100,000 clergy and had unified 7 million laypeople in political action for conservative candidates and issues. Upon its demise, it was succeeded by the Christian Coalition, which built upon the Moral Majority base to further conservative causes and candidates in national politics during the 1990's.
Further Reading
Bromley, David G., and Anson Shupe, eds. New Christian Politics. Macon, Ga.: Mercer University Press, 1984. Contains essays on the Moral Majority and the sources of its social and political support. Bruce, Steve. The Rise and Fall of the New Christian Right: Conservative Protestant Politics in America, 1978-1988. Oxford, England: Clarendon Press, 1988. Links the rise of the Moral Majority with the New Christian Right political movement and the Christian Coalition. Smolla, Rodney A. Jerry Falwell v. Larry Flynt: The First Amendment on Trial. Chicago: Chicago University Press, 1990. Defines the issues of the trial and presents Flynt's testimony in his deposition and in the trial. Wilcox, Clyde. Onward Christian Soldiers? The Religious Right in American Politics. 2d ed. Boulder, Colo.: Westview Press, 2000. Discusses the founding of the Moral Majority, its major political issues, and its impact on the national elections of 1980 and 1984. Wilcox, Clyde, Matthew DeBell, and Lee Sigelman. "The Second Coming of the New Christian Right: Patterns of Popular Support in 1984 and 1996." Social Science Quarterly 80, no. 1 (March, 1999): 181-192. Presents comparative analyses of public support for the Moral Majority and the Christian Coalition.
Marguerite R. Plummer
See also
Conservatism in U.S. politics; Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988; Falwell, Jerry; Hustler Magazine v. Falwell; Reagan, Ronald; Religion and spirituality in the United States; Robertson, Pat; Supreme Court decisions; Televangelism; Women’s rights.
■ Mothers Against Drunk Driving (MADD) Identification
Grassroots organization working to reduce drunk driving
Date Founded in 1980
In response to an increasing number of deaths, particularly among adolescents, MADD was founded to educate the public about the dangers of drunk driving and to advocate for stronger drunk driving laws. Initially, the group focused the bulk of its efforts on raising the national minimum drinking age to twenty-one; those efforts succeeded in 1984.
Candy Lightner founded Mothers Against Drunk Driving (MADD) in 1980, following the death of her thirteen-year-old daughter Cari in Fair Oaks, California. Cari was walking to school when she was struck from behind by a drunk driver who had three prior drunk-driving convictions and was out on bail following a hit-and-run arrest two days earlier. Within the first year of operation, two MADD chapters were established in California and Maryland. One of MADD's first high-profile activities was meeting with members of Congress and the National Highway Traffic Safety Administration (NHTSA) in Washington, D.C., to advocate for stronger traffic safety laws and regulations. In 1982, President Ronald Reagan invited MADD to be part of the newly formed Presidential Commission on Drunk Driving. MADD supported a law sponsored by Representatives Jim Howard and Mike Barnes that set aside some federal highway funds to provide to states to support anti-drunk driving efforts. MADD also backed a law that established the first National Drunk and Drugged Driving Awareness Week in December of 1982. In 1983, the National Broadcasting Company (NBC) produced a made-for-television movie about MADD, significantly increasing the group's profile and resulting in the addition of more chapters. In 1984, one of MADD's first goals was realized when
President Reagan signed into law the bill raising the national drinking age to twenty-one. The organization would later file an amicus brief with the Supreme Court when the law's constitutionality was challenged. With the law passed, MADD branched out into other forms of advocacy and support, while maintaining its lobbying efforts on Capitol Hill. In 1987, the organization launched its national 1-800-GET-MADD hotline to provide victim support. In 1988, again with MADD's support, the Omnibus Anti-Drug Abuse Act extended the same compensation rights offered to victims of other crimes to all victims of drunk drivers.
Impact When MADD was founded in 1980, more than 28,000 people were dying each year in alcohol-related crashes. During the first five years following passage of the law raising the minimum drinking age, an estimated 5,491 lives were saved. By the end of the 1980's, the grassroots organization had grown to more than 330 chapters in forty-seven states. It stood as an example of the power of grassroots organizing to change both laws and behavior in the United States.
Further Reading
Jacobs, James. Drunk Driving: An American Dilemma. Chicago: University of Chicago Press, 1992. Kirk, Milo. Let Them Live: How Underage Drinking Affects Family and Friends, and Solutions to the Problem. Irving, Tex.: Mothers Against Drunk Driving, 1992. Ross, Laurence. Confronting Drunk Driving: Social Policy for Saving Lives. Chicago: University of Chicago Press, 1992.
Mary McElroy
See also
Drug Abuse Resistance Education (D.A.R.E.); Just Say No campaign.
■ Mötley Crüe Identification Heavy metal band Date Formed in 1980-1981
Heavy metal was a major musical force in much of the world in the 1980’s, and the most popular style of metal was glam (glamour) metal. Mötley Crüe did not originate the glam look or sound but played a major role in developing the glam scene, and the band’s widespread popularity ultimately influenced nearly every metal band that followed.
Four young men formed Mötley Crüe in the Los Angeles Sunset Strip music club scene in 1980 and 1981, calling themselves Nikki Sixx (Frank Feranna, bassist), Tommy Lee (Thomas Lee Bass, drummer), Mick Mars (Bob Deal, guitarist), and Vince Neil (Vincent Neil Wharton, vocalist). The band's wild live shows, which included setting Sixx on fire, attracted a dedicated following, and in December, 1981, the band released an album, Too Fast for Love, on their own label. Over sixteen thousand copies sold, attracting the attention of a major label, Elektra, which rereleased the album in 1982 to strong sales. This success was followed by Shout at the Devil (1983), which was soon certified platinum. Mötley Crüe's remaining 1980's albums were Theatre of Pain (1985), Girls, Girls, Girls (1987), and Dr. Feelgood (1989), all of which were quickly certified multi-platinum. Dr. Feelgood reached number one on the Billboard 200 chart. A major reason for Mötley Crüe's success was cable television channel MTV, which also launched in 1981. MTV became the primary venue for music videos, and its inclusion of heavy metal videos substantially expanded the audiences for such bands as Ozzy Osbourne, Van Halen, Bon Jovi, and Mötley Crüe. Mötley Crüe's look and sound earned them regular airplay on MTV, with their most requested video being "Home Sweet Home," from Theatre of Pain. In the mid- to late 1980's, heavy metal bands could be defined by their relationship to Mötley Crüe. Some bands, such as Poison, followed Mötley Crüe's lead. Some, such as Guns n' Roses, modified their approach to create a distinctive but related subgenre. Some, such as Metallica, rejected them, attempting to define their brand of heavy metal in opposition to Mötley Crüe's variety. Those groups that followed Mötley Crüe followed both their musical lead and their fashion sense, which in turn had been influenced by movies such as The Road Warrior (1981) and Blade Runner (1982). In addition, Mötley Crüe's members became poster boys for bad behavior. Drug and alcohol abuse were part of the band's image. Nikki Sixx overdosed on heroin and had to be resuscitated after his heart stopped. Sixx, Lee, and Neil all had serious vehicle accidents, and a passenger in Neil's car, Razzle, the drummer for the band Hanoi Rocks, was killed as a result of Neil's drunk driving.
Impact Mötley Crüe epitomized the excess and decadence of the United States in the 1980's. In
their world, there was always room for more drugs, more sex, more outlandish behavior, and louder music. Many bands imitated them, and many fans strove to follow their heroes' lead, shaping both the course and the mainstream popularity of heavy metal music.
Further Reading
Bukszpan, Daniel. The Encyclöpedia öf Heavy Metal. New York: Barnes & Noble, 2003. Christe, Ian. Sound of the Beast. New York: HarperEntertainment, 2003. Mötley Crüe, with Neil Strauss. The Dirt. New York: HarperCollins, 2001. Simmons, Sylvie, and Malcolm Dome. Mötley Crüe: Lüde, Crüde, and Rüde. Chessington, Surrey, England: Castle Communications, 1994.
Charles Gramlich
See also
Blade Runner; Bon Jovi; Drug Abuse Resistance Education (D.A.R.E.); Fashions and clothing; Guns n' Roses; Heavy metal; MTV; Music; Music videos; Osbourne, Ozzy; Pop music; Van Halen.
■ Mount St. Helens eruption The Event Disastrous volcanic explosion Date May 18, 1980 Place Mount St. Helens, in Washington State
The catastrophic eruption of Mount St. Helens reduced the height of the peak by thirteen hundred feet, devastated a blast zone of 230 square miles, exploded some 540 million tons of volcanic ash into the atmosphere, and claimed fifty-seven lives. The event shocked the public and led to a new era of scientific research into vulcanology.
A series of earthquakes in late March, 1980, caused seismologists at the University of Washington to issue a public alert that an eruption of Mount St. Helens could take place in the near future. The earthquakes increased in frequency, until the mountain shook almost constantly, making it probable that the volcano's 123-year dormancy was ending. On March 27, increased seismic activity, augmented by steam and ash eruptions, opened a crater in the glacier-covered peak. Officials at the United States Geological Survey (USGS) then issued a hazard warning to state and federal agencies. Expulsions of steam and ash continued in April, drawing public interest and
scientific concern. A measurable harmonic tremor, a signal of underground magma movement, indicated that a large-scale eruption might follow. In addition, an expanding bulge on the upper northern slope of the mountain created an avalanche hazard. Several weeks of relative quiet ensued, however, as small eruptions and bulge growth continued at a steady rate.
The Eruption Scientists, officials, and the public alike were shocked at the immensity of the May 18 cataclysm. Predictions about the nature and size of eruptions had been based on evidence of previous volcanic activity in the region, but the powerful lateral blast that caused the mountain to collapse was without precedent. At 8:32 a.m., an earthquake of 5.1 magnitude on the Richter scale shook the mountain, collapsing its northern flank and sending an avalanche surging into Spirit Lake at the base of the volcano and westward into the upper reaches of the North Fork Toutle River. The symmetrical peak of the mountain disintegrated into the largest debris avalanche in recorded history. The removal of the north slope allowed volcanic pressures below to explode in a pyroclastic surge that propelled rock, ash, and gases outward at speeds in excess of 650 miles per hour. The blast expanded laterally across the terrain, devastating the forest and its inhabitants in a fan-shaped
area of 230 square miles. Although blast temperatures neared 660 degrees Fahrenheit, the granular ash is thought to have asphyxiated, rather than burned, a large number of the victims. Within ten minutes of the initial earthquake, an anvil-shaped Plinian column, or ash plume, had erupted more than twelve miles into the stratosphere. Eruptions continued for nine hours, generating dense clouds of ash that spread at a rate of sixty miles per hour and produced noticeable amounts of fallout in eleven states. In areas downwind of the volcano in eastern Washington, daylight turned to darkness, and travelers were stranded by poor visibility and stalled automobile engines. Adding to the chaos at the foot of the volcano, pyroclastic activity had melted glaciers, creating lahars, or mudflows resembling liquid concrete, that inundated major drainage systems. The flood damaged, and in some cases destroyed, bridges, homes, and highways, as it made its way some seventy miles to the Columbia River. The shipping channel there was reduced by two-thirds, necessitating a three-month dredging operation.
[Diagram: Cutaway View of Mount St. Helens Scenario, with labels for the summit, ash clouds, ruptured side vent, internal bulge (old magma), fractured rock, new magma, landslides, magma blasted away, and tunnel (vent).]
Impact Although two hundred people in the immediate area survived the blast, fifty-seven lives were lost. More than $1 billion in property and economic losses resulted from the eruption, with erosion-control expenditures alone topping $600 million. Soil erosion resulting from the eruption was magnified by the slash-and-burn techniques that had been used in the region for timber salvage. The small North Fork Toutle River carried sediment downstream in quantities exceeding the load normally transported by the Amazon River. The Army Corps of Engineers worked to control flooding by dredging streambeds and building a system of dams, as loggers harvested over 850 million board feet of lumber from downed trees. The volcano itself continued to extrude magma intermittently, building a total of three domes inside the cauldron between 1980 and 1986. After the eruption, glacier growth resumed as well, reestablishing a potential flood hazard. Natural recovery within the blast zone surprised scientists, as pockets of surviving vegetation emerged and wildlife returned. The U.S. Forest Service and the timber industry planted over 250 million trees in a massive restoration effort. The Cascade Volcano Observatory, a regional office established by the USGS in 1980, monitored activity at Mount St. Helens in order to predict future eruptions more accurately and to provide for public safety. The 1982 founding of the Mount St. Helens National Volcanic Monument preserved a 110,000-acre area surrounding the volcano to encourage research, recreation, and education. Mount St. Helens became the most closely studied volcano on earth, leading to major advances in vulcanology.
Further Reading
Barstad, Fred. A Falcon Guide to Mount St. Helens: A Guide to Exploring the Great Outdoors. Guilford, Conn.: Falcon, 2005. Detailed coverage of the post-eruption terrain, hiking trails, and scenic views. Carson, Rob. Mount St. Helens: The Eruption and Recovery of a Volcano. Seattle: Sasquatch Books, 1990. Multiple photographs and accompanying text reveal the magnitude of the event. Dale, Virginia H., Frederick J. Swanson, and Charles M. Crisafulli, eds. Ecological Responses to the 1980 Eruption of Mount St. Helens. New York: Springer, 2005. Scientific reports describing the ecosystem's devastation and recovery. Harris, Stephen L. Fire Mountains of the West: The Cascade and Mono Lake Volcanoes. 3d ed. Missoula, Mont.: Mountain Press, 2005. Definitive coverage of volcanic geology, with chapters on individual volcanoes. Pringle, Patrick T. Roadside Geology of Mount St. Helens National Volcanic Monument and Vicinity. Olympia: Washington Department of Natural Resources, 1993. Authoritative description of the eruption and its aftermath. Numerous charts, graphs, and an introduction to vulcanology.
Margaret A. Koger
Mount St. Helens erupts again on July 22, 1980. (Austin Post/USGS)
See also
Air pollution; Environmental movement; Natural disasters; Science and technology.
■ MOVE
Identification
African American political organization
Several violent episodes between MOVE and Philadelphia's police force between 1978 and 1985 caused deaths on both sides, escalated racial tension, and created adverse national publicity for the city.
MOVE, most of whose members were surnamed Africa, was confrontational from the beginning. Founded in 1972 by John Africa (born Vincent Leaphart), the organization was based on religious values of communal living, agrarian economy, and radical environmentalism. Its members stockpiled weapons, performed bodily functions in public, taunted passersby, and created public health and safety hazards. Mayor Frank Rizzo ordered a blockade
A Philadelphia neighborhood burns after city authorities dropped a bomb on the MOVE headquarters on May 13, 1985. (AP/Wide World Photos)
of their home at 309 North 33rd Street in the Powelton Village section of West Philadelphia on March 16, 1978. Five months later, on August 8, 1978, a battle left police officer James Ramp dead and the home bulldozed. Nine MOVE members were convicted of murder and imprisoned. The remaining members found a new home at 6221 Osage Avenue in West Philadelphia. On August 8, 1984, MOVE marked the sixth anniversary of its first major battle with police by beginning construction to turn this home into a fortress. Neighbors, especially whites, became fearful. They asked the city for help. On May 13, 1985, Mayor Wilson Goode, himself an African American, ordered five hundred police officers—including special weapons and tactics (SWAT) teams armed with tear gas, machine guns, and heavy military gear—to surround the fortress and force MOVE to vacate the premises. MOVE refused to surrender. During the day-long siege, sporadic gunfire erupted from both sides.
In the late afternoon, Police Commissioner Gregore J. Sambor ordered an aerial attack. About 5:30 p.m., a Pennsylvania State Police helicopter dropped a powerful explosive on the roof. Only one adult, Ramona Africa, and one boy, Birdie Africa, survived. John Africa was among the six adults and five children killed in the attack. The fire from the bomb spread throughout the neighborhood, destroying or damaging about sixty homes. The attack, explosion, and aftermath were broadcast live on local television. Shortly after the national media received the story, Philadelphia was ridiculed as “the city that bombed itself.” Impact MOVE was a prominent theme in the urban folklore of Philadelphia even before the 1985 bombing. In 1983, local white rock band Beru Revue recorded a song about MOVE, “Be Careful Tonight.” After the bombing, several African American rappers and white punk bands nationwide mentioned
MOVE in their lyrics. Even with most of its members dead or imprisoned, MOVE remained an active organization and a cause célèbre. Journalist Mumia Abu-Jamal (born Wesley Cook), convicted of the December 9, 1981, murder of Philadelphia police officer Daniel Faulkner, continued in prison to write on MOVE's behalf. MOVE's intense polarizing effect between its African American supporters and white detractors has few parallels in the history of American race relations.
Further Reading
Halus, Eugene J. "'At Frankford We Stand!' The Mobilization of Euro-American Ethnic Consciousness in Philadelphia Neighborhoods and Changes in City Government, 1950-1995." Ph.D. dissertation. Washington, D.C.: Catholic University of America, 2003. Wagner-Pacifici, Robin. Discourse and Destruction: The City of Philadelphia Versus MOVE. Chicago: University of Chicago Press, 1994. Williams, Daniel R. Executing Justice: An Inside Account of the Case of Mumia Abu-Jamal. New York: St. Martin's Griffin, 2001.
Eric v. d. Luft
See also
African Americans; Brawley, Tawana; Crime; Environmental movement; Racial discrimination; Terrorism.
■ Mr. T Identification African American actor Born May 2, 1951; Chicago, Illinois
Mr. T was one of the few African Americans on television in the 1980's who exhibited self-reliance and no fear of white authority.
Mr. T was born Laurence Tureaud in 1951, and he spent the 1970's as a nightclub bouncer and celebrity bodyguard. He adopted a haircut patterned after African warriors and a multitude of gold chains around his neck and changed his name to Mr. T. Sylvester Stallone cast him in Rocky III, and his abrasive personality made a lasting impression on the audience. He then landed the role of Sergeant B. A. Baracus on The A-Team, a campy action television series that was panned by critics but acquired a devoted
cult following. His tough persona from Rocky III was toned down for the series, and he was frequently the butt of jokes because his awesome physical feats were not matched by his mental abilities. The lighthearted action of The A-Team—on which thousands of rounds of ammunition were fired, but no one ever got hit—enabled Mr. T to soften his persona, which evolved from frightening to gruff but kindhearted. Mr. T parlayed his celebrity status into a comic book, a television cartoon series, and a rap album. In these formats, he had an uplifting message for young fans. In the cartoon series, he was a gymnastics coach who also helped his young students solve crimes. The rap album urged respect for parents and other adults. The album, however, was designed to capitalize on Mr. T's celebrity, rather than any musical talent. It sank into obscurity in the United States but crested at number seventy-five on the British pop music charts. In 1985, Mr. T tried his hand at professional wrestling. His achievements included a tag team bout partnered with Hulk Hogan in the first cable edition of pay-per-view wrestling. After two years, he took on a more modest role, appearing occasionally as a special referee. He faded from the public eye, surfacing only in occasional television commercials, over the next twenty years.
Impact Mr. T became a recognizable icon of the 1980's, joining the panoply of celebrities intimately associated with the decade. He attempted, with some success, to use his visibility for socially conscious purposes, as in his children's cartoon show. Even his decision to wear many pounds of gold chains was meant to refer to the history of African American slavery and the chains of bondage. He remained best known, however, as B. A. Baracus, the menacing but lovable tough guy of The A-Team.
Further Reading
Oglesby, Bill, ed. Action TV: Tough-Guys, Smooth Operators and Foxy Chicks. New York: Routledge, 2001. Terrace, Vincent. Television Characters. Jefferson, N.C.: McFarland, 2006.
Michael Polley
See also
African Americans; Film in the United States; Hip-hop and rap; Television; World Wrestling Federation.
■ MTV Identification Cable television network Date Launched on August 1, 1981
MTV began as a cable television network entirely devoted to airing the new format of music videos, twenty-four hours a day, seven days a week. The channel was incredibly influential in 1980’s popular culture: Not only did it change the nature of music marketing and the course of musical history, but it also permanently altered the editing styles of narrative television and cinema. MTV (Music Television) began broadcasting on U.S. cable television networks on August 1, 1981. The channel’s purpose was to provide music videos twentyfour hours a day. Promotion spots during the early years of MTV featured an astronaut on the moon alongside a television, a flag, and the MTV logo. The graphic was accompanied by a simple but heavy guitar riff. The graphic could be taken to mean that the commencement of MTV was as groundbreaking as the placement of humans on the moon. The network’s slogan was “I Want My MTV.” It seemed a way of declaring that there was a demand for the product the network had to offer. The music it promoted was targeted at a young demographic whose collective taste was hard to categorize as being for any one style of music. The first video the network aired was “Video Killed the Radio Star,” a quirky single by the then-unknown band the Buggles that asserted that the rise of music videos would sound the death knell for radio-based musicians. Early Format and Audience
During MTV’s early years, the network modeled its programming schedule fairly closely on those of music radio stations. It aired several videos in a row, with brief breaks for music news on the half hour and longer breaks at the top of every hour. The videos were even introduced by video deejays, or veejays. In 1987, MTV began airing a week-end summary called The Week in Rock. By broadcasting music news every half hour, all day every day, MTV quickly began to supplant print media such as music magazines as young people’s primary source of information about the U.S. popular music scene. Music videos were typically three or four minutes long. They sometimes followed a narrative line provided by a song’s lyrics, and sometimes they simply featured a band or artist performing the song, albeit often in unusual, surreal, or constantly changing set-
tings. The videos tended to strive to be visually arresting, featuring bright colors or stark, expressive black-and-white photography. As the medium progressed, some videos were made with the same high production values as Hollywood films, featuring elaborate design, intricate plots, and exotic settings. Videos also quickly developed their own distinctive visual syntax. Most noticeable, they tended to be edited far more aggressively than were mainstream Hollywood movies and television programs. Shots were briefer and cuts were designed to be more obtrusive, again with the goal of seizing and holding the attention of young viewers. As young people did indeed begin watching MTV in significant numbers, moreover, the style of the network’s videos came to define their generation, which began to be discussed in terms of fast editing and short attention spans. MTV featured a wide array of music, including brand new and formerly underground artists, increasing its appeal to America’s teens. Indeed, as the 1980’s progressed, the youth appeal of MTV seemed almost inevitable, combining as it did a reputation for featuring (and creating) cutting-edge trends of the decade with a distinctive look that differentiated it from any television program that young people’s parents might watch (or approve of). MTV was seen as something belonging to everyone less than thirty years old. It came to serve a function for the youth culture of the 1980’s that radio had served for a similar demographic in the 1950’s. MTV allowed American youths to see what their favorite artists looked like and to follow fashion trends related to those artists. Even the commercials shown on MTV were geared toward a decidedly youthful market. Products advertised were very likely to fall within the categories of cutting-edge clothing, trendy automobiles, fast food, video games, and similar items. Moreover, they came to use the visual styles first developed by music videos, seeking to appeal to young viewers by speaking their visual language. Bands and Genres
In the early years of MTV, 19811984, bands from the United Kingdom, Ireland, and Australia were featured in heavy rotation, partly as a result of the fact that those bands were among the most eager to produce music videos and submit them to the network. English bands were heavily represented on MTV’s playlist, followed by Irish, Australian, and Scottish bands, respectively. As a result of the influx of these bands, the musical sensibility that
The Eighties in America
permeated the network was heavily influenced by their versions of New Wave music. Also known as Brit-Pop, or Synth-Pop, these bands typically featured a "pop" sound, complemented by heavy keyboards or synthesizers, and band members themselves were typically fond of fashion. As a result, American audience members concerned to identify with the latest fashions emulated the styles sported by British, Irish, and Australian musicians in their videos. MTV became instrumental in the rapid spread of certain clothing trends by virtue of its ability to expose millions of fans to those trends simultaneously through its videos. Because music videos were essentially marketing tools, clothing and hair styles became branding devices, and fans chose to adopt particular fashions alongside their choice of particular music styles and artists. Such associations between music and fashion had always existed within popular culture, but the mass broadcast of music videos in conjunction with an ever-growing list of musical subgenres—each with its own accompanying "look"—dramatically increased the conjunction between music, clothing, and identity in American culture and rendered that intersection significantly more important to North American youth. While New Wave music definitely reigned at MTV, another contemporary music genre carved a place for itself on the network—heavy metal. Typically, the heavy metal videos that were shown in heavy rotation on MTV featured a strand of metal that had grown away from that genre's roots. Heavy metal had been pioneered by such bands as Black Sabbath, Led Zeppelin, and Deep Purple, who played a dark, brooding, sometimes occult-based version of hard rock. The heavy metal that MTV seemed to champion, though, was the subgenre known (sometimes derisively) as glam metal. Much like their New Wave counterparts, glam metal artists aimed to be visually appealing. They were known for their long hair, and sometimes wore makeup. Leather jackets and boots and other accessories became standard attire for both band members and their fans. The music itself featured heavy guitar riffs, but in the context of a more pop-influenced sensibility than other types of heavy metal. It remained more likely to contain sexually offensive lyrics than was New Wave music, however. MTV's decision to market itself to youth made its broadcast of sexually suggestive videos and other questionable content controversial among some
parents. The network sought to allay criticism when possible. Sometimes, one version of a video would be played during the day, and another, racier version would air later at night. The network sometimes opted not to show a video at all if it was deemed too provocative.
The New Look of Music While New Wave and glam metal found great success on MTV, there was plenty of room in the network's incessant rotation for other forms of metal. No-frills bands such as Ozzy Osbourne (formerly of Black Sabbath), Judas Priest, and Iron Maiden began to enter the rotation with a heavy sound, gloomy theatrics, and a more frightening look and sound than that of glam metal bands. As the 1980's progressed, MTV became the venue to which bands would turn to get noticed. Videos became one of the primary public relations mechanisms of the music industry. Mainstream artists such as Madonna, the Cars, Michael Jackson, and Bon Jovi were all featured in heavy rotation in the days shortly after the inception of MTV. In 1983, MTV broke its own boundaries with the airing of Michael Jackson's "Thriller." This video of the title track from Jackson's multi-platinum album was a fourteen-minute horror movie that featured extensive makeup and costuming to turn dancers and singers into ghouls and zombies. It became one of the network's most famous videos. Still, MTV was often criticized for rotating few artists of color, aside from Jackson, a situation that the network would attempt to rectify later in the decade.
Charity and MTV In addition to work by solo artists and specific groups, MTV was instrumental in broadcasting special events produced by larger groups within the music industry. These happenings were charity events, in which a large group of performers would sing and record a song written by the musicians spearheading the effort, and proceeds from the sale of the recording would go to a specific cause, typically feeding starving children in Africa. The most notable of these events were Band Aid (1984), Live Aid (1985), and USA for Africa (1985). Band Aid was the result of combined efforts by British and Irish bands to help starving people in Ethiopia. It was spearheaded by Bob Geldof, of the band the Boomtown Rats. Live Aid was a series of fund-raising concerts, held throughout the United Kingdom and the East Coast of the United States, that featured artists from the United States and the United Kingdom.
USA for Africa was composed predominantly of American musical artists, including some legendary Motown artists. These pop music fund-raising events typically featured a single, the recording of which would be filmed and featured as a video to be included in the regular rotation of MTV.
Further Packaging of a New Product
To further appeal to its target audience, MTV began to offer programming that featured activities associated with that audience. Specifically, MTV’s Spring Break began airing in 1986, and starting in 1981, the network began hosting its own annual holiday party, MTV’s New Year’s Eve. As the tastes of its audience developed and changed, and the voices of critics grew increasingly louder, MTV began to offer lengthier shows devoted to a wider array of musical genres. In 1986, 120 Minutes debuted. The show featured two hours of underground, alternative rock and pop and brooding New Wave hybrid music. It was targeted at the college-rock market and other audiences who favored less commercial, mainstream offerings. Also in the middle of the 1980’s, Club MTV aired. The show, targeted at the dance music crowd, did not feature videos. Instead, it was filmed on location at the Palladium in New York City and featured dancers at the club in cutting-edge outfits demonstrating the latest dances. Veejay Downtown Julie Brown would occasionally talk to people in the crowd between spots of recorded music. In 1987, heavy metal fans were given their own show, Headbanger’s Ball. In addition to airing heavy metal videos, the show featured occasional guest appearances by heavy metal groups, who would interact with veejays in between the videos. Finally, by 1988, Yo! MTV Raps was offered as a way to showcase the talent of rap artists. It was hosted by Dr. Dre and Ed Lover, a rap-influenced comedic duo, and it featured the pair in the MTV studio, again hosting a series of videos and sometimes interviewing relevant guests.
Impact The importance of MTV can be found in several of its attributes. First, it both appealed to and helped create the on-demand aspect of the U.S. youth market during the 1980’s. Second, it illustrated the dynamics of popular music throughout the United States, Australia, and Europe, becoming one of the most significant chroniclers of musical trends during the decade. Third, it helped define a generation by offering to millions of young people
between the ages of twelve and twenty-four the opportunity to learn more about their favorite artists and to see and hear them more often than at any time previously. The relationship between artists' public personas, their private lives, and their fan base became both more complex and more intimate than ever before, and it shaped young people's understanding of and attitudes toward popular culture. The network also decisively altered the landscape of popular culture, both by disseminating fashion trends quickly and widely throughout the country and by popularizing editing styles that were soon incorporated by Hollywood's film and television studios. MTV viewers were exposed to bands from outside the market confines dictated by radio, and they were certainly exposed to a much wider variety of bands than were present on any one radio station. As a result, music fans of the 1980's became more familiar with the diversity of musical options available to them, and musicians who would otherwise have had little hope of significant exposure were able to find markets that could sustain them. Lastly, MTV provided enough cultural legitimacy and financial capital to the new format of the music video to allow it to develop into an art form in its own right. As a result, music itself took on a visual as well as an aural component, and the popular experience of music was altered.
Further Reading
Austen, Jake. TV-a-Go-Go: Rock on TV from American Bandstand to American Idol. Chicago: Chicago Review Press, 2005. Catalogs seemingly random events involving musicians and their appearances on television that defined eras and performers. Frith, Simon, Andrew Goodwin, and Lawrence Grossberg, eds. Sound and Vision: The Music Video Reader. New York: Routledge, 2000. Anthology of articles that explores the relationship between the 1980's, postmodernism, the medium of video, and music television. Weingarten, Marc. Station to Station: The Secret History of Rock 'n' Roll on Television. New York: Pocket Books, 2000. Provides a history of rock music in styles personified by artists ranging from Elvis Presley to Prince to Run-DMC, via the artists' relationship with television; includes those who helped create the MTV era.
Dodie Marie Miller
See also
Blondie; Bon Jovi; Cable television; Devo; Duran Duran; Fads; Farm Aid; Fashions and clothing; Generation X; Go-Go's, The; Heavy metal; Hip-hop and rap; Jackson, Michael; Lauper, Cyndi; Live Aid; Madonna; Mullet; Music; Music videos; New Wave music; Pop music; Television; USA for Africa.
■ Mullet Definition
Hairstyle
The mullet was a hairstyle popular among both men and women of the 1980’s in all sectors of the American population. It became a social identity symbol for several different musical movements and a widespread cultural phenomenon that illustrated a change in how Americans intended to define themselves for the world. Characterized by short hair on the top and sides and long hair in the back, the mullet was often referred to by the slang phrase “business in the front and party in the back.” The actual term “mullet” is of uncertain cultural origins, but it is widely believed that the hairstyle was inspired by David Bowie’s androgynous Ziggy Stardust look of the 1970’s. In the 1980’s, it became so widespread as to assume the status of a icon of the decade. Although inspired by glam rock musicians, the mullet became a defining style of more mainstream rock music once glam rock declined and heavy metal and pop music began to increase in popularity. This crossover was due in part to the establishment of MTV, which aired music videos featuring the hairstyle being worn by early 1980’s acts such as the Cars and later 1980’s acts such as Guns n’ Roses. Eventually, the hairstyle spread beyond the mainstream rock markets and became a distinguishing characteristic of marginal heavy metal and country music performers and their fans as well. During most of the 1980’s, the mullet was primarily associated with blue-collar America. However, the hairstyle was also popular among prominent celebrities of the time. Aside from music videos and rock music culture, the mullet was seen in other popular culture arenas: Mel Gibson wore a mullet in Lethal Weapon, tennis all-star Andre Agassi also wore this style on the court, and even singer Barry White wore it for a period of time. As mainstream America emulated this look, the mullet allowed a link to be forged between celebrities of all types—rock stars, pop stars, movie stars, and sports figures—and their fans.
Impact Early in the 1980’s, the mullet was a source of pride for glam rockers and heavy metal or rock fanatics. By the middle and end of the 1980’s, the mullet had become a symbol of social significance for groups that identified themselves as distinctively American—country music performers and fans and sports enthusiasts. Thus, the hairstyle became closely associated with automobiles, rock and roll, sports, and beer. Although the hairstyle was also popular across Europe and South America, by the middle to late 1980’s this cultural icon had become a source of specifically American national pride. Further Reading
Innes-Smith, James, and Henrietta Webb. Bad Hair. New York: Bloomsbury/Holtzbrinck, 2002. Karchmer, Noah D. Mullets and Mayhem: Coming of Age in the Late 1980’s. Bloomington, Ind.: AuthorHouse Books, 2001. Jennifer L. Amel See also Androgyny; Country music; Fads; Fashions and clothing; Guns n’ Roses; Heavy metal; MTV; Music; Music videos; Pop music.
■ Mulroney, Brian Identification
Leader of the Canadian Progressive Conservative Party and prime minister of Canada from 1984 to 1993 Born March 20, 1939; Baie-Comeau, Quebec, Canada As prime minister of Canada, Mulroney negotiated a free trade agreement with the United States in 1988. This agreement formed the basis of an agreement reached in the early 1990’s with the United States and Mexico. Brian Mulroney was elected leader of the Progressive Conservative Party in 1983, defeating former prime minister Joe Clark. In 1984, he became prime minister after his party won 211 of the 282 seats in the House of Commons. His party won another majority in the House in 1988, the first back-to-back victories for the party in thirty-five years. In 1985, Mulroney began free trade negotiations with the United States. The prime minister used his close relationship with U.S. president Ronald Reagan to reach an agreement in 1988. Under the terms of the agreement, all tariffs between the United
States and Canada would be eliminated by 1998. This agreement was followed by a Canadian-U.S. agreement on acid rain, which was finalized in 1991. Mulroney’s government also worked to reduce Canada’s national debt by cutting spending. However, many of his proposed cuts were blocked by the Senate, which was under the control of the Liberal Party. Mulroney succeeded, however, in privatizing some of Canada’s crown corporations, business enterprises owned by the national government. The government was able to sell twenty-three of its sixty-one corporations, including the national airline Air Canada. Mulroney proposed a national sales tax, the Goods and Services Tax (GST), in 1989. After bitter debate in the House of Commons and the Senate, the tax became law in 1991. The Mulroney government labored to resolve questions of national unity. In 1982, Quebec was the only province that did not ratify the new Canadian constitution. In response, Mulroney negotiated
the Meech Lake Accord in 1987. This agreement with provincial governments would have recognized Quebec’s demand to be declared a “distinct society” within Canada. It also sought to grant all the provinces additional powers. Mulroney’s efforts at greater unification were unsuccessful, however, as the Meech Lake Accord was not ratified. The provincial governments of Manitoba and Newfoundland refused to sign it before the ratification deadline in June, 1990. Impact Mulroney changed politics in Canada while strengthening trade relations with the United States. His government’s policies on free trade and the Goods and Services Tax were not reversed by subsequent governments, becoming enshrined in Canadian law. Some observers blame Mulroney’s less popular policies for shattering the Progressive Conservative Party: The party remained unable to reach a majority throughout the 1990’s.
Prime Minister Brian Mulroney (center) greets German Chancellor Helmut Kohl during arrival ceremonies for the Economic Summit in Toronto in 1988. (AP/Wide World Photos)
Further Reading
Kaplan, William. Presumed Guilty: Brian Mulroney, the Airbus Affair, and the Government of Canada. Toronto: McClelland & Stewart, 1998. Sawatsky, John. Mulroney: The Politics of Ambition. Toronto: Macfarlane Walter & Ross, 1991. John David Rausch, Jr. See also Business and the economy in Canada; Canada-United States Free Trade Agreement; Elections in Canada; Foreign policy of Canada; Income and wages in Canada; Meech Lake Accord; Reagan, Ronald; Turner, John; Unemployment in Canada.
■ Multiculturalism in education Definition
A movement that recognizes the diversity of cultures that make up the United States, and, as such, seeks to create an educational system that will educate all children equally
Education activists who were dissatisfied with the inequities of the educational system and with the inability of schools to produce graduates who understood and appreciated the variety of cultures that made up the United States developed a body of scholarship and pedagogy that focused on “educational equality.” Multiculturalism in education had its historical roots in the Civil Rights movement of the 1960’s and 1970’s. Prior to the 1980’s, the focus of such education might have been an effort to promote learning about one culture (for example, African American) or the introduction of learning materials related to a single unit of content (for example, women’s history). In the 1980’s, through the influence of a number of vocal and prolific teacher-scholars, the emphasis moved toward a more encompassing view of multicultural education. Multicultural teaching in the 1980’s usually involved instructional content that promoted appreciation of the diverse cultures that make up the United States. Important Scholarship James Banks, a pioneer of multicultural education, supported the view that the total school environment must change in order for multicultural education to work. Scholarship focused on oppressive practices in current education, including tracking, teaching strategies that were not
sensitive to students’ cultural backgrounds, standardized testing that might be culture-bound, and the classroom climate. Much of the theoretical writing on which multicultural education was based took a critical view of contemporary educational practice. Writers such as Joel Spring, Henry Giroux, and Peter McLaren developed major critiques of the educational system, which, they said, had kept the oppressed “in their place.” With this theoretical background as an underpinning, multicultural education became a field of study and practice that led educators to make changes that would be empowering to those who had previously been marginalized or oppressed by the educational system. Models for Delivery
Teachers and curriculum developers in the 1980’s developed various models and frameworks for the delivery of multicultural education in the schools. The following is a summary of these models, which were described in detail by Christine Sleeter and Carl Grant. One approach sought to raise the academic achievement of oppressed groups by attempting to make instruction culturally relevant—for example, teaching content that was not as tied to the needs or backgrounds of the dominant culture but that would be useful to students’ lives. A human relations focus taught all students about the commonalities of all people. A single group approach focused on the histories and contemporary issues of oppression of people of specific groups. A multicultural education approach reflected the pluralistic nature of society. Students were taught content using instructional methods that valued cultural knowledge and differences of numerous cultures and lifestyles. Finally, toward the end of the 1980’s, a social reconstructionist approach to multicultural education was developed that involved teaching students about institutionalized oppression and discrimination. Students learned about their roles as agents of social change so that they might improve the society in which they lived. It is important to note that these approaches varied widely, and thus “multiculturalism in education” in practice looked very different from one educational site to another.
Areas of Debate and Disagreement
In the 1980’s, educators were not in agreement regarding goals vis-à-vis multiculturalism. Some wished the United States to be a “melting pot,” in which all cultures were subsumed into one American society. Many felt that
this concept of assimilation was a major purpose of schools. Other educators supported a “salad bowl” approach—that is, keeping parts of each immigrant’s culture intact, so that individuals did not lose their cultural identities. Many multicultural educators critiqued both of these positions, wishing instead to have multiculturalism in education serve as a social movement focused on undoing the inequitable practices that had made those with certain cultural backgrounds dominant over others in American society. Partly because of these differences of opinion and definition within the educational community, critiques of multiculturalism in education would abound. On the level of the individual classroom teacher, the 1980’s found many teachers planning some activities focused on multiculturalism and diversity. Some teachers attempted to incorporate various cultures into the classroom (for example, international foods, songs, language learning) while others focused on the more systemic changes that had to take place if schools were to become truly multicultural. Impact Multicultural education was beginning to come of age in the 1980’s. Schools were realizing that the face of America was changing and that students needed to know how to live and work with people who did not look, think, or talk as they did. Efforts were made to develop educational methods and approaches that would empower individuals of many cultures. However, the work in the 1980’s was only the beginning. Much of the scholarly work on multiculturalism in education was published in subsequent decades. Further Reading
Banks, James A. Introduction to Multicultural Education. 4th ed. Boston: Allyn & Bacon, 2007. A classic text that offers a concise and clear definition of multicultural education and its goals. Bennett, Christine. Comprehensive Multicultural Education: Theory and Practice. 6th ed. Boston: Allyn & Bacon, 2006. Historical background, basic terminology, and concepts of multicultural education. Sleeter, Christine, and Carl A. Grant. Making Choices for Multicultural Education. New York: John Wiley & Sons, 2006. A very readable book that presents the various approaches to multicultural education that emerged in the 1980’s and have since been refined. Mary C. Ware
See also
African Americans; Education in Canada; Education in the United States; Latinos; Minorities in Canada; Native Americans; Racial discrimination.
■ Multiplex theaters Definition
Movie venues that show multiple features simultaneously on multiple screens
In the 1980’s, individual, large movie theaters began to divide themselves into several smaller viewing spaces, each with fewer seats and frequently with smaller screens. The move enabled each venue to show more than one film at a time, usually with multiple starting times, thereby maximizing consumer access and economic profitability. In the late nineteenth and early twentieth centuries, movies in the United States were often first shown in lavish vaudeville theaters. The first single-purpose movie theaters were more modest storefront venues called nickelodeons, but as the cinema began to court a wealthier audience, movie theaters began to be constructed to resemble the lavish theaters of vaudeville and the legitimate stage. These so-called picture palaces featured stages, lighting grids, orchestra pits, and elaborate lobbies. Their auditoriums could seat up to two thousand viewers at once, featured very large screens, and were usually located in single, stand-alone buildings. By the 1980’s, entrepreneurs had begun to divide such large, single auditoriums into sets of smaller theaters called multiplexes. They found it easier, for example, to fill four theaters showing four films to three hundred people each than it was to fill a single theater with twelve hundred fans of a single film. As this exhibition model became popular, multiplexes began to be constructed in the emerging shopping malls. The first such venues had anywhere from two to six smaller screens, each with less seating capacity than a standard theater. By relocating to the shopping malls, the multiplexes were able to capitalize upon malls’ high volumes of foot traffic, drawing audiences from passersby and shoppers, in addition to those who made the trip specifically to see a movie. Although smaller in size, they had the capacity to show several different films at the same time, or the same film at a variety of times, thus increasing public access, consumer choice, and industry profits. Multiplexes grew to include up to eighteen screens, gaining the capacity to handle thousands of filmgoers per day.
Impact Multiplex theaters resulted from market forces that transformed the traditional format for film presentation. Stanley Durwood first produced the modifications that would lead from the single-theater complex to the multiplex in 1963. He subdivided his Roxy Theatre in 1964 and found he could save money overall with one staff and a single lobby at the heart of a complex of multiple screens. By the end of the 1980’s, a further modified form resulted: the megaplex, a stand-alone structure with eighteen to twenty-four screens and dozens of starting times. The multiplex was notable also in that during the 1980’s ownership of theaters moved out of individual investors’ hands and into corporate control, as chains of movie houses continued to grow beyond the multiplex. By the end of the 1980’s, new media such as videotape, laserdiscs, cable television, and computers had led to more people than ever watching movies at home or at other venues outside the theater, prompting the expansion of multiplex theaters into megaplexes as the theater industry fought to compete in an ever-diversifying market.
New York mayor Ed Koch, right, tries to talk Eddie Murphy, center, and Joe Piscopo out of jumping off a ledge in a May, 1983, sketch from Saturday Night Live. Murphy’s performances in the sketch comedy show launched his career. (AP/Wide World Photos)
Further Reading
Beyard, Michael D. Developing Retail Entertainment Destinations. Washington, D.C.: Urban Land Institute, 2001. Klinger, Barbara. Beyond the Multiplex: Cinema, New Technologies, and the Home. Berkeley: University of California Press, 2006. Michael W. Simpson See also
Architecture; Cable television; Film in Canada; Film in the United States; Television; Video games and arcades.
■ Murphy, Eddie Identification
African American actor and comedian Born April 3, 1961; Brooklyn, New York One of the most successful comedic actors in television and film during the 1980’s, Murphy helped popularize the mismatched buddy film and revived the formula of a single performer playing multiple roles in the same movie. Eddie Murphy got his big break as a comedian in 1980, when he became a regular on the 1980-1981
season of the television show Saturday Night Live. Murphy developed impressions of celebrities such as Stevie Wonder and James Brown, as well as stock characters, including a human-sized Gumby, an adult version of child actor Buckwheat, and Mister Robinson—an urban, African American version of children’s television host Fred Rogers. Murphy became so popular that some considered him one of the most talented performers ever to be featured on the long-running show. In 1982, Murphy got his first opportunity to make the transition to film, starring with Nick Nolte in the box-office hit 48 Hrs. The film, which combined the thriller and buddy genres, demonstrated Murphy’s talent for incorporating comic lines and timing into an otherwise serious performance. Murphy left Saturday Night Live during its 1983-1984 season to concentrate on his film career. Between 1983 and 1989, he made nine films. Trading Places (1983), two Beverly Hills Cop films (1984 and 1987), and Coming to America (1988) were outstanding box-office hits. The latter featured Murphy in four separate roles. Those films not as well received included Best Defense (1984) and Harlem Nights (1989), which Murphy co-wrote, produced, directed, and starred in. Harlem Nights,
considered by some a vanity project, marked the beginning of a downward career slide lasting until the late 1990’s. Having a passable singing voice and performing often-uncredited backup vocals for others, Murphy tried to launch a singing career to complement his acting. He released two hit singles, “Party All the Time” (1985) and “Put Your Mouth on Me” (1989), but he was never taken seriously as a singer. Another career snag occurred in 1985, when Murphy’s former agent sued him for $30 million. Murphy’s worth at the time was reckoned to be around $50 million. Because he was in the middle of shooting a film, his studio, Paramount Pictures, settled the case out of court. Murphy also released a comedy album, Comedian (1983), which received a Grammy Award. He received Emmy nominations in 1983 and 1984 for his work on Saturday Night Live and Golden Globe nominations in 1983, 1984, and 1985 for his roles in 48 Hrs., Trading Places, and Beverly Hills Cop. Impact Eddie Murphy experienced a phenomenal rise to stardom at the young age of nineteen. After a relatively brief but acclaimed tenure on television, the success of his first motion pictures established him as one of the most talented entertainers of the decade and one of the industry’s few “marquee names,” or virtually guaranteed box-office draws. He created a bevy of characters uniquely and forever associated with him. Though often compared with African American comic Richard Pryor and considered the model for more recent comedians like Chris Rock, Eddie Murphy demonstrated a seemingly effortless ability to play multiple roles in one movie, and his goofy, infectious, disarming laugh made him a one-of-a-kind entertainer who helped define comedy and film of the 1980’s. Further Reading
Koenig, Teresa, and Rivian Bell. Eddie Murphy. Minneapolis: Lerner, 1985. Rottenberg, Josh, Vanessa Juarez, and Adam B. Vary. “How Eddie Got His Groove Back.” Entertainment Weekly, no. 917 (January 26, 2007): 30-37. Ruuth, Marianne. Eddie: Eddie Murphy from A to Z. Los Angeles: Holloway House, 1985. Wilborn, Deborah A. Eddie Murphy. New York: Chelsea House, 1999. Jane L. Ball
See also African Americans; Comedians; Film in the United States; Television.
■ Murray, Bill Identification American comedian and actor Born September 21, 1950; Wilmette, Illinois
Murray was one of a number of comedians—along with Eddie Murphy, Dan Aykroyd, and Chevy Chase—who brought the edgy comic sensibilities of NBC’s late-night variety show Saturday Night Live further into the mainstream in the 1980’s, through a number of highly successful film comedies. In 1980, Bill Murray finished his final season on Saturday Night Live, having become one of the show’s most popular cast members through his flip, ironic demeanor and send-ups of show-biz clichés, such as lounge singers and celebrity journalists. That year, he also starred in the unsuccessful Where the Buffalo Roam, in which he played gonzo journalist Hunter S. Thompson. However, in 1980 Murray also appeared in the hit comedy Caddyshack as a Vietnam-veteran groundskeeper battling a gopher on the loose. The film was a landmark in a hip yet sophomoric style of comedy that became particularly popular during the decade. Caddyshack was cowritten and directed by Harold Ramis, a screenwriter, actor, and director who, like Murray, was a veteran of the Chicago improvisational comedy troupe Second City. The film solidified Murray’s alliance with Ramis. Murray next starred in another hit, 1981’s Stripes (costarring and cowritten by Ramis), in which he played a ne’er-do-well who joins the U.S. Army. The following year, Murray showed his range by taking an unbilled supporting role in Tootsie, playing Dustin Hoffman’s roommate. In 1984, Murray again collaborated with Ramis, this time on a film Ramis wrote with Murray’s fellow Saturday Night Live alumnus Dan Aykroyd. Ghostbusters, a big-budget spoof, starred the trio as three shady scientists who start a business ridding New York City of ghosts. The film was a bona fide summer blockbuster, grossing almost $230 million and becoming a pop culture sensation as it yielded a hit theme song, countless toys, and other merchandising spin-offs. Murray won over audiences with his quick one-liners, and he held his own as a romantic lead opposite Sigourney Weaver. For his work in
Ghostbusters, Murray received a Golden Globe nomination for best actor in a musical or comedy. Murray used the clout he acquired from Ghostbusters to convince the film’s distributors, Columbia Pictures, to help fund a long-standing project of his, a film adaptation of W. Somerset Maugham’s serious novel, The Razor’s Edge. Murray starred in and cowrote the 1984 film, but it was neither a critical nor a commercial success. Murray spent much of the next few years out of the spotlight, only taking smaller roles in comedies. It was not until 1988 that he took another starring role, accepting the lead in the big-budget comedy Scrooged, a modern-day version of Charles Dickens’s A Christmas Carol. The film was only a moderate success. The next year, Murray reunited with Ramis, Aykroyd, and Weaver for Ghostbusters 2, which became another summer hit, albeit not as successful a hit as the original. Impact Through his successful films of the 1980’s, Murray showed that the methods of sketch and improvisational comedy could also be utilized in film. His sarcastic yet likable persona helped make him immensely popular with audiences. Further Reading
Elder, Sean. “Bill Murray.” Salon.com. February 6, 2001. Murray, Bill, and George Peper. Cinderella Story: My Life in Golf. New York: Broadway, 2000. White, Timothy. “Bill Murray’s Rumpled Anarchy.” In The Entertainers. New York: Billboard Books, 1998. Michael Pelusi See also
Comedians; Film in the United States; Ghostbusters; Murphy, Eddie; Television.
■ Music Definition
The many styles of popular music and its subgenres
The 1980’s witnessed changes in music that reflected changes in society as female artists and teen performers gained prominence and as music styles and fashions from around the world became increasingly popular. The music of that time gave rise to clothing and dance fads that became as much a part of the decade as the musical genres they represented.
By the early 1980’s, disco’s popularity was declining, and other genres of the 1970’s were also slowly fading. However, various hard rock genres, including heavy metal, remained a staple of most American radio stations. Also, rhythm and blues (R&B) took on new relevance as up-and-coming and established Motown artists released important albums. Music styles and artists that came to prominence during the decade seemed to offer something for everyone. Launched in 1981, MTV offered bands and artists a revolutionary way, the music video, to reach potential audiences throughout North America. A form of pop music known as New Wave was one of the first genres whose artists capitalized on the concept of music television. However, with its emphasis on the North American youth market, MTV neglected the over-thirty demographic. In 1985, MTV’s sister network, VH-1, was developed to showcase more mainstream videos aimed toward an adult-oriented audience. New Wave
The punk rock-inspired genre known as New Wave comprised different styles that would come to define the 1980’s and become a staple of MTV rotations. With roots in late-1970’s England, New Wave was often associated with synthesizers, fashionable clothing, and androgynous singers; it encompassed the subgenres of synthpop, new romantic, gothic rock, ska, post-punk, rockabilly, and power pop, but the lines between these sounds were not firmly drawn. As the name suggests, synthpop bands featured synthesizer keyboards, usually played in tandem with the bass guitar to create musical tension and build drama throughout songs. The Human League, Depeche Mode, and Soft Cell were popular examples of the genre. Synthpop and other New Wave bands enjoyed a great deal of Top 40 radio airplay because their songs were rhythmic, danceable, and, most important, unoffensive. The New Romantic subgenre included groups like Ultravox, Talk Talk, and Duran Duran. Though some of these bands employed synthesizers like the synthpop groups, their label grew from the way they addressed romantic issues and the human condition. Gothic rock artists were characterized by darker tones—both musically and in terms of subject matter. Siouxsie and the Banshees, the Cure, and Bauhaus exemplified the subgenre. Post-punk bands did not feature synthesizers or
keyboards of any kind. The subgenre essentially combined punk rock sentiments with pop-oriented vocals. The approach to instrumentation was often similar to punk, especially in terms of drumming. Joy Division (later to become synthpop band New Order), the Smiths, and Gang of Four are examples of this subgenre. Ska came from the Caribbean music style of the same name and was defined by a catchy, upbeat sound played on keyboards and a horn section, usually with at least one saxophone and trumpet. Popular examples of ska include Madness, the Specials, and English Beat. Rockabilly was probably the hardest for some to accept as New Wave because of its resemblance to American rock and roll of the 1950’s. The clothing styles of such bands reflected the homage to that decade. The Stray Cats were one of the most popular rockabilly bands, as their music was played regularly on MTV and Top 40 radio. A less mainstream version of rockabilly was played by the Cramps. Last, there was power pop, a subgenre that most resembled American pop music played fast, often featuring songs with quirky subject matter. The Vapors, known for the popular hit song “Turning Japanese,” exemplified this musical style. Heavy Metal
Characterized by an unadorned guitar sound and a heavy, simple drum beat, late-1960’s heavy metal bands as diverse as Steppenwolf, Led Zeppelin, Deep Purple, and Black Sabbath are credited with originating the genre. Black leather and denim characterized heavy metal fashion. With the advent of MTV, however, a new heavy metal aesthetic developed. Known (sometimes derisively) as hair metal or glam metal, but stylistically classified as pop metal, the 1980’s subgenre of heavy metal was typified by young men with long, styled hair and denim and leather outfits, but in a more colorful array than their predecessors. Pop metal was largely a product of bands who favored a big guitar sound, flashy clothes, and sometimes makeup. Bands such as (early) Bon Jovi, Dokken, and Guns n’ Roses offered audiences an abrasive sound tempered by melody, street-smart lyrics, and the beauty-conscious sensibilities of pop. These bands became increasingly popular among teenage girls. Despite the popularity of pop metal, traditional heavy metal had not been completely overshadowed by the genre. New and veteran heavy metal artists
earned the respect of audiences who were disenchanted by pop metal and who sought the heavier, more aggressive sound and lyrics of traditional heavy metal. Also, some heavy metal and hard rock bands that had cultivated audiences in the 1960’s and 1970’s continued to draw audiences during the 1980’s. Artists such as Iron Maiden, Ozzy Osbourne (formerly of Black Sabbath), Dio, and Alice Cooper offered audiences dark theatrics and lyrical themes that provoked members of conservative political and religious groups. In September, 1985, the Senate Committee on Commerce, Science, and Transportation held hearings on the lyrical content of heavy metal music. Members of the Parents’ Music Resource Center (PMRC) had petitioned the Senate for the hearings. Rock legend Frank Zappa and Dee Snider, lead singer of heavy metal band Twisted Sister, represented rock music. A censorship compromise was reached: Any popular-genre album that contained offensive material would bear a parental advisory sticker warning of the content. Hip-Hop
American rap and hip-hop music largely came to prominence in 1979 following the release of “Rapper’s Delight” by the Sugarhill Gang. Hip-hop music in the United States originated in the Bronx, New York, and was characterized by its rough-edged sound and lyrics that captured the realities of economic deprivation, racism, African American culture, and various nightlife scenes. Hip-hop deejays would control the music sample from a turntable as an emcee rapped lyrics over the deejay’s beats. Some rappers chose to simply rap over the percussive sound made when the deejay manipulated the vinyl record. Others, such as Afrika Bambaataa, sampled synthesizer-heavy European pop by bands like Germany’s Kraftwerk to provide a melodic counterpart to the bass portion of the music. The technique provided enough musical texture to make the composition interesting and danceable. As with any other music form, the lyrical content of rap varied as to the purpose of the artist. Grandmaster Flash’s 1982 hit “The Message” was immensely popular largely because of its timely observations about socioeconomic concerns. A wide range of rap music continued to be available to music fans throughout the 1980’s. As MTV had proven instrumental in the successes of new forms of pop music and heavy metal, it also played an important role in exposing listeners/viewers
outside the East Coast to the genre. In 1984, Run-D.M.C.’s “Rock Box” became the first rap video to air on MTV. Lesser-known acts also had videos in rotation at the network. In 1986, Run-D.M.C. teamed up with the hard rock band Aerosmith to record a cover version of the band’s hit single “Walk This Way” in the first rap-rock crossover. During the early 1980’s, LL Cool J became the first artist to sign to Def Jam Recordings. His 1985 debut album, Radio, eventually sold more than one million copies. The same year that Run-D.M.C. and Aerosmith fused two genres that seemed on the surface to be incompatible, the Beastie Boys, white rappers from the New York area, released their first album. Every song on the album was a fusion of rock and rap. That first album, Licensed to Ill, went platinum multiple times. By the late 1980’s, female rappers had come to the forefront of the genre. Salt-n-Pepa, Queen Latifah, and MC Lyte were all part of rap music’s early video age. Their videos were played on MTV (often in the rap music show Yo! MTV Raps) and in more regular rotation on Black Entertainment Television’s (BET) Rap City. Female rap artists also sold millions of albums but typically lacked the attention that male rappers received. Many rap groups tackled social and political concerns, and Public Enemy was no exception. The group crafted politically infused and often controversial rap lyrics, demonizing contemporary political figures as well as popular culture icons such as Elvis Presley and John Wayne. In 1988, Yo! MTV Raps began airing on MTV. During the mid-1980’s, a form of rap music evolved on the West Coast. Known as gangsta rap, it was a particularly aggressive subgenre that documented young, African American males’ inner-city lives and their often combative and antagonistic relationship with police authorities. One of the most notable groups was N.W.A., from Compton, California. Dance Music and Female Performers Some of the highest-grossing female performers of the 1980’s had been stars in previous decades. Diana Ross, formerly of the Supremes; Cher, of Sonny and Cher; Tina Turner, from Ike and Tina Turner; and Chaka Khan were among the notable female performers of the decade. Representing pop rock, hard rock, and heavy metal were artists such as the Go-Go’s, the Bangles, Vixen, Lita Ford, and Doro Pesch.
Still, there were new female performers who found a place for themselves within the ever-expanding categories of pop music in the 1980’s. Madonna, for instance, emerged with a style that combined overt sexuality with bass-heavy dance music; her singles placed on both R&B and pop charts. In the middle of the decade, Whitney Houston, a young African American, began her public career. Both Houston and Madonna frequently created videos that demonstrated cutting-edge clothing, hair, and dance styles. Further bridging the gap between pop and R&B were new female groups, including the multicultural Mary Jane Girls (formed by R&B innovator Rick James); Climax, an African American group; and Lisa Lisa and Cult Jam. The 1980’s was the decade in which break dancing—a form of dance that evolved with hip-hop—became popular. Parachute pants (loose-fitting nylon trousers) or track or sweat suits, along with athletic shoes and gold jewelry, made up typical break-dancing outfits. The fashion also became associated with rap culture itself. In terms of equipment, the boom box was an integral part of the decade’s dance and rap music scene. Portable, and capable of being quite loud, the boom box was ideal for impromptu dance sessions around urban neighborhoods. Also, the cassette tape was an important development, replacing the eight-track tape popular throughout the 1970’s. Teen Performers After the censorship controversy surrounding rap and heavy metal, teen pop performers, as solo artists or groups, offered an unblemished music genre to parents concerned about lyrical content and suggestive images. A notable group was New Edition, an African American, all-male group that included future R&B star Bobby Brown. The group’s success paved the way for future boy bands such as New Kids on the Block. Toward the end of the 1980’s, there was a tremendous interest in teenage female solo artists, in particular Debbie Gibson and Tiffany (full name Tiffany Renee Darwish). Both were roughly sixteen years old when their careers began. They were the quintessential inoffensive pop stars. Their clothing styles—always tasteful and not overtly sexual—captured the interest of young girls who wanted to emulate the teen singers, and their songs reached the pop charts in the late 1980’s.
Music Films Films that incorporated dance were common throughout the 1980’s. For break dancing, there were Breakin’ and Breakin’ 2: Electric Boogaloo, both released in 1984. Fame (1980) set the classic stylings of ballet and modern dance alongside pop music. Footloose (1984) used dancing as a means of civilized rebellion at a conservative high school. Flashdance (1983) depicted the incongruous worlds of a heroine who worked as a welder by day and an exotic dancer by night, both at odds with her dream of entering ballet school, and Dirty Dancing (1987) told the coming-of-age story of a girl who learns classic paired dancing as taught by her attractive teacher, played by Patrick Swayze. Impact The 1980’s saw the rise of new music genres, rap/hip-hop in particular, that became mainstream in later decades. Also, many artists who have since been acknowledged for their pioneering contributions to the music industry largely began their careers in the 1980’s. Music audiences and performers of the decade became increasingly aware of the genre-bending possibilities that were necessary to keep music a viable form of expression. Further Reading
Bannister, Matthew. White Boys, White Noise: Masculinities and 1980’s Indie Guitar Rock. Burlington, Vt.: Ashgate, 2006. Discusses the underground stylings of the Smiths, R.E.M., the Replacements, and others as presenting a masculinity different from that of mainstream rock music. McCoy, Judy. Rap Music in the 1980’s. Lanham, Md.: Scarecrow Press, 1992. Charts the evolution of rap and analyzes the culture, politics, and artists relevant to the genre. Discusses more than seventy essential rap albums. Weinstein, Deena. Heavy Metal and Its Culture. Cambridge, Mass.: Da Capo Press, 2000. Examines the genre from its beginnings through the 1990’s. Provides a discussion of venues, media, and performers particular to heavy metal. Dodie Marie Miller See also Androgyny; Blondie; Bon Jovi; Boom boxes; Boy George and Culture Club; Break dancing; Cable television; Cher; Compact discs (CDs); Dance, popular; Demographics of the United States; Devo; Duran Duran; Fads; Fashions and clothing; Generation X; Go-Go’s, The; Grant, Amy; Guns n’ Roses; Heavy metal; Hip-hop and rap; Houston,
Whitney; Jackson, Michael; Journey; Latinos; Lauper, Cyndi; Leg warmers; Live Aid; Madonna; Mellencamp, John Cougar; Michael, George; Mötley Crüe; MTV; Mullet; Music videos; New Wave music; Osbourne, Ozzy; Parental advisory stickers; Pop music; Preppies; Prince; Public Enemy; R.E.M.; Richie, Lionel; Run-D.M.C.; Springsteen, Bruce; Sting; Synthesizers; Teen films; Teen singers; USA for Africa; U2; Van Halen; Women in rock music.
■ Music videos Definition
Short films featuring a performance of an artist’s musical work or concept
The music video became a cultural phenomenon that changed and revitalized the music industry and spawned the creation of cable channels devoted to the music video. The music video can be traced back to the first musical movie, The Jazz Singer (1927), starring Al Jolson. Later musical stars, such as the Beatles and Elvis Presley, also appeared in musical films. Eventually, artists saw value in creating promotional clips for music labels to get an idea of their image and music. Cable companies in the 1980’s were expanding the number of channels they were offering and needed to find more content. The music industry initially had little to do with music videos, which was reflected by the primary investors in the launch of MTV (Music Television): American Express and Warner Bros. One of the most expensive videos of the time, “Ashes to Ashes” (1980), by David Bowie, showed the potential of the medium. These factors, in combination with the advent of affordable highquality video recorders, opened the way for a successful launch of an all-music video channel. Early Music Videos MTV first aired on August 1, 1981, with the video for “Video Killed the Radio Star,” by the Buggles. The clip was indicative of early music videos, which tended to rely heavily on humor and camp to promote a song. Many of the videos in the early 1980’s were imported from England and Australia, where there had been a faster move to singles-based music. Both countries had music video countdown shows, so their artists had an advantage over American artists. Duran Duran in particular was well known for its expensive and visually appealing videos. The band also made the first video to be
Time magazine devotes its December 26, 1983, cover to the coming of music videos. (Hulton Archive/Getty Images)
banned from airplay, “Girls on Film,” which was the beginning of a long string of controversy in the music video industry. Music videos quickly became hugely popular with the viewing public. There were two main types of videos: performance and concept. The performance video, in which the artist was filmed onstage or performing in short scenes on a set, dominated the market but was the less artistic of the two forms. The concept video was based on what a director and artist envisioned for a particular song. It would sometimes follow a plot but many times appeared to have little to do with the song to which it was attached, instead trying to create a mood or surreal feel. Many videos emulated Hollywood feature films, and certain directors were sought for their ability to re-create scenes. Videos commonly included themes such as the Cold War or AIDS, regardless of their format. Many musical artists were aware of the power of their image, and the music video quickly became another way to promote themselves. Artists such as Madonna
and Michael Jackson became masters of self-promotion in the medium. Musicians were propelled to stardom on the strength of their videos alone, and the concert tour began to have less importance to a musical career. Fashion fads went from music videos to the stores in short periods of time, and record sales soared for songs associated with popular videos. Initially, the format for televising music videos was based on radio stations’ formats. Videos were played in light, medium, and heavy rotation according to their popularity, and the veejays (video jockeys) grew to be as recognizable and popular as the musicians they showed. Like many radio stations of the time, video channels targeted their broadcasts toward a white audience. It was not until 1983, when Michael Jackson released a series of videos tied to his multi-platinum album Thriller, that African Americans saw significant amounts of airplay. Although African American artists made their mark with popular music, it was not until 1986, when Aerosmith and Run-D.M.C. released their crossover video for “Walk This Way,” that the musical genres of hip-hop and rap truly made their way into American television culture. Critics decried music videos because they often pushed mediocre songs to the top of the music charts. Many videos were controversial because of their violence, blatant sexual overtones, or exploitation of women. Most artists realized, however, that controversy sold albums, so little was done in the industry to police itself. The Music Video Matures
The mid-1980’s saw the rise of other channels devoted to music videos (namely MTV’s softer sister station, VH-1, created in 1985). Also, music video directing became more specialized during this time. Each musical genre had developed a signature style so that viewers knew instantly what they were watching. Jackson’s “Thriller” video, for example, was directed by John Landis, a feature film director. This classic music video, the most expensive at that time, opened the door for other feature-film directors to work in the field of music videos and lent some legitimacy to the form. Some music video directors became well known and were hired for their ability to create a particular type of video or atmosphere. Many critics view the team of Kevin Godley and Lol Creme as the premier directors of the 1980’s. Music videos became a valid art
form, and a number of directors used them to boost their careers in the quest to direct a feature or independent film. Music videos grew in cost and complexity and lost the campy quality of their early 1980’s form. Impact The music video reshaped the way artists promoted their music in the 1980’s. An artist no longer needed to take the traditional route of concert tours to achieve stardom but could excel on the strength of an impressive image rather than necessarily on quality music. The music video, because of its success in popular culture, helped to reanimate a lethargic music industry.
Austerlitz, Saul. Money for Nothing: A History of the Music Video from the Beatles to the White Stripes. New York: Continuum, 2007. An overview of the art,
history, and impact of the music video from its origins to the present day. Battino, David, and Kelli Richards, eds. The Art of Digital Music: Fifty-Six Visionary Artists and Insiders Reveal Their Creative Secrets. San Francisco: Backbeat Books, 2005. Interviews with industry insiders on the art and creation of music videos. Reiss, Steve, and Neil Feineman. Thirty Frames Per Second: The Visionary Art of the Music Video. New York: Harry N. Abrams, 2000. Focuses on the music video as an art form. James J. Heiney See also Cable television; Compact discs (CDs); Duran Duran; Heavy metal; Jackson, Michael; Lauper, Cyndi; Madonna; MTV; Music; New Wave music; Pop music; Run-D.M.C.; Television; Women in rock music.
N ■ Nation at Risk, A Identification
Government report critical of American education Date Released on April 26, 1983 The highly publicized release of A Nation at Risk set in motion a school reform movement during the 1980’s. The report continued to be a catalyst for educational change beyond the decade. During the 1980 presidential campaign, Ronald Reagan proposed the elimination of the Department of Education, established just a year earlier by President Jimmy Carter, as part of his promise to reduce the size of government. Terrel Bell, appointed by President Reagan in 1981 to be secretary of education, accepted the position with the goal of reexamining the appropriate federal role in education. He appointed a commission, the National Commission on Excellence in Education, which was given eighteen months and a broad mandate to examine, among other things, the quality of the nation’s schools and colleges. The commission was to make recommendations on how to improve the educational system of the United States, with special emphasis on better serving teenage students. The commission held dozens of meetings and hearings around the country, commissioned numerous reports, convened regular meetings of its members, and released its final report, A Nation at Risk: The Imperative for Educational Reform, at a White House ceremony that included President Reagan, Vice President George H. W. Bush, educational leaders from across the country, and a large group from the press. The report was written not in educational jargon but rather in an accessible language whose power Secretary Bell recognized immediately. It included an introduction that was reprinted in newspapers and magazines across the country, declaring in no uncertain terms that the U.S. educational system was in such disrepair that the future of the nation itself was in danger. If nothing changed, the United States
could soon be expected to lag behind other countries in commerce, industry, science, and technological innovations. One often-quoted phrase, “a rising tide of mediocrity,” pointed to the erosion of the educational foundations of the country. The publicity surrounding the report only increased once it was released to the public. More than 400,000 copies were distributed, and Time and Newsweek both devoted lengthy articles to education. Secretary Bell convened twelve regional conferences to disseminate the report throughout the country. President Reagan gave the keynote address at the final such event. Any discussion of abolishing the Department of Education soon ended. States across the country created their own education commissions, governors called for educational reform, and by the 1988 election, George H. W. Bush had declared his intention to be the “Education President.” Impact Not since the launching of Sputnik by the Soviet Union in 1957 had the topic of education figured so significantly in American life as it did in 1983. The national conversation continued for years, as the nation’s problems did not disappear, and each successive president sought to claim the mantle of the Education President. Little consensus was reached on the best manner in which to fix American education, but in the wake of A Nation at Risk, it was universally agreed that something had to be done. Further Reading
Hayes, William. Are We Still a Nation at Risk Two Decades Later? Lanham, Md.: Scarecrow Education, 2004. National Commission on Excellence in Education. A Nation at Risk: The Imperative for Educational Reform—A Report to the Nation and the Secretary of Education. Washington, D.C.: Author, 1983. Also at http://purl.access.gpo.gov/GPO/LPS3244. Spring, Joel H. American Education. Boston: McGraw-Hill, 2004. John Boyd
See also
Closing of the American Mind, The; Education in the United States; Mainstreaming in education; Multiculturalism in education; National Education Summit of 1989; Reagan, Ronald; School vouchers debate; Standards and accountability in education.
■ Nation of Yahweh Identification
African American religious organization
The Nation of Yahweh combined black supremacist mantras with Judeo-Christian beliefs. The group gained both popularity and scrutiny throughout the 1980’s, as it was praised for its community service efforts but criticized for its message of racial separatism. The organization’s reputation also suffered as a result of alleged connections to more than a dozen murders in the decade. The Nation of Yahweh was formed in the 1970’s and proclaimed its belief system to fall under the JudeoChristian umbrella, more specifically as a splinter of the Black Hebrew Israelites line of thought. The founder of the Nation of Yahweh was Hulon Mitchell, Jr., who came to call himself Yahweh ben Yahweh, which is Hebrew for “God son of God.” Prior to the group’s legal troubles in the late 1980’s, the most widespread criticism of the organization had to do with the accusation that Yahweh ben Yahweh’s teachings were racist and promoted violent separatism. Yahweh ben Yahweh emphasized black supremacy. His followers believed that God was black, and therefore blacks were the chosen people in the eyes of God. Yahweh ben Yahweh also taught his followers anti-white beliefs, as he considered white people to be oppressors. In addition to promoting racial distrust, he taught the importance of loyalty to himself as the son of God. Nation of Yahweh adherents came to believe so fervently in black empowerment, racial hatred, and the divine nature of their leader that they began to compete for Yahweh ben Yahweh’s favor by violently attacking their perceived enemies. Eventually, in order to become a trusted member of the Nation of Yahweh, initiates were required to murder white people. Yahweh ben Yahweh also began to order that group members who hesitated to follow his orders be beaten. Despite public suspicions that such horrific events occurred behind closed doors, the Nation of Yahweh was able to gain some fa-
vor in its Florida community by contributing large sums to Floridian businesses and charities. At the end of the decade, the organization remained controversial but no allegations of violence against it had yet been proven. This would change in the early 1990’s. Impact In the 1980’s, the Nation of Yahweh had a large following and owned a great deal of real estate in the Miami area, including the group’s headquarters, which it called the Temple of Love. However, murders in which group members were involved, along with the subsequent convictions for those crimes, all but extinguished the group. The extremity of the group’s message and its coverage in the press added to the national conversation about race in a decade when race relations and prejudice were often at the center of media representations and the cultural imagination. Further Reading
Boyle, James J. Killer Cults: Shocking True Stories of the Most Dangerous Cults in History. New York: St. Martin’s Paperbacks, 1995. Jenkins, Philip. Mystics and Messiahs: Cults and New Religions in American History. Oxford, England: Oxford University Press, 2001. Snow, Robert L. Deadly Cults: The Crimes of True Believers. Westport, Conn.: Praeger, 2003. Jennifer L. Titanski See also African Americans; Religion and spirituality in the United States.
■ National Anthem Act of 1980 Identification Canadian federal legislation Date Passed by Parliament on June 27, 1980
As a result of the National Anthem Act of 1980, “O Canada” became the official national anthem of Canada, one hundred years after it was first sung. Calixa Lavallée (1842-1891) was a choirmaster at St. James Church, in Quebec City, when he was invited by the Saint Jean-Baptiste Association to write the music for a French Canadian national anthem. Adolphe-Basile Routhier, a prominent Quebec City lawyer and judge, was invited to write the words for the anthem, in French. The resulting composition, “Chant National,” received the official approval of Lieutenant Governor Théodore Robitaille. It was
published as a national anthem in Quebec City by Arthur Lavigne in April, 1880. On June 24, the new anthem was both sung and played for the first time in public at a banquet held in Quebec City’s Pavillon des Patineurs (Skaters’ Pavilion), under the baton of Joseph Vezina. Among the more than five hundred guests in attendance was Queen Victoria’s son-in-law, Governor General the marquis of Lorne. Following that premiere performance, “Chant National” was performed frequently throughout French-speaking Canada. The first recorded occasion of it being played in English-speaking Canada was for the royal visit to Toronto, in 1901, of the duke of York and Cornwall (later King George V). Several literal English translations of the four French verses were made over the first few decades of the anthem’s existence, but they were considered dull and uninspiring. Finally, in 1908, Robert Stanley Weir wrote new English lyrics to Lavallée’s melody. Though not a literal translation of the French lyrics, Weir’s verses impressed and moved enough Canadians that they became the accepted English version of the anthem, whose name became “O Canada.” Throughout the twentieth century, “O Canada” was often played together with “God Save the Queen,” the recognized royal Canadian anthem. In 1927, an official version of “O Canada” was legally authorized for singing and performing in Canadian schools and at public functions. In 1942, an attempt was made to introduce a bill making “O Canada” the national anthem, but Prime Minister William Lyon Mackenzie King, feeling that the business of waging World War II took priority, refused to consider the bill’s passage in that year. More attempts to pass legislation failed, partly as a result of legal objections by the holders of the song’s copyright, which the government finally acquired in 1970. On February 28, 1972, the secretary of state of Canada, Gerard Pelletier, unsuccessfully presented a bill in the House of Commons proposing the adoption of “O Canada” as the national anthem. In the early 1980’s, Canadian nationalism was on the rise. In 1982, the Canada Act would patriate the country’s constitution, making Canada a fully auton-
omous nation for the first time. Moreover, 1980 was the centenary of the song’s composition, and members of the House of Commons promised that it would become the official national anthem during its centenary year. Thus, on June 18, 1980, when another secretary of state, Francis Fox, presented a bill similar to those that had been defeated earlier, the timing finally seemed right. The bill was unanimously accepted by the House of Commons and the Senate on June 27; royal assent was given the same day. On July 1, Governor General Edward Schreyer proclaimed the National Anthem Act of 1980. To commemorate the bill’s expected passage, as well as the anthem’s centenary, the Canadian government issued two special postage stamps on June 18, 1980. Impact When the 1980 National Anthem Bill was debated in the House of Commons, all three House Leaders agreed to facilitate the adoption of the bill by limiting the debate, during second reading, to one speaker for each party. They also agreed that no amendments could be proposed to the English version of the anthem. The sense of urgency that had developed around the bill stemmed from a collective unease about a lack of national unity in the wake of the referendum over Québécois sovereignty that had occurred in May of that year. The act was thus passed in part because the federal government felt it was necessary to shore up national symbols that could help bind the country together.
Hang, Xing, ed. Encyclopedia of National Anthems. Lanham, Md.: Scarecrow, 2003. Lavallée, Calixa. “O Canada”: A National Song for Every Canadian. Toronto: Whaley, Royce, 1930. “O Canada”: Our National Anthem. Markham, Ont.: North Winds Press, 2003. Powers, Eugenia. “‘O Canada’: Shan’t Be Chant.” Performing Arts and Entertainment in Canada 28, no. 2 (Summer, 1993). Martin J. Manning See also Canada and the British Commonwealth; Music; Trudeau, Pierre.
■ National Education Summit of 1989 The Event
President George H. W. Bush and the nation’s governors discuss problems in America’s schools Date September 27-28, 1989 Place Charlottesville, Virginia The 1989 National Education Summit focused national attention on U.S. education programs, resulting in bipartisan reform efforts that led to the development of national education goals. The ebb and flow of Americans’ interest in education reform has been a recurrent pattern in the history of American education since the early twentieth century, a focus that resurfaced in the century’s last decades. Although successfully extricated from Vietnam, Americans in the 1970’s faced other national
crises, including the Watergate scandal (1972), the Organization of Petroleum Exporting Countries (OPEC) oil embargo (1973), the Iranian hostage crisis (1979-1981), and more ambiguous problems such as rising inflation. In education, touted programs like Head Start had not achieved anticipated levels of success, and controversies over busing and desegregation efforts were debated before the Supreme Court. By 1983, concerns about America’s schools led to a national education study, A Nation at Risk: The Imperative for Educational Reform. This report, authored by the National Commission on Excellence in Education, presented a bleak outlook on the future of U.S. education, arguing that the nation’s “once unchallenged preeminence in commerce, industry, science, and technological innovation is being overtaken by competitors throughout the world.”
President George H. W. Bush attends the second working session of the National Education Summit of 1989. (NARA)
Perhaps because it so accurately reflected the nation’s perceptions about a “crisis” in education, A Nation at Risk became enormously influential and played an important role in elevating education concerns into the national spotlight. In September, 1989, President George H. W. Bush met with forty-nine of the nation’s governors in Charlottesville, Virginia, to discuss problems that existed in America’s school systems. (Bill Clinton, the Democratic governor of Arkansas, was one of the governors who participated in the summit.) The National Education Summit of 1989 resulted in the announcement of the following six national education goals (a figure that was eventually expanded to eight): First, the number of children served by preschool programs would increase annually; by 1995, all at-risk four-year-olds would be served. Second, all American students were to have basic skills commensurate with their grade level; by 1993, the gap in test scores between white and minority children would be reduced. Third, high school graduation rates would improve every year, and the number of illiterate Americans would decrease. Fourth, the performance of American students in mathematics, science, and foreign languages was to improve until it exceeded that of students from “other industrialized nations.” Fifth, college participation, particularly by minority students, would be increased by reducing the contemporary “imbalance” between grants and loans. Sixth, more new teachers would be recruited, particularly minority teachers, to ease “the impending teacher shortage”; other steps would be taken to upgrade the status of the profession. In addition to these six recommendations, the governors and President Bush agreed “to establish clear, national performance goals.” Impact The National Education Summit of 1989 represented a unique, bipartisan effort to reform American education. Furthermore, the summit attracted national attention to the problems that existed in U.S. schools. Few of the summit’s goals were met, but they remained widely accepted as crucial goals, and later governmental attempts to reform the U.S. educational system often made reference to the same goals agreed upon in 1989. Further Reading
Bush, George. “Address Before a Joint Session of the Congress on the State of the Union, January 31, 1990.” Weekly Compilation of Presidential Documents.
Washington, D.C.: National Archives and Records Administration, 1990. Vinovskis, Maris A. The Road to Charlottesville: The 1989 Education Summit. Michigan: National Education Goals Panel, 1999. Renée Love See also
Education in the United States; Magnet schools; Mainstreaming in education; Multiculturalism in education; Nation at Risk, A; School vouchers debate; Standards and accountability in education.
■ National Energy Program Identification Canadian government policy Date 1980-1984
In response to the energy shortages of the 1970’s, Prime Minister Pierre Trudeau instituted the National Energy Program, which aimed to promote energy self-sufficiency, increase Canadian ownership in the energy industry, encourage oil exploration and alternative energy, and increase government revenue. The western provinces and multinational energy companies bitterly opposed the program from its inception. Energy was a major issue in Canada’s 1980 national election, and the Liberal Party campaigned on greater control of Canada’s resources, revenues, and future. Rising oil prices had hit the urban, industrialized eastern provinces hardest, but the oil-producing western provinces were alarmed by federal promises to control the price of Canadian oil. When the National Energy Program (NEP) was announced in October, 1980, neither the provinces nor the energy companies had been consulted. The NEP, administered by the Department of Energy, Mines and Resources, addressed foreign control of oil companies with a proposal to boost Canadian ownership to 50 percent and to give tax and lease incentives to Canadian-owned companies. The government also claimed a right to 25 percent of oil or gas discovered on federal property. Most controversially, the NEP established price controls and a new tax on gas and petroleum that opponents decried as double taxation. Although the price controls and taxation were intended to encourage reinvestment in Canadian exploration, the oil-producing provinces faced lower revenues, and the energy companies feared loss of profits.
Oil-rich Alberta withheld 100,000 barrels per day from the market to protest price limits. Exploration and production in several sites were suspended until their profitability could be established. Both Alberta and British Columbia prepared to challenge the federal government in court. Many companies moved to the United States, where the Ronald Reagan administration offered a more lenient regulatory environment and higher profits. Impact Although the provisions of the NEP seemed reasonable and necessary to many, the western provinces blamed the program for the loss of thousands of jobs and billions of dollars. There was even the threat of a western separatist movement. Gradually, NEP provisions were rolled back. The unpopular program is widely believed to have contributed to Brian Mulroney’s Conservative Party victory in 1984. Further Reading
Doern, G. Bruce, and Robert Johnson, eds. Rules, Rules, Rules, Rules: Multi-Level Regulatory Governance. Toronto: University of Toronto Press, 2006. Studies the nature, causes, and dynamics of government regulation in and affecting Canada. Fossum, John Erik. Oil, the State, and Federalism: The Rise and Demise of Petro-Canada as a Statist Impulse. Toronto: University of Toronto Press, 1997. Explores reasons for federal intervention in the energy industry and analyzes its failure. Contrasts the Canadian policy conflict with those in other nations. McDougall, I. A. Marketing Canada’s Energy. Toronto: Canadian Institute for Economic Policy/Lorimer, 1983. Argues that the NEP was a positive response to the 1970’s energy crisis and that government’s responsibility was to promote energy independence. Jan Hall See also Chrétien, Jean; Elections in Canada; Middle East and North America; Mulroney, Brian; Trudeau, Pierre.
■ National Minimum Drinking Age Act of 1984 Identification
Legislation that pushed states to raise the legal buying age of alcohol to twenty-one Date Signed on July 17, 1984 By threatening to withhold 10 percent of highway funds from states that did not comply with the law, this legislation forced all states to adopt a drinking age of twenty-one. Mothers Against Drunk Driving (MADD) believed that raising the age for drinking alcohol to twenty-one would lower the number of drunk drivers and highway fatalities. MADD convinced the Ronald Reagan administration and Congress to act, and the National Minimum Drinking Age Act was the result. The federal government technically does not have the power to establish the drinking age, and, indeed, this law did not set one. Instead, the government effectively stated that any state with a drinking age lower than twenty-one would have 10 percent of its highway funds withheld. It should be noted that this age of twenty-one was for buying alcohol, not for consumption, and some states even today have allowances for some underage consumption of alcohol. There was a variety of opposition to the bill and its enforcement, but none was successful. A lawsuit, South Dakota v. Dole (1987), was filed by South Dakota against the federal Department of Transportation and Secretary of Transportation Elizabeth Dole, challenging the tying of highway funds to something relatively unrelated to highways (and within the states’ realm of powers). However, the Supreme Court held that because the act did not require the states to raise their drinking age, it did not intrude far enough into the realm of state powers to be unconstitutional; since the act, in the eyes of Congress, promoted the general welfare, the use of funds was deemed constitutional. The Court overlooked the fact that most states could not afford to go without federal highway funds. Impact While the federal government is not legally allowed to establish a drinking age, as that is a power of the states, this leveraged maneuver effectively resulted in a federal drinking age. Since 1984, no states have acted to lower their drinking age, although some have relaxed their enforcement policies.
Further Reading
Gardner, Martin R. Understanding Juvenile Law. Newark, N.J.: LexisNexis, 2003. Pegram, Thomas R. Battling Demon Rum: The Struggle for a Dry America, 1800-1933. Chicago: Ivan R. Dee, 1998. U.S. Congress. House. Committee on Public Works and Transportation. Subcommittee on Investigations and Oversight. National Minimum Drinking Age Law: Hearing Before the Subcommittee on Investigations and Oversight of the Committee on Public Works and Transportation, House of Representatives. 99th Congress, 2d session, September 18, 1986. Scott A. Merriman See also Mothers Against Drunk Driving (MADD); Rehnquist, William H.; Supreme Court decisions.
■ Native Americans Definition
Members of any of the aboriginal peoples of the United States
Despite continued protests instigated by the American Indian Movement and the flourishing of profitable casinos on reservations, Native Americans remained one of the poorest, most disadvantaged segments of American society. The 1970’s was a watershed decade for Native American protest in the United States. After its founding in 1968, the American Indian Movement (AIM) engaged in a number of significant protests. Beginning in 1969, various tribes occupied Alcatraz Island by “right of discovery.” A year later, a group of protesters set up camp on Mount Rushmore in the Black Hills (Paha Sapa to the Sioux), near the famous national monument. The camp dramatized the broken treaty of 1868, in which the Sioux had been granted the Black Hills in perpetuity. Since that time, gold had been mined from those hills by white people and white industries, and the very face of the mountains promised to the Sioux had been carved in the likeness of four American presidents. Finally, in November of 1972, a group of Native Americans marched on Washington, D.C., in what was called the Trail of Broken Treaties March. When neither the president nor the vice president would meet with Native American leaders, the group occupied the Bureau of Indian Affairs offices for five days.
Native American Rights in the 1980’s Though the protests of the 1980’s were not nearly as notorious as those of the 1970’s, the work of AIM continued. In 1981, Native American protesters established a camp near Rapid City, South Dakota, as a first step in reclaiming the sacred Black Hills. Their claim for the land on which they camped was based on three legal documents: the 1868 treaty, the American Indian Religious Freedom Act of 1978, and a federal law allowing the free use of wilderness sites for schools and churches. The protest highlighted again the bizarre history of the Black Hills claim. Abrogated in 1877, the treaty of 1868 was affirmed by the U.S. Court of Claims in 1977, resulting in the Court’s conclusion that the Sioux had never been compensated for the land. The United States government made various offers to what the court called the “Dakota Nation” to compensate them for the loss of the Black Hills. First, it offered to pay what the land was worth in 1877: $17.5 million. In 1979, the Court of Claims added accumulated interest to the offer, holding the United States government responsible for “the most ripe and rank case of dishonorable dealing in our history.” The new amount of $122 million was affirmed by the Supreme Court in 1980 and included a $10,595,493 payment to the lawyer who represented the Dakota Nation. Many of the “traditionals” among the Sioux did not want the money, however; they wanted the land sacred to their religious traditions. Regrettably, though the 1868 treaty had promised them that land in perpetuity, they were not given back their land. Other protests in the 1980’s and later centered upon the mascots of professional sports teams, particularly those of the Cleveland Indians and the Atlanta Braves, as well as the display in museums of the skeletal remains of Native Americans. Despite the protests, these teams continued to use Native American figures as mascots, often depicting them in cartoon form. However, a measure of respect was granted to Native American grave sites and the remains therein. In 1989, the Denver Museum of Natural History chose not to include skeletal remains in its “Rio Azul: Lost City of the Maya.” A short time later, the small museum located at the site of Etowah Indian Mounds in Cartersville, Georgia, closed off an open grave site that had been used as one of its displays. Probably the most significant event in the life of
many Native Americans during the 1980’s was the development and expansion of gambling casinos on tribal reservations. As Cherokee C. L. Henson points out, gaming has long been a part of Native American traditions. When tribes in Florida and California started high-stakes bingo games in the late 1970’s, the state governments tried to intervene to close the games. When the tribes sued the respective states in federal court, the courts ruled that if a state allowed gambling, then reservations therein were entitled to run gaming establishments without state intervention. Not popular with the states, this ruling was compromised by Congress in 1988, when it passed the Indian Gaming Regulatory Act (IGRA). Seeking to mediate between the Native Americans and the states, Congress instituted a system under which both the states and the tribes felt unfairly constrained. The IGRA created the National Indian Gaming Commission, whose approval was required for all management contracts. Furthermore, tribes were required to negotiate gaming compacts with their respective states if they intended to engage in “Class 3” gaming, which was defined as any type of gaming beyond bingo and the like. Since Class 3 gaming is the lifeblood of most casinos, the result of this legislation was that Indian casinos were regulated by a federal commission and were always dependent upon states to approve their contracts. To most Native Americans, the law seemed to infringe upon the legal jurisdiction of the reservation, as a parcel of land owned and governed by the tribe. Despite the legal issues that confronted Indian gaming, in time it flourished, growing from a hundred-million-dollar industry to a multibillion-dollar industry. Tribes benefited as a group from these businesses, because they were required by law to use the income generated by casinos for the general welfare of the tribe. Some tribes even distributed large per capita payments to their members. Impact Despite the proliferation of profitable gambling casinos on reservations, Native Americans, who make up less than 1 percent of the entire U.S. population, continued to rank among the most impoverished segments of American society. In the 1990 census, 30.9 percent of Native Americans lived in poverty, compared to 13.1 percent of the U.S. population as a whole. Similarly, unemployment on reservations averaged 45.9 percent, while it
averaged 6 percent in the country as a whole. Median family incomes for Native Americans in the 1990 census were $21,750 as compared with $35,225 for whites. Similar discrepancies existed among educational opportunities: Some 9.3 percent of Native Americans were college educated in 1990, whereas 20.3 percent of whites were. Meanwhile, between 1980 and 1993, rates of alcoholism continued to be high among Native Americans. Whereas white society had a rate of alcoholism of 6.9 people per 100,000, among Native Americans, the rate was 59 per 100,000. Thus, despite grassroots protests and despite the profits of Native American casinos, Native Americans remained one of the most deprived segments of American society. Further Reading
Henson, C. L. “Gaming in Indian Country.” American Studies Today Online. http://www.americansc.org.uk/Online/Gaming.htm. In a brief article, C. L. Henson, a Cherokee and the former Director of the Special Education Unit of the Bureau of Indian Affairs, decries the development and growth of gaming on Native American reservations. Mason, Dale W. Indian Gaming: Tribal Sovereignty and American Politics. Norman: University of Oklahoma Press, 2000. Mason examines the issue of casinos on reservations in the context of tribal sovereignty and American politics. Matthiessen, Peter. In the Spirit of Crazy Horse. New York: Penguin, 1993. Matthiessen’s book is so controversial in its accusations that it was banned for a while. It provides an account of the Federal Bureau of Investigation’s war on the American Indian Movement and the conviction and imprisonment of Leonard Peltier. Smith, Paul C. Like a Hurricane: The Indian Movement from Alcatraz to Wounded Knee. New York: Norton, 1996. A useful history of the American Indian Movement. H. William Rice See also
Aboriginal rights in Canada; Cher; Erdrich, Louise; Harp seal hunting; Income and wages in the United States; Indian Gaming Regulatory Act of 1988; Minorities in Canada; Multiculturalism in education; Racial discrimination.
■ Natural disasters Definition
Meteorologic and geologic events resulting in significant loss of life or property
Key natural disasters of the decade revealed the effects of technological advances in forecasting such events and also highlighted the extent to which the course of such disasters could be shaped by trends like population growth and economic development. In 1980, the modernization of the U.S. National Weather Service took a huge step forward when the Federal Aviation Administration (FAA), the Department of Defense, and the National Weather Service (NWS) launched a joint agreement to develop NEXRAD (Next Generation Weather Radar). The resulting Doppler radar’s high resolution greatly aided forecasters’ ability to predict troublesome weather and to issue watches and warnings in a more timely fashion. Although the new technology would not be fully
employed until the 1990’s, it played a significant role in the forecasting of the greatest natural disasters of the 1980’s. El Niño, Temperature Extremes, and Resulting Disasters The 1980’s saw a number of disastrous floods,
which many scientists linked to the presence of El Niño, a periodic warming of Pacific Ocean waters off the coast of South America that affects global weather conditions. In 1982 and 1983, a series of storms brought flooding to both the western United States and the states along the Gulf coast, resulting in approximately one hundred deaths and billions of dollars in damages. Excessive regional rains produced flooding in Mississippi in 1983 and in Virginia and West Virginia in 1985, resulting in approximately seventy deaths and billions of dollars in damages. From 1983 to 1986, flooding along the shoreline of the Great Salt Lake in Utah also caused billions of dollars in damages.
Sections of the Cypress Viaduct along Northern California’s Interstate 880 collapsed during the 1989 Loma Prieta earthquake, which measured 7.1 on the Richter scale. (USGS)
A major flood in St. Charles County, Missouri, which lies in a floodplain at the confluence of the Mississippi and Missouri Rivers, occurred in 1986. Heat waves linked to El Niño in 1980 and 1988 throughout major portions of the United States proved to be several of the deadliest such events on record. In 1980, a strong high-pressure ridge kept temperatures above ninety degrees for much of the summer, breaking temperature records in a number of cities. Resulting droughts and windstorms caused an estimated $20 billion in damages, largely in agricultural losses. Estimates placed the number of heat-related deaths between 1,250 and 10,000. The 1988 heat wave and drought caused an estimated $40 billion in damages and between 5,000 and 10,000 deaths. Smaller heat waves and droughts occurred in 1986 and 1989. At the other extreme, freezes in Florida during the winters of 1983 and 1985 resulted in billions of dollars in losses to the citrus industry. Extremely dry conditions and high winds also helped fuel an outbreak of forest fires in Yellowstone National Park in the summer of 1988. Thousands of fire fighters fought flames that had burned more than 1.5 million acres by the summer’s end. Hurricanes, Tornadoes, Earthquakes, Volcanoes
The decade saw a number of costly hurricanes. In 1982, Hurricane Iwa passed near Hawaii, and its excessive winds left one person dead and caused $250 million in damages. Hurricane Alicia struck Galveston, Texas, as a Category 3 storm on August 18, 1983, causing an estimated $2 billion in damages, twenty-one deaths, and numerous injuries. In 1985, Hurricane Elena stalled off Florida’s west coast over the Labor Day weekend as a Category 3 storm, resulting in the largest peacetime evacuation to that point in U.S. history. On September 2, Elena came ashore near Biloxi, Mississippi. Elena caused four deaths and an estimated $213 million in damages. Hurricane Juan struck Louisiana from October 26 through November 1, 1985. Although only a Category 1 storm, Juan caused severe flooding that resulted in sixty-three deaths and $1.5 billion in damages. The decade’s worst U.S. hurricane was 1989’s Hurricane Hugo, which first struck Puerto Rico and the U.S. Virgin Islands. It then struck the Carolinas on September 22 as a Category 4 storm with a twenty-foot storm surge and severe winds, resulting in $7 billion in damages and thirty-five dead. On March 28, 1984, tornadoes struck North and
South Carolina, causing fifty-seven deaths and over 1,000 injuries. On May 31, 1985, tornadoes struck Pennsylvania and Ohio, resulting in more than seventy deaths and an estimated $450 million in damages. In 1988, an F4 tornado struck North Carolina at night, leaving $77 million in damages, four dead, and 154 injured in its wake. A 1982 San Francisco landslide left twenty-five dead and $66 million in damages, but the most well known disaster in this category was the 1989 Loma Prieta earthquake that also struck the San Francisco area. The quake, which measured 7.1 on the Richter scale, left heavy damage and resulted in sixty-two deaths. One of the most spectacular and well known of the decade’s natural disasters was the 1980 eruption of the volcano Mount St. Helens in Washington State. The hot cloud of ash particles and a subsequent mudflow devastated forty square miles. There were fifty-seven deaths, as well as a rash of respiratory diseases in the surrounding area. Impact The modernization of weather forecasting, such as the development of NEXRAD, Doppler radar, and improved computer technology, resulted in more timely warnings of impending natural disasters and helped save lives. The new technology also led to the 1989 decision to restructure the NWS and eliminate a number of branch offices. It also allowed for the creation of mobile forecasting units that could be dispatched where needed. For example, mobile forecasting units called Air Transportable Mobile Units (ATMU) could be deployed to forest fires nationwide by 1987. Improved forecasts from the National Hurricane Center allowed people in threatened areas to decide whether or not to evacuate well in advance of approaching storms. Such evacuations sometimes revealed the remaining unpredictability of nature, however, as when numerous people evacuated parts of Texas in advance of 1988’s Hurricane Gilbert, only to see it shift direction and strike Mexico. Population growth and development in areas vulnerable to natural disasters such as floods, hurricanes, and earthquakes not only led to increases in property damages but also led to demands for stricter building codes. The Loma Prieta earthquake’s destruction of housing moved the cities of the Bay Area to act on the issue of buildings that were not reinforced or designed to withstand such quakes, and the 1986 Missouri flood led to questions over whether the federal government should
require stricter codes for mobile home construction. Increasing damage figures also triggered insurance-industry crises, especially in vulnerable areas. Local, state, and federal governments also sought to improve their responses to disasters’ aftermaths. President Jimmy Carter had created the Federal Emergency Management Agency (FEMA) in 1979 as a unified agency to coordinate disaster relief. The natural disasters of the 1980’s tested the new agency’s ability to meet its goals. Further Reading
Harris, Stephen L. Agents of Chaos: Earthquakes, Volcanoes, and Other Natural Disasters. Missoula, Mont.: Mountain Press, 1990. Explores natural geologic disasters, with a focus on the Western United States. Officer, Charles, and Jake Page. Tales of the Earth: Paroxysms and Perturbations of the Blue Planet. New York: Oxford University Press, 1993. Studies both natural and human-induced events, including the major events of the 1980’s. Steinberg, Theodore. Acts of God: The Unnatural History of Natural Disaster in America. New York: Oxford University Press, 2000. Uses case studies to examine American views of natural disasters and humans’ role in creating them. _______. Down to Earth: Nature’s Role in American History. New York: Oxford University Press, 2002. Environmental history that explores changing views of nature and how nature has shaped history. Wood, Robert A., ed. The Weather Almanac: A Reference Guide to Weather, Climate, and Related Issues in the United States and Its Key Cities. 7th ed. Detroit: Gale Research, 1996. This eight-hundred-plus-page book provides much data and information on the subject. Marcella Bush Trevino See also
Doppler radar; El Niño; Heat wave of 1980; Hurricane Hugo; Loma Prieta earthquake; Mount St. Helens eruption; Yellowstone National Park fires.
■ Navratilova, Martina Identification Professional tennis player Born October 18, 1956; Prague, Czechoslovakia
(now Czech Republic) Navratilova won more professional tennis grand slam championships in singles, doubles, and mixed doubles than any female or male player of the 1980’s. She won the Wimbledon singles title every year from 1982 through 1987. During the 1980’s, Martina Navratilova’s grand slam victories included fifteen singles titles. The grand slam tournaments consist of the Australian, French, and U.S. Opens and Wimbledon. Navratilova was second in singles ten times, won twenty-four doubles titles, and won four mixed doubles titles. In 1982, she became the first female athlete in any sport to win more than $1 million in prize money during a calendar year. The following year, her singles record in all her tournaments was 86-1. She had a total of only six losses in singles between 1982 and 1984—the year that United Press International named her Female Athlete of the Year. Navratilova was the number-one-ranked female tennis player in the world for a total of 150 weeks between 1982 and 1987. In 1985, 1986, and 1987, she was in the singles final in all eleven of the grand slam tournaments she entered. A very rare achievement occurred in 1987, when she won the singles, doubles, and mixed doubles, all the available events, at the U.S. Open. With her primary women’s doubles partner, Pam Shriver, she had a 109-match winning streak between 1983 and 1985. In 1984, the pair won the doubles in all four of the grand slam tournaments. Navratilova is also remembered for her rivalry, friendship, and great matches played against tennis great Chris Evert. Impact Navratilova had a significant impact on the sport of tennis through her unparalleled physical conditioning program, which included several coaches who trained her in nutrition, weight lifting, and tennis. She was among the first female players to practice primarily with men and was unique among her competitors in using the serve-and-volley aggressive attacking style of play. Away from the tennis court, Navratilova received much media attention regarding her public declaration of her lesbian identity. As a world-famous athlete, she brought attention to issues surrounding gay rights and women’s rights. Her honesty and
availability to the media made her a popular sports champion and role model for other lesbians. Further Reading
Blue, Adrianne. Martina: The Lives and Times of Martina Navratilova. Secaucus, N.J.: Carol, 1995. Navratilova, Martina. Being Myself. New York: HarperCollins, 1986. Alan Prescott Peterson See also
Homosexuality and gay rights; McEnroe, John; Sports; Tennis.
■ Naylor, Gloria Identification African American novelist Born January 25, 1950; New York, New York
An important African American woman novelist, Naylor published three books during the 1980’s that made a significant impact upon American literary culture. During the 1980’s, Gloria Naylor published three important and widely acclaimed novels: The Women of Brewster Place: A Novel in Seven Stories (1982), Linden Hills (1985), and Mama Day (1988). All three books focused on African American characters, telling distinctly American stories that touched on a wide swath of experience to treat poverty, racism, sexism, sexuality, spirituality, and community. The Women of Brewster Place received accolades, winning the 1983 American Book Award for best first novel. In 1989, the novel was adapted for the American Broadcasting Company (ABC) as a miniseries produced by Oprah Winfrey, who also acted in the production. Naylor’s books are at once accessible to a wide scope of readers and highly literary, making allusions to the works of Dante and William Shakespeare and to other classics by both white writers and writers of color. Among critics, Naylor is regarded for her deft depictions of African American women across socioeconomic classes and life experiences. Her work took on particular significance against the backdrop of a public debate of the 1980’s, during which some public figures viciously characterized African Americans as abusing the nation’s welfare system and leading the country’s crime rates. Naylor exploded these stereotypes—particularly that of the “black welfare mother.”
Gloria Naylor.
The Women of Brewster Place, set in an urban housing project in the Northeast, narrates the histories of seven female residents and their relationships. By chronicling their difficulties and suffering, Naylor revealed American society’s racism, sexism, and homophobia while telling horrific stories of violence and poverty. Poverty is extremely hard on children in Naylor’s novel: A rat bites one child’s face, a large family runs wild and eats from the neighborhood garbage can, and another child is electrocuted. The novel moves beyond its portraits of the seven women to tell the story of Brewster Place itself. It begins and ends by recounting the first thirty years of the community’s development from promising neighborhood into dead-end street project. A strong sense of place pervades all of Naylor’s work, and it was a major theme in many African American novels of the 1980’s, as well as films such as Do the Right Thing (1989), set in Brooklyn, and Coming to America (1988), set in Queens. In African American television shows such as 227 (1985-1990), set in Washington, D.C., and Frank’s Place (1987-1988), set in Louisiana, setting also played a prominent role in the narrative.
Naylor’s subsequent novels were increasingly ambitious. The narrative structure of Linden Hills parallels that of Dante’s Inferno, drawing an implicit comparison between African American middle-class culture with its buried ugliness and moral bankruptcy and Dante’s vision of Hell. Like Toni Morrison’s Song of Solomon (1977), the book offers an indictment of a black bourgeoisie that strives for wealth and social attainment. Mama Day portrays the matriarchal society of Willow Springs, a Gullah island owned by slaves since 1823, when a slave woman married her master, forced him to deed the land to his slaves, then killed him. At the center of this empowering, African American, matriarchal culture is Miranda “Mama” Day, firmly rooted in the irrational, feminine world as a healer, knowledgeable about medicines and potions. Naylor contrasts this world with New York, where Mama Day’s niece Cocoa has moved. Many critics have compared Mama Day to Shakespeare’s The Tempest and King Lear, though Naylor draws as much influence from earlier African American and women writers as from Shakespeare. In the 1980’s, more than in previous decades, writers incorporated influences from a wide range of literature, including earlier African American female voices such as Ann Petry and Zora Neale Hurston. Naylor took advantage of this broad literary heritage while reinventing the American novel to encompass intersections between race, culture, gender, sexuality, and class. Impact Gloria Naylor, alongside colleagues such as Toni Morrison, Jamaica Kincaid, Alice Walker, and Terry McMillan, represented a generation of African American women writers who not only portrayed the American experience but made it their own. Their explorations of the intersections between gender, race, class, and sexuality during the 1980’s constituted not only some of the best writing of the decade but also some of the most American writing as well, permanently changing the definition of American literature. Further Reading
Gates, Henry Louis, Jr., and K. A. Appiah, eds. Gloria Naylor: Critical Perspectives Past and Present. New York: Amistad, 1993. Montgomery, Maxine Lavon, ed. Conversations with Gloria Naylor. Jackson: University Press of Mississippi, 2004. Georgie L. Donovan
See also
African Americans; Beloved; Color Purple, The; Kincaid, Jamaica; Literature in the United States; Multiculturalism in education.
■ Neoexpressionism in painting Definition
Art movement
During the 1980’s, a diverse yet bold group of artists sought to move away from the remoteness associated with minimalism and conceptualism and to promote a more aggressive and emotive approach to painting. The neoexpressionist rebellion against the prevailing trends of contemporary art had its beginnings during the 1960’s, but it did not reach its peak until the late 1970’s and early 1980’s. The artists associated with this movement, such as the supremely self-confident American artist Julian Schnabel and the eclectic Italian artist Francesco Clemente, believed in expressing their inner angst through a more figurative painting style. Art was to be linked closely with the psyche of the individual artist. The neoexpressionist artists were greatly influenced by the works of the groundbreaking psychologist Carl Jung. Attempting to express through art the more primitive of impulses, the neoexpressionists looked to memory, to raw sexuality, and to emotional fervor for inspiration. A childlike enthusiasm for self-expression found its way back into art. Painting Is Not Dead The growing avant-garde art scene of the 1960’s and 1970’s had become an extremely reductive and impersonal enterprise. Emotion had become less common as a motivation for the artistic process. Prevailing artistic trends were cerebral and cared little for the act of painting. Art critics and theorists even went so far as to declare that painting was irrelevant and most likely dead. For many European and American artists, however, the thought that painting could be “dead” had the ring of absurdity. For this diverse group of artists, the often messy act of painting was a necessary ingredient in the creative process. It was their belief that painting could be made vital and relevant to the contemporary world. This new attempt to bring back painting was looked upon with disdain by several prominent art authorities of the time. While the artists of this revolutionary new movement did employ wild and seemingly inappropriate
color schemes, amateurish drawing techniques, and some bizarre and at times revolting images, they were, thankfully, breathing life back into the artistic process. Schnabel Becomes a Star
In addition to the American graffiti and neoexpressionist artist Jean-Michel Basquiat, Schnabel became a major force in the art world during the 1980’s. Sadly, Basquiat died in 1988 while still in his twenties. While this was a tragic blow to the art world, the neoexpressionist movement already had established itself as a powerful artistic force by then through the efforts of many other important artists, including the ambitious Schnabel. He had his first solo exhibition in New York City in 1979. As a result of a massive publicity campaign, his show was a spectacular success. The popularity of his large, fiercely energetic paintings led to some negative rumblings in the art community. Not one to be cautious, Schnabel boldly presented his new work with no apologies for its raw and contradictory nature. Always willing to take risks, he helped energize other artists to take risks as well. During the 1980’s, with the help of such prominent art dealers as Mary Boone and Leo Castelli, Schnabel’s reputation as an artist skyrocketed to almost rock-star status. He was more than willing to break established artistic norms. He was not above using such taboo items as velvet in his paintings. Viewing art as a form of “liberation,” Schnabel continued to produce provocative works and to inspire a new generation of artists. He later went on to become a noted director of such films as Basquiat (1996) and Before Night Falls (2000).
Impact Taking inspiration from earlier art movements such as German expressionism and abstract expressionism, the diverse neoexpressionist artists changed the artistic landscape during the 1980’s. Led by such American artists as Jean-Michel Basquiat, Philip Guston, Susan Rothenberg, David Salle, and Julian Schnabel; by such German artists as Georg Baselitz and Anselm Kiefer; by such British artists as Christopher Le Brun and Paula Rego; and by such Italian artists as Francesco Clemente and Sandro Chia, the neoexpressionist movement rose to dominate the art scene of the 1980’s because of its willingness to break with tradition and to employ more mainstream promotional methods. Art dealers and gallery owners marketed these rebellious artists with a tenacity unrivaled in the art community. While this commercialized approach to
salesmanship made many art experts uneasy, it definitely put a bright spotlight on these artists and made several of them almost superstars. Neoexpressionism expanded the creative and commercial options for artists of the 1980’s and beyond. Further Reading
Dempsey, Amy. Art in the Modern Era: Styles, Schools, and Movements. New York: Thames and Hudson, 2005. A comprehensive survey of modern art, including an entry on neoexpressionism. Fineberg, Jonathan David. Art Since 1940: Strategies of Being. New York: Harry N. Abrams, 1995. An important overview of contemporary art, including a fascinating section on Julian Schnabel and other American neoexpressionists. Kuspit, Donald. The New Subjectivism: Art of the 1980’s. Ann Arbor, Mich.: UMI Research Press, 1988. A detailed investigation of what made the art world unique during the 1980’s. Little, Stephen. . . . Isms: Understanding Art. New York: Universe, 2004. Includes a concise overview of neoexpressionism. Pearlman, Alison. Unpackaging Art of the 1980’s. Chicago: University of Chicago Press, 2003. Includes a riveting discussion of the art of the neoexpressionist artists Julian Schnabel and David Salle. Sandler, Irving. Art of the Post-modern Era: From the Late 1960’s to the Early 1990’s. New York: HarperCollins, 1996. Includes an insightful chapter on American neoexpressionism. Schnabel, Julian. Julian Schnabel. New York: Abrams, 2003. An important introduction to one of the most original American neoexpressionist artists. Includes a wonderful selection of illustrations of his work. Jeffry Jensen
Art movements; Basquiat, Jean-Michel; Schnabel, Julian.
■ Network anchors Definition
Television broadcasters of the national evening news
Tom Brokaw, Peter Jennings, and Dan Rather each spent more than two decades as the anchors of their network evening newscasts, and it was during the 1980’s that they enjoyed their largest audiences and influence.
Tom Brokaw began coanchoring the NBC Nightly News in 1981. Two years later, his coanchor moved on, leaving Brokaw as the sole host. His career path took him from local news in the Midwest and Southern California through to Washington and eventually to the National Broadcasting Company (NBC) News. Peter Jennings returned to the anchor desk on the same evening—September 5, 1983—that Brokaw began as the sole anchor. Jennings had anchored the evening news for the American Broadcasting Company (ABC) when he was in his twenties. He was considered a prodigy; his father was a legendary Canadian journalist. By his own admission, Jennings was ill-prepared at such a young age to take on the important role of anchoring, so he stepped aside and served as a foreign correspondent for ABC for many years. Dan Rather replaced a legend, Walter Cronkite, at the Columbia Broadcasting System (CBS), taking over the anchor chair on March 9, 1981. His career began in his native Texas, but he was quickly hired by CBS. At the network, he moved from reporter to cohost of 60 Minutes before succeeding Cronkite. “Corporatization” of News
Brokaw, Jennings, and Rather were giants, but not necessarily because of how long they each held the critically important role of news anchor. Their importance resulted from the recognition that, beginning in the 1980’s, network television news had begun to change, and they were the men who witnessed the transformation of the network news program. Capital Cities purchased ABC in 1985. One year later, General Electric took over NBC. CBS underwent a significant managerial change during that same year. In the new corporate environment, network news divisions were no longer allowed to be financial “loss leaders.” That is, they were no longer expected, or even allowed, to lose money on the theory that an excellent news department would build the overall reputation of its network, thereby contributing to the larger bottom line. In the past, the news departments were expected to spend whatever money it took in order to broadcast relevant news and information from anywhere in the world. Considering that Cold War hostilities between the Soviet Union and the United States continued through the 1980’s and that political and military unrest persisted in various world hot spots, as it had in prior decades, there was plenty of news to report. If the networks lost money in the process of informing
the public, that was acceptable, because it helped build the audience for their other programming. However, with the “corporatization” of news, news divisions were expected to make money themselves, ensuring that the types of stories these men—especially Jennings—considered important (namely, international news) would be swept aside as the 1980’s came to an end. Instead, lighter, more feature-based reports became more critical. They were cheaper to produce and, according to many network executives, of more interest to the public. During his time as a foreign correspondent, Jennings reported from (among other places) Rome and Beirut. ABC still had bureaus in those locations during the 1980’s, but less than two decades later they, along with five others, were shuttered. This trend would continue as the three networks closed more international news bureaus in later years. It was Jennings who, among his anchor colleagues, spent the most time overseas; however, the newscasts that Brokaw, Rather, and he anchored benefited from a strong stable of international news correspondents and stories. The audience numbers that Brokaw, Jennings, and Rather enjoyed during the 1980’s were in a steady decline in later decades. Nielsen Media Research reported that in the early 1980’s more than fifty million people per night watched a network newscast. By the early twenty-first century, that figure had dropped by almost half. Why such a precipitous decline? In the 1980’s, cable news was still in its infancy, the Internet had not been fully developed, and the notion of a family gathering at home for dinner was not as passé as it seems today. Impact Brokaw, Jennings, and Rather were influential because they became anchors during a critical transition in television, when networks adopted a corporate mentality. News from around the world was thought to be less relevant, and technology still had not swept aside conventional viewing habits. These men were not dinosaurs, but the relevance of their newscasts was about to slide. Further Reading
Alan, Jeff. Anchoring America: The Changing Face of Network News. New York: Bonus Books, 2003. Though light on historical research, this book does provide interesting snapshots of Brokaw, Jennings, Rather, and other former and current network news journalists and anchors.
Allen, Craig M. News Is People: The Rise of Local TV News and the Fall of News from New York. Ames: Iowa State Press, 2001. Allen takes a scholarly approach to help his readers understand why local television news has become more critical than national and world news to the general public. Brokaw, Tom. A Long Way from Home: Growing Up in the American Heartland. New York: Random House, 2002. Brokaw’s memoir of his youth in South Dakota and the values that shaped him. Darnton, Kate, Kayce Freed Jennings, and Lynn Sherr, eds. Peter Jennings: A Reporter’s Life. New York: PublicAffairs, 2007. The first significant work about Jennings following his death in 2005. Rather, Dan, and Peter Wyden. I Remember. Boston: Little, Brown, 1991. Offers interesting insights by Rather into his youth and early years in Texas. Anthony Moretti See also Brokaw, Tom; Cable television; Jennings, Peter; Journalism; Rather, Dan; Television.
■ New Coke Definition Coca-Cola’s new soft-drink formula Date Introduced on April 23, 1985
Coca-Cola underestimated Americans’ sentimental attachment to its soft drink and caused a public outcry by replacing its ninety-nine-year-old cola formula. The new product and its failure acquired iconic value, coming to symbolize all the decade’s major mistakes by corporate executives. In the early 1980’s, the Coca-Cola Company began experimenting with different sweeteners to produce a new, diet version of Coke. This research continued in 1983, as Coca-Cola sought to develop a sweeter cola to rival Pepsi and increase its market share among teenagers. In January, 1985, operating under extreme security, marketing executives began to develop an advertising campaign for the new, sweeter cola. Hastily, two research companies conducted market research regarding the new formula. Preliminary blind taste tests indicated that Americans preferred the new, sweeter cola. However, only 20 percent of the taste tests conducted used the final formula. Nonetheless, these results guided Coca-Cola management’s decision to market the new formula. The company also underestimated research that
indicated Americans held a sentimental and patriotic attachment to the Coca-Cola formula, instead believing that consumers would remain loyal to the brand name despite a change in the formula. Coca-Cola’s management thus determined that only one soft drink would be named Coca-Cola in the United States, to ensure that any gain in Coke consumers would be attributable to converted Pepsi drinkers, rather than to loyal Coca-Cola drinkers. The new formula became the only Coca-Cola, and the company no longer bottled the original formula, used in the United States for the previous ninety-nine years. On April 23, 1985, Coca-Cola president Roberto Goizueta and other top executives held a press conference at Lincoln Center in New York to introduce the new Coke. Surprised by the overwhelmingly negative response and hostility from reporters, Goizueta was ill-prepared to field questions from journalists in a professional manner. Reports of Coca-Cola’s new Coke and Goizueta’s loss of composure led news broadcasts around the country. Released to the public the next day, the new Coke was rejected by nostalgic Americans, who felt betrayed by the corporation’s decision to change the formula during a decade whose culture was centrally characterized by nostalgia. By June, the Coca-Cola customer service telephone number was receiving over eight thousand calls a day from angry consumers. Forced by customers to admit its mistake, on July 10, 1985, Coca-Cola reintroduced the original formula under the name Coca-Cola Classic. The new, sweeter formula remained on shelves as Coke II. The decision to sell both formulas led many analysts to speculate that Coca-Cola had planned the introduction of new Coke and the subsequent media and consumer outcry in order to garner free publicity. The company, however, disputed these charges, claiming they were neither “dumb enough” nor “smart enough” to plan such an advertising ploy. Impact The Coca-Cola Company miscalculated the emotional connection Americans had with its signature soft drink. Without conclusive market research, the hastily made decision to introduce a new formula created a hostile consumer backlash, forcing Coca-Cola to relent and to rebottle Coca-Cola Classic. The new Coke fiasco is now used in many advertising textbooks as an example of a failed marketing strategy.
Coca-Cola chief executive officer Roberto Goizueta, left, toasts the introduction of New Coke with chief operating officer Donald Keough in 1985. (AP/Wide World Photos)
Further Reading
Hays, Constance L. The Real Thing: Truth and Power at the Coca-Cola Company. New York: Random House, 2004.
Oliver, Thomas. The Real Coke, the Real Story. New York: Random House, 1986.
Pendergrast, Mark. For God, Country, and Coca-Cola: The Unauthorized History of the Great American Soft Drink and the Company That Makes It. New York: Macmillan, 1993.
Malana S. Salyer
See also Advertising; Business and the economy in the United States; Caffeine; Diets; Food trends.
■ New Mexico State Penitentiary Riot
The Event America’s second-deadliest prison riot
Date February 2-3, 1980
Place Penitentiary of New Mexico, Santa Fe
The brutal Penitentiary of New Mexico riot horrified America, resulting in significant nationwide penitentiary reform.
For years, the Penitentiary of New Mexico was plagued by problems including overcrowding, contraband, corruption, racial tensions, inadequate staffing, and antiquated security. While these problems caused the riot, however, the event’s bloody toll resulted from the snitch system. By utilizing inmate informants, or snitches, the penitentiary administration created a class of hated inmates who became the main target of the rioters’ rage. By early 1980, the stage was set for the riot. Because of cellblock renovation, the most violent inmates were placed in a less secure dormitory than was normal, called E-2. Moreover, work on security grilles hampered the guards’ ability to isolate cellblocks. Finally, the supposedly shatterproof window between the control center and the main inmate corridor proved vulnerable to attack. On February 1, inmates in E-2 began fermenting grain alcohol. Emboldened by drink, they decided to capture the facility. That night, there were fifteen mostly inexperienced guards to oversee 1,157 inmates. At 1:40 a.m., four guards entered E-2 to secure it for the night and were subdued. The inmates then captured the corridor guards, grabbed their keys, and opened more dormitories. Inmates streamed through the unlocked corridor grille to the control center, shattering its “shatterproof” window with a fire extinguisher. The control center guards fled, and rioters swarmed in. By 2:15 a.m., the penitentiary was under their control. The prison became a nightmarish bacchanalia, as inmates stormed the hospital for drugs and the shop for inhalants. Rioters tortured some of the guards, though others were protected from harm, and none were killed. In the administrative offices, the prisoners torched their files. The killing began at 3:00 a.m. with the beating of a hated snitch. At dawn, the rioters used acetylene torches to break into Cellblock 4, where many known snitches were housed, along with the infirm and weak. There, the riot produced its worst bloodshed. After finishing in Cellblock 4, the rampage, by then motivated by racism and revenge, consumed the rest of the facility.
Outside, state police and National Guard units, directed by Governor Bruce King and Deputy Secretary of Corrections Felix Rodriguez, surrounded the perimeter to prevent a massive escape of inmates. A standoff ensued until the late morning, when inmates agreed to release the captured guards. In return, Deputy Secretary Rodriguez promised to end the penitentiary’s harsh regimen and increase educational and psychological programs. Allowed a television interview, inmate spokesmen described the horrible conditions that had led to the riot, while Dormitory B burned in the background. The hostages were released, and at 1:30 p.m., special weapons and tactics (SWAT) teams retook the penitentiary.
Impact According to forensic investigators, thirty-three people died in the New Mexico State Penitentiary Riot, a death toll second only to the 1971 Attica prison riot. In July, federal judge Santiago Campos signed a decree ordering the Penitentiary of New Mexico to set population limits and provide humane discipline, medical care, recreation, mail rights, schooling, jobs, and prerelease programs to its inmates. The judge’s order resulted in one of the most comprehensive prison reforms in U.S. history. The reformed penitentiary came to serve as a model for other penitentiary systems.
Further Reading
Colvin, Mark. The Penitentiary in Crisis: From Accommodation to Riot in New Mexico. Albany: State University of New York Press, 1992.
Morris, Roger. The Devil’s Butcher Shop: The New Mexico Prison Uprising. New York: Franklin Watts, 1983.
John Nizalowski
See also Crime; Miami Riot of 1980; Racial discrimination.
■ New Wave music Definition
Genre of rock music
The Eighties in America
New Wave music
Selected 1980's New Wave Singles

Year  Song  Performer

1980  "Call Me"  Blondie
1980  "Brass in Pocket (I'm Special)"  The Pretenders
1980  "Whip It"  Devo
1980  "Cars"  Gary Numan
1980  "We Got the Beat"  The Go-Go's
1980  "Antmusic"  Adam Ant
1980  "Rapture," "The Tide Is High"  Blondie
1981  "De Do Do Do, De Da Da Da," "Don't Stand So Close to Me," "Every Little Thing She Does Is Magic"  The Police
1981  "Our Lips Are Sealed"  The Go-Go's
1981  "Goody Two Shoes"  Adam Ant
1981  "Shake It Up"  The Cars
1981  "Don't You Want Me"  The Human League
1982  "A Town Called Malice"  The Jam
1982  "Spirits in the Material World"  The Police
1982  "The Look of Love"  ABC
1982  "I Ran (So Far Away)"  A Flock of Seagulls
1982  "I Want Candy"  Bow Wow Wow
1982  "Vacation," "We Got the Beat" (album version)  The Go-Go's
1983  "Everyday I Write the Book"  Elvis Costello
1983  "Do You Really Want to Hurt Me," "Karma Chameleon," "I'll Tumble 4 Ya"  Culture Club
1983  "Hungry Like the Wolf," "Rio"  Duran Duran
1983  "Sweet Dreams (Are Made of This)"  Eurythmics
1983  "Back on the Chain Gang"  The Pretenders
1983  "Burning Down the House"  Talking Heads
1983  "Wishing (If I Had a Photograph of You)"  A Flock of Seagulls
1983  "She Blinded Me with Science"  Thomas Dolby
1984  "Drive"  The Cars
1984  "The Reflex"  Duran Duran
1984  "Here Comes the Rain Again"  Eurythmics
1984  "Two Tribes"  Frankie Goes to Hollywood
1985  "Tonight She Comes"  The Cars
1985  "Would I Lie to You"  Eurythmics
1985  "And She Was"  Talking Heads
1985  "Be Near Me"  ABC
1985  "Welcome to the Pleasuredome"  Frankie Goes to Hollywood
1986  "Notorious"  Duran Duran
1986  "Missionary Man"  Eurythmics
1986  "Don't Get Me Wrong"  The Pretenders
1986  "Manic Monday," "Walk Like an Egyptian"  The Bangles
1989  "Love Shack"  The B-52's
A Look as Well as a Sound With the advent of music videos and of cable television channel MTV in 1981, several New Wave bands capitalized on the new format to promote their music. They added a "look" to their videos that garnered widespread media attention. During the early 1980's, a subgenre of New Wave known as "New Romantic" became prominent. Such English bands as Duran Duran, Boy George and Culture Club, ABC, Frankie Goes to Hollywood, A Flock of Seagulls, the Thompson Twins, Spandau Ballet, Haircut 100, Adam Ant, and Bow Wow Wow produced striking videos, in which colorful and outrageous costuming played a major role in establishing the identity of the group. While the irreverence found in punk rock still was evident, New Wave bands were also interested in style, in creating something more akin to an art form. An electronic sound also was employed in many New Wave songs. The synthesizer and drum machine were just as essential as the guitar to an ever-increasing number of New Wave bands. Thomas Dolby's "She Blinded Me with Science" and Gary Numan's "Cars" are prime examples of the importance of electronic instrumentation to the movement. By the mid-1980's, though, the popularity of New Wave bands had waned.

Impact

New Wave music achieved mainstream popularity for only a short period of time; as a result, the genre became heavily associated with the 1980's, and it remained a staple of later, nostalgic representations of the decade. Several New Wave artists—such as Elvis Costello and Talking Heads leader David Byrne—did continue to be successful for decades afterward, but they tended to evolve musically rather than maintaining their original sound. Others enjoyed periods of resurgence when the New Wave sound itself became popular again. These included the Pet Shop Boys, the Cure, Depeche Mode, and New Order. Many of the later bands that were called "alternative" (a marketing label similar to "New Wave" in origin) took inspiration from the New Wave of the past.

Further Reading
Harrington, Joe S. Sonic Cool: The Life and Death of Rock ’n’ Roll. Milwaukee: Hal Leonard, 2002. Wonderful overview of rock history that includes a blistering chapter on the 1980’s. Heylin, Clinton. From the Velvets to the Voidoids: A Prepunk History for a Post-punk World. New York: Penguin Books, 1993. Chronological snapshot of punk rock and what happened after it in rock music.
Martin, Bill. Avant Rock: Experimental Music from the Beatles to Björk. Chicago: Open Court, 2002. Includes several insightful chapters on creative figures of the 1980’s who pushed the musical envelope. Masar, Brenden. The History of Punk Rock. Detroit: Lucent Books, 2006. Details how the downfall of punk rock led to emergence of such genres as New Wave and post-punk. Reynolds, Simon. Rip It Up and Start Again: Postpunk, 1978-1984. New York: Penguin Books, 2006. Music journalist Reynolds details the history of the bands that gained prominence after the demise of 1970’s punk rock. Some of the bands of this period that the author focuses on are the B-52’s, Joy Division, Gang of Four, and Devo. Skancke, Jennifer. The History of Indie Rock. Detroit: Lucent Books, 2007. Traces the roots of Indie rock back to 1970’s punk and New Wave. Jeffry Jensen See also
Blondie; Boy George and Culture Club; Devo; Go-Go’s, The; MTV; Music; Music videos; Pop music; R.E.M.; Sting; Talking Heads; Women in rock music.
■ Nicholson, Jack

Identification American actor
Born April 22, 1937; Neptune, New Jersey

During the 1980's, Nicholson consolidated his position as one of Hollywood's most talented actors and extended his range of roles.

Jack Nicholson's first film of the decade, The Shining (1980), featured him in what would become one of his iconic roles, that of mentally disturbed writer Jack Torrance. Based on the 1977 Stephen King novel and directed by Stanley Kubrick, the film followed the repressed writer's descent into madness and violence. Although now regarded as one of the best horror films ever made, The Shining was panned by most reviewers. Critics of Nicholson's over-the-top performance in The Shining faulted him for abandoning the realistic roles that had made him famous. His portrayal of American playwright Eugene O'Neill in Reds (1981), which was directed by his friend Warren Beatty, was greeted as a return to form, and Nicholson was nominated for an Academy Award for Best Actor in a Supporting Role. He also won praise for his understated role as a Texas border patrolman in Tony Richardson's The Border (1982).

Jack Nicholson gives a thumbs up sign as he arrives with co-star Anjelica Huston and director John Huston at the premiere of Prizzi's Honor in 1985. (AP/Wide World Photos)

Nicholson appeared next as an aging playboy in James Brooks's bittersweet Terms of Endearment (1983), based on a 1975 novel by Larry McMurtry. The film was one of the most critically and commercially successful of the decade and earned Nicholson an Academy Award for Best Actor in a Supporting Role. It was his second Oscar. Famed director John Huston's black comedy Prizzi's Honor (1985) starred Nicholson as a hit man smitten with another paid killer, played by Kathleen
Turner. Based on a 1982 novel by Richard Condon, the film brought Nicholson another nomination for an Academy Award for Best Actor. However, it was Huston's daughter (and Nicholson's companion) Anjelica Huston who took home the film's only Oscar—for Best Actress in a Supporting Role. Nicholson took two important roles in 1987, beginning with his portrayal of Daryl Van Horne in the satanic comedy The Witches of Eastwick (1987). Based on the 1984 novel by John Updike, the film also starred Cher, Susan Sarandon, and Michelle Pfeiffer. Nicholson played a radically different character—a guilt-ridden, alcoholic drifter—in the Depression-era Ironweed (1987), based on a 1983 novel by William
Kennedy. He and costar Meryl Streep were both nominated for Academy Awards. Nicholson's final role of the 1980's, the Joker in Tim Burton's Batman (1989), would turn out to be one of his favorites. Given free rein to return to the energetic, exuberant style he had displayed in The Shining, he was credited with carrying the film. At the same time, the actor's personal life was proving equally dramatic. The 1989 revelation that his lover Rebecca Broussard was expecting their child effectively ended his relationship with Anjelica Huston.

Impact

Jack Nicholson played a wide range of film roles during the 1980's, from the broodingly introspective to the manically extroverted. Reliably popular with audiences, he was also named Best Actor of the Decade in a survey of leading critics conducted by the magazine American Film.

Further Reading
McGilligan, Patrick. Jack’s Life: A Biography of Jack Nicholson. New York: W. W. Norton, 1994. Shepherd, Donald. Jack Nicholson: An Unauthorized Biography. New York: St. Martin’s Press, 1991. Grove Koger See also
Academy Awards; Action films; Cher; Film in the United States; Horror films; King, Stephen; Streep, Meryl; Terms of Endearment; Turner, Kathleen.
■ Night Stalker case

The Event Crimes, investigation, and sentencing of serial killer Richard Ramirez
Date First murder occurred on June 28, 1984; sentencing took place on November 7, 1989
Place Los Angeles and San Francisco, California

Ramirez's method, frequency, and manner of crimes made him one of the most shocking serial killers of the decade.

Richard Ramirez's crime spree began on June 28, 1984, with the murder of a seventy-nine-year-old woman in a suburb of Los Angeles. He stabbed her repeatedly and slashed her throat, almost decapitating her. Afterward, he sexually assaulted her body. The police found fingerprints, but a computerized system for matching fingerprints had not yet been developed. On separate occasions in February, 1985, Ramirez abducted two young girls, ages six
and nine, sexually assaulted them, and freed them. On March 17, he shot a young woman. She survived, but he killed her roommate. Later that day, he pulled another woman from her car and fatally shot her. The police matched the bullet casings from the two scenes. These attacks provided authorities with their first description of Ramirez as a man with long, curly hair, rotting teeth, and bulging eyes. Media coverage fueled panic in the community. A few days later, Ramirez abducted an eight-year-old girl and raped and killed her. He shot a sixty-four-year-old man and his wife on March 27, 1985. He stabbed the wife multiple times postmortem and carved out her eyes. The police matched the bullet casings again and suspected the work of a serial killer. Ramirez also left a footprint at the scene of the March 27 murders. He had been wearing a brand of shoe that was new to the United States, and only one pair had been sold in the area. The police created a composite sketch based on the store owner's description. Ramirez had developed a signature in his crimes: He quickly dispatched male victims so he could spend more time brutalizing women. Ramirez took a six-week break before fatally shooting a sixty-six-year-old man, who managed to phone the police before he died, thereby saving his wife. Two weeks later, Ramirez sodomized a woman while her twelve-year-old son was locked in a closet. At the end of May, 1985, Ramirez beat two elderly sisters so severely with a hammer that he cracked its handle; one survived. He drew a pentagram on one sister's inner thigh and one on the wall. He raped a six-year-old girl at the end of June and slit the throat of a woman the following night. July brought five murders, one attempted rape, one severe beating in which the victim survived, and two rapes, one of which was of an eight-year-old boy, whom Ramirez raped in front of his mother.

Ramirez Is Dubbed the "Night Stalker"
Ramirez started August, 1985, with a double shooting; both victims survived. Two nights later, he killed a thirty-five-year-old man and raped the man's wife. The media named him the "Night Stalker." Los Angeles County lived in fear of the Night Stalker, and the police were frantic. Ramirez's "cooling off" periods were shortening. There was little doubt he would strike again, but he eluded Los Angeles authorities by moving north to San Francisco. On August 18, Ramirez struck in the Bay Area,
shooting two victims in the head; the wife survived. He drew another pentagram at the scene and wrote lyrics from a heavy metal song. A bullet from the scene was matched to others in Los Angeles County. Fear spread through the San Francisco Bay Area. The mayor of San Francisco attempted to alleviate the panic by releasing confidential facts to the media. After seeing the resulting reports, Ramirez threw his sneakers and gun off the Golden Gate Bridge. Ramirez changed his hunting grounds on August 24 and broke into a couple's home. He shot the man, raped the woman, and bound her. She worked free, however, and saw him leaving in a station wagon. A teenager had earlier seen the station wagon cruising suspiciously and had taken down its license plate number. When the police found the car, a forensics team discovered a fingerprint and matched it to Ramirez with the help of a newly functioning computerized fingerprint system. The police released Ramirez's pictures to the media. He was foiled trying to steal a car when a citizen identified him. He tried to run, but a mob caught him. The police arrived in time to save his life. Ramirez never showed any remorse for his crimes and often turned and jeered at the victims in court. He flashed a pentagram that he had drawn on his palm and declared "Hail Satan" in the courtroom. Ramirez was convicted of sixty-seven felonies, including fourteen murders, and he received nineteen death sentences.

Impact
Richard Ramirez's random and brutal crimes of rape, murder, and pedophilia made him stand out even in a state that produces 10 percent of the world's serial killers. He killed more frequently than a typical serial killer and spread fear across much of California.
Further Reading
Carlo, Philip. The Night Stalker. New York: Pinnacle, 1996. Story following the serial killer convicted of fourteen murders in the Los Angeles area. Linedecker, Clifford L. Night Stalker. New York: St. Martin’s Press, 1991. Account of Ramirez’s twoyear rampage as a sadistic serial killer, his arrest, and the subsequent sensational trial. Newton, Michael, ed. The Encyclopedia of Serial Killers. 2d ed. New York: Facts On File, 2006. General reference to serial killers. James J. Heiney
See also America’s Most Wanted; Atlanta child murders; Central Park jogger case; Crime; Goldmark murders; Latinos; Post office shootings; Rape; San Ysidro McDonald’s massacre; Stockton massacre; Tylenol murders.
■ Nobel Prizes

Definition Prizes, established by the will of Alfred Nobel (1833-1896), awarded for achievements in various areas
The Nobel Prizes are generally recognized as the world's highest honor in each of the fields in which they are awarded. Nobel laureates gain international acclaim and a monetary award, as well as prestige for their home country. In the 1980's, North Americans dominated the scientific Nobel Prizes and received a few of the awards in literature and peace as well. Of the ninety-four prizes awarded during the decade, Americans won forty-six and Canadians won four.

Chemistry
Achievements in biochemistry predominated among the chemistry laureates of the decade. Walter Gilbert (with Briton Frederick Sanger) studied base sequences in nucleic acids, and Paul Berg was honored for work on recombinant deoxyribonucleic acid (DNA). Thomas Robert Cech and Sidney Altman discovered catalytic properties of ribonucleic acid (RNA), while Robert Bruce Merrifield developed new methods for preparing peptides. Fundamental studies of chemical reactions were the basis of awards to Roald Hoffmann and Japan's Kenichi Fukui; John Polanyi, Dudley Herschbach, and Yuan Tseh Lee; and Henry Taube. Taube specialized in electron-transfer reactions of metal complexes. Donald J. Cram and Charles J. Pedersen, working with France's Jean-Marie Lehn, designed and synthesized highly selective organic reactants. Jerome Karle and Herbert Hauptman developed improved methods for determining crystal structures by X-ray diffraction.
Economic Sciences
Americans won seven out of the ten Nobel Prizes in Economic Sciences awarded during the decade. Most of the recipients worked at the nation's major universities: the Massachusetts Institute of Technology, Yale, the University of California at Berkeley, the University of Chicago, and the University of Pennsylvania.
Canadian and American Nobel Prize Winners, 1980-1989

1980: Chemistry: Paul Berg, Walter Gilbert*; Economics: Lawrence R. Klein; Literature: Czesław Miłosz; Physics: James Watson Cronin, Val Logsdon Fitch; Physiology or Medicine: Baruj Benacerraf, George D. Snell*
1981: Chemistry: Roald Hoffmann*; Economics: James Tobin; Physics: Nicolaas Bloembergen, Arthur Leonard Schawlow*; Physiology or Medicine: Roger W. Sperry, David H. Hubel*
1982: Economics: George J. Stigler; Physics: Kenneth G. Wilson
1983: Chemistry: Henry Taube; Economics: Gerard Debreu; Physics: Subrahmanyan Chandrasekhar, William Alfred Fowler; Physiology or Medicine: Barbara McClintock
1984: Chemistry: Robert Bruce Merrifield
1985: Chemistry: Herbert A. Hauptman, Jerome Karle; Economics: Franco Modigliani; Peace: International Physicians for the Prevention of Nuclear War; Physiology or Medicine: Michael S. Brown, Joseph L. Goldstein
1986: Chemistry: Dudley Herschbach, Yuan Tseh Lee, John C. Polanyi; Economics: James M. Buchanan, Jr.; Peace: Elie Wiesel; Physiology or Medicine: Stanley Cohen, Rita Levi-Montalcini
1987: Chemistry: Donald J. Cram, Charles J. Pedersen*; Economics: Robert M. Solow; Literature: Joseph Brodsky
1988: Physics: Leon M. Lederman, Melvin Schwartz, Jack Steinberger; Physiology or Medicine: Gertrude B. Elion, George H. Hitchings*
1989: Chemistry: Sidney Altman, Thomas Robert Cech; Physics: Norman F. Ramsey, Hans G. Dehmelt*; Physiology or Medicine: J. Michael Bishop, Harold E. Varmus

*Prize shared with non-North American
James Buchanan worked at the Center for the Study of Public Choice in Fairfax, Virginia. The laureates' areas of achievement were creation and applications of econometric models (Lawrence R. Klein), analysis of financial markets (James Tobin), industrial structures and public regulation (George Stigler), reformulation of the general theory of equilibrium and developing new analytical methods (Gerard Debreu), saving and financial markets (Franco Modigliani), economic and political decision making (James Buchanan), and the theory of economic growth (Robert Solow).

Literature and Peace
The American literature and peace laureates of the 1980's were all of foreign origin. Poet, playwright, and essayist Joseph Brodsky wrote first in the Russian language and later in English. Czesław Miłosz wrote in Polish. His essay The Captive Mind (1953) concerned the effects of totalitarianism on independent thinkers. Both Brodsky and Miłosz were naturalized American citizens, as was Holocaust survivor and peace laureate Elie Wiesel. Similarly, the International Physicians for the Prevention of Nuclear War (IPPNW) was founded in Geneva in 1980, but it moved to Cambridge, Massachusetts, where it organized conferences and sponsored educational activities on the horrors of nuclear war.
Physics
Discoveries in particle physics led to a Nobel Prize for James Watson Cronin and Val Logsdon Fitch in 1980. The team had studied the violation of symmetry principles for K-mesons. Leon M. Lederman, Melvin Schwartz, and Jack Steinberger, also particle physicists, won the prize in 1988 for their work on the neutrino beam method and for discovering the muon neutrino. Kenneth G. Wilson was awarded the 1982 prize for formulating a complete theory of critical phenomena in phase transitions. Subrahmanyan Chandrasekhar became a laureate in 1983 for studies of stellar evolution and shared the honor with William Alfred Fowler, who studied the nuclear reactions leading to the formation of chemical elements in the universe. Nicolaas Bloembergen and Arthur L. Schawlow were prize recipients with Sweden's Kai Siegbahn in 1981 for their work on laser spectroscopy, while Norman F. Ramsey was honored in 1989 for his oscillating fields method and its use in the hydrogen maser. Hans G. Dehmelt shared the 1989 prize with Germany's Wolfgang Paul for developing the ion-trap method.
Physiology or Medicine

Baruj Benacerraf and George D. Snell were honored with France's Jean Dausset for their studies of the histocompatibility complex genes in 1980. Other prize-winning work on genes was done by Barbara McClintock, who won the 1983 award for studying mobile genetic elements called transposons, and by J. Michael Bishop and Harold E. Varmus, who won in 1989 for discovering oncogenes. Brain research and studies of visual information processing led to awards for Roger W. Sperry and David H. Hubel in 1981. Hubel shared his award with Sweden's Torsten Wiesel. Michael S. Brown and Joseph L. Goldstein elucidated the regulation of cholesterol metabolism and shared the 1985 prize. Stanley Cohen and Rita Levi-Montalcini discovered growth factors and shared the 1986 prize. The 1988 prize was shared by George H. Hitchings, Gertrude B. Elion, and Scotland's James W. Black, who developed new drugs for treating leukemia, herpes, gout, and other diseases.

Impact

The 1980's saw more women winning the scientific awards than in the past; overall, four women became Nobel laureates during the decade. The prizes also brought increased fame for some literature laureates: Brodsky was only forty-seven years old when he won the prize in 1987; he served as poet laureate of the United States in 1991-1992. Miłosz, little known in America in 1980, toured Poland to great acclaim after winning the prize.
Bishop, J. M. How to Win a Nobel Prize. Cambridge, Mass.: Harvard University Press, 2003. Discussion of the prizes, how they are awarded, and the experiences of the author in winning a share of the 1989 Nobel Prize in Physiology or Medicine. Chivian, E., ed. Last Aid: The Medical Dimensions of Nuclear War. San Francisco: W. H. Freeman, 1982. Essays largely taken from the first international conference organized by the IPPNW, the corporate winner of the Nobel Peace Prize in 1985, project the overwhelming stress placed on medical resources in the event of a nuclear war. Feldman, B. The Nobel Prize. New York: Arcade, 2000. Relates some of the controversies that have arisen regarding awards of Nobel Prizes. Laureates are listed by nationality. James, L. K. Nobel Laureates in Chemistry, 1901-1993. Washington, D.C.: American Chemical Society/ Chemical Heritage, 1993. Biographical material
and summaries of the research leading to the award are included. Woodward, Robert B., and Roald Hoffmann. The Conservation of Orbital Symmetry. Weinheim, Germany: Verlag Chemie, 1971. Symmetry principles are developed for predicting the outcome of chemical reactions. This book was a milestone in theoretical organic chemistry. John R. Phillips See also
Asian Americans; Astronomy; Cancer research; DNA fingerprinting; Genetics research; Poetry; Science and technology.
■ North, Oliver

Identification U.S. Marine assigned to the National Security Council
Born October 7, 1943; San Antonio, Texas

North was the key figure in the Iran-Contra affair. His boyish good looks and earnest demeanor propelled him to media stardom and earned him the admiration of many Americans. His covert activities led many to question the competency of the Reagan presidency.
In 1986, the American public was introduced to Lieutenant Colonel Oliver North, a handsome, blue-eyed Marine with a grand devotion to duty. North, a decorated Vietnam veteran, had been assigned to the National Security Council in 1981. He soon became the point man in a Ronald Reagan administration scheme to deceive Congress, ignore the law, arm dangerous men, and aid questionable allies.

In 1981, President Reagan revived America's commitment to fight communism in Latin America. Disturbed by the Marxist Sandinista leadership in Nicaragua, he ordered Director of Central Intelligence William J. Casey to organize anti-Sandinista guerrillas. These rebels—many of whom had been members of the repressive National Guard of overthrown dictator Anastasio Somoza Debayle—were known as the Contras and referred to by Reagan as "freedom fighters." The United States supplied the rebels with armaments and other forms of military assistance. In 1982, Massachusetts congressman Edward Boland introduced a resolution to limit aid to the Contras. The Boland Amendment capped the Contra fund at $24 million and forbade the use of American funds to topple the Nicaraguan government. Two years later, a second amendment (Boland II) forbade any economic support for the Contras, or any other military or paramilitary group or individual in Nicaragua. The Reagan administration, however, continued its policy, using extralegal means. Oliver North was directed to aid the Contras without congressional help or knowledge. Armed with the approval of National Security Adviser Robert McFarlane and a "can-do" spirit, North took up the cause with zeal. He organized private fund-raising campaigns and persuaded other nations to donate money to the Contras. After the United States began selling arms to Iran for diplomatic favors, North suggested that any extra profits be diverted to the Contras. The Iran-Contra affair was born. When the affair was uncovered, it caused a major scandal. President Reagan denied any knowledge of North's activities. In 1986, Reagan fired North.

Oliver North testifies before the joint House-Senate investigation of the Iran-Contra affair on July 7, 1987. (AP/Wide World Photos)
A year later, North testified before Congress at the televised Iran-Contra hearings. The handsome, young colonel soon became a media darling. Appearing in his medal-adorned uniform, North asserted that his actions had been sanctioned by his superiors. He was viewed as the perfect soldier—obedient, loyal, and unquestioning. North's telegenic personality won the admiration of the American people and inspired a hero-worshipping phenomenon known as "Ollie Mania." In the end, North became a symbol of misguided patriotism, rogue diplomacy, and blind devotion to duty.

Impact

Oliver North's activities led to questions about Reagan's integrity and competency: Did the president order and approve the Iran-Contra deal and later lie about his involvement, or was Reagan ignorant about the activities in his own administration? For his own part, North was prosecuted for his crimes in the late 1980's, but his conviction on three counts was eventually thrown out on the grounds that witnesses against him might have been tainted by his own immunized testimony before Congress.

Further Reading
Bradlee, Ben, Jr. Guts and Glory: The Rise and Fall of Oliver North. New York: Donald I. Fine, 1998. Walsh, Lawrence E. Firewall: The Iran-Contra Conspiracy and Cover-Up. New York: Norton, 1997. Rhonda L. Smith See also
Iran-Contra affair; Latin America; Reagan, Ronald; Reagan Doctrine; Scandals.
■ Nuclear Waste Policy Act of 1982

Identification America's first law governing nuclear waste policy
Date Signed on January 7, 1983

Landmark federal legislation was developed to search for a safe location to store high-level nuclear waste generated in the United States.

The Nuclear Waste Policy Act of 1982 (NWPA) was signed into law in January, 1983. It is occasionally referred to as the Nuclear Waste Policy Act of 1983, but 1982 is its official statutory year. Based on the idea that the United States is responsible for safely disposing of the nuclear waste that it creates, the NWPA was
designed "to provide for the development of repositories for the disposal of high-level radioactive waste and spent nuclear fuel, to establish a program of research, development, and demonstrations regarding the disposal of high-level radioactive waste and spent nuclear fuel, and for other purposes."

The Need for a National Policy High-level radioactive waste is uranium fuel that has been used in a nuclear power plant and is no longer an efficient source of energy. A nuclear power plant typically uses uranium 235, which is not particularly radioactive prior to its use. To create energy, an atom of uranium is split, creating a nuclear reaction called fission. After the split, two or three neutrons are released, along with some heat. These "free" neutrons bounce around and hit other atoms, causing those atoms to split and setting off a chain reaction. As the process continues, it produces a great deal of heat that is used to generate electricity. During fission, the uranium atoms become lighter elements as they lose neutrons. These new elements are radioactive isotopes called fission products. These isotopes are the source of almost all of the heat and radiation in high-level waste. Some of the remaining uranium atoms actually gain neutrons, which creates plutonium. Although plutonium is not as hot or radioactive as the lighter elements, it has a much longer decay rate, which makes it dangerous for a longer period of time. The radioactivity of these various elements will eventually decay, and while some of the lighter isotopes decay within hours, others can last for thousands of years. In fact, plutonium 239 has a half-life of twenty-four thousand years, which means that half of the radiation will have decayed in that time. Nuclear power plants sought ways to safely store the massive amounts of the long-lived material.
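The half-life figure quoted above can be made concrete with the standard exponential-decay law; the formula below is textbook physics rather than part of the original entry, and only the twenty-four-thousand-year half-life comes from the text:

\[
N(t) = N_0 \left(\frac{1}{2}\right)^{t/T_{1/2}}, \qquad T_{1/2} \approx 24{,}000 \text{ years for plutonium 239}
\]

After one half-life (24,000 years) half the original radioactivity remains; after two (48,000 years), one-quarter; and roughly ten half-lives, about 240,000 years, must pass before less than one-tenth of 1 percent remains, which is why repository designs must be evaluated on geologic time scales.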
Key Provisions Arguably, the most important provision of the NWPA called for a government-run storage facility that would be paid for by the businesses that created the high-level nuclear waste. In short, the government would construct a place to dispose of the waste as long as those who created the waste paid for the construction, upkeep, and daily operating costs of the facility. This was the primary goal of the NWPA. The law also created the Nuclear Waste Technical Review Board, designed to "evaluate the technical and scientific validity of activities" undertaken by the secretary of energy. Congress
included this board to perform oversight obligations because it felt that a board of experts would be well suited to carry out this task.

The Search for a Suitable Location
The NWPA required that the Department of Energy (DOE) research the use of deep geological disposal. DOE scientists considered a number of different environments, including salt, volcanic rock, and crystalline rock, that might be appropriate for storing the nuclear waste. In addition to the task of searching for adequate geological conditions, the DOE was required to consult with the states and Indian tribes that would be impacted by the construction of a disposal location. These consultations were to be overseen by the newly created position of the nuclear waste negotiator. By the end of 1983, the DOE had identified nine sites that contained adequate geological foundations for a storage location. In 1986, the department determined that five of these sites were suitable for detailed study and recommended three of these to President Ronald Reagan for his consideration. In 1987, Congress amended the NWPA and determined, after recommendations from the DOE, to focus on Yucca Mountain, located in south-central Nevada, as the nuclear waste repository.
Impact
The NWPA was the government's first serious attempt to address the growing nuclear waste deposits across the country and the need for safe disposal. Although its impact was anything but immediate, it set in motion a series of events that many believed would lead to the safest and most secure hazardous waste storage facility that scientists could envision.

Subsequent Events After years of study, the DOE concluded that the volcanic tuff located under Yucca Mountain ought to be sufficient to contain the nation's high-level nuclear waste. In 2002, President George W. Bush signed into law legislative approval for the DOE to develop a repository under Yucca Mountain. The DOE anticipated obtaining a license from the U.S. Nuclear Regulatory Commission by the summer of 2008 to begin storage.

Further Reading
Gerrard, Michael B. Whose Backyard, Whose Risk: Fear and Fairness in Toxic and Nuclear Waste Siting. Cambridge, Mass.: MIT Press, 1994. Outlines the complex legalities that encompass the disposal of toxic
and nuclear waste in the United States through a balanced political, economic, psychological, and scientific approach. Herzik, Eric B., and Alvin H. Mushkatel, eds. Problems and Prospects for Nuclear Waste Disposal Policy. Westport, Conn.: Greenwood Press, 1993. Provides a good analysis of the NWPA and its likelihood of success as well as alternative methods for storing nuclear waste. Sundqvist, Göran. The Bedrock of Opinion: Science, Technology, and Society in the Siting of High-Level Nuclear Waste. Dordrecht, Netherlands: Kluwer Academic, 2002. Examines how Sweden, a country long considered a forerunner in high-level nuclear waste disposal and management, finally decided to dispose of its nuclear waste. U.S. Department of Energy. Science, Society, and America’s Nuclear Waste. Washington, D.C.: Author, 1990. Designed by the DOE to educate students in grades 8-12 about the NWPA. James W. Stoutenborough See also Congress, U.S.; Environmental movement; Reagan, Ronald.
■ Nuclear winter scenario

Definition The hypothesis that a nuclear war would cause a severe decrease in global temperature and sunlight
Based on computer simulations, this hypothesis was debated by scientists and political figures throughout the decade. Most scientists agreed that nuclear war would have a negative climatic impact that went beyond the immediate damage caused by nuclear weapons.

Beginning in 1983, when the nuclear winter scenario was first posited by climatologists and astronomers Richard Turco, Owen B. Toon, Thomas P. Ackerman, James B. Pollack, and Carl Sagan (referred to collectively as TTAPS), the scientific community debated the climatic impact of nuclear war. TTAPS offered several scenarios based on the intensity of a nuclear conflict, ranging from a hardly noticeable climatic impact to one that lowered global temperature by 25 degrees Celsius, precipitating an ice age of nearly unimaginable magnitude. Even the scenario of a medium-scale nuclear war provided for a decrease in global temperature of 10 degrees
Celsius, hence the term "nuclear winter." Critics of the nuclear arms race used these scenarios to underscore the folly of the use of nuclear weapons.

Climatic Impact of Nuclear War Using computer simulations of the debris thrown into the atmosphere, soot from extensive fires, and wind patterns, TTAPS demonstrated, in an article published in the leading scientific journal Science, that various levels of nuclear exchanges between the United States and the Soviet Union would produce negative climatic impacts. The material thrown into the atmosphere would circle the globe, first in the Northern Hemisphere, then spreading to the Southern Hemisphere, particularly with high-level conflicts. This material would inhibit sunlight from reaching the earth, producing a cooling effect. TTAPS cited the impact of the explosion of the volcano Tambora in 1815 on the world climate the next year as empirical evidence for their hypothesis, indicating that a nuclear war could produce a much more severe and long-lived impact than a single volcanic eruption. With the expansion of nuclear arsenals, both superpowers, the United States and the Soviet Union, had moved some of their nuclear weapons capacity from targeting what were called counterforce targets (military sites) to targeting countervalue targets (industrial sites and cities that affected a country's ability to wage and survive nuclear war), so a nuclear conflict was likely to produce numerous fires and substantial debris.

TTAPS produced six scenarios depending on the intensity of the conflict and the targets of the weapons. Their first scenario, based on the use of few weapons on military targets, produced no climatic impact. Their second, called "marginal nuclear winter," decreased temperatures by a few degrees Celsius in the Northern Hemisphere, producing severe famines. The third, "nominal nuclear winter," came from a full-scale exchange of nuclear weapons, producing a drop in average land temperature of 10 degrees Celsius for several months, followed by a return of sunlight enhanced by depletion of the ozone layer. Worldwide, one to two billion people could be expected to die in addition to those already killed in the conflict and its direct aftermath. The "substantial nuclear winter" scenario would lead to the deaths of several billion people in both hemispheres. The "severe" scenario could produce temperature declines of 20 degrees Celsius and a decrease in sunlight so severe as to inhibit photosynthesis for several years. This
scenario could imperil everyone on earth. The sixth and "extreme" scenario would produce darkness at noon at about the level of a moonlit night and would be likely to extinguish most life on the planet. TTAPS indicated that scenario three was the most likely one to result from a nuclear conflict and continued to refine their models through the 1980's. Various other scientists around the world entered the nuclear winter debate, producing scenarios based on different manipulations of the data. One major challenge by Stephen H. Schneider and Starley L. Thompson labeled the impact of nuclear war as "nuclear autumn." Although indicating that nuclear war would be likely to produce a less severe impact than the TTAPS models, these authors still posited a noticeable climatic impact that might be quite severe in some places. Although members of the scientific community differed somewhat in terms of how severe a nuclear winter might be, they agreed that nuclear war would be likely to produce a negative climatic impact that went beyond the immediate impact of the blasts and radioactive fallout.

Political Implications of Nuclear Winter The impact of nuclear weapons through blast damage, fires, and radiation had already been shown to be quite severe. As Carl Sagan indicated, the additional impact on global climate made nuclear war nearly unthinkable. During the 1980's, the United States and the Soviet Union were experiencing the last stages of the Cold War, in which both sides engaged in a continually escalating arms race. Some of the policy debate surrounding the nuclear winter scenario centered on whether the possibility of nuclear winter made a nuclear exchange more or less likely. A few staunch cold warriors dismissed the possibility of nuclear winter, but most policy makers seem to have accepted its possibility. Several opponents of the nuclear arms race used the nuclear winter scenario as further evidence that at least some degree of nuclear disarmament was essential.

Impact

With the fall of the Soviet Union and a decline in Cold War tensions, nuclear war seemed less likely, and the nuclear winter scenario is now rarely mentioned. Nonetheless, the science behind the scenarios remains sound. An exchange of nuclear weapons, particularly if targets such as oil refineries were hit, could produce a negative impact on the climate.
Further Reading
Sagan, Carl. “Nuclear War and Climatic Catastrophe: Some Policy Implications.” Foreign Affairs 62 (1983): 257-292. Examines the political implications of nuclear winter. Sagan, Carl, and Richard Turco. A Path Where No Man Thought. New York: Random House, 1990. Sums up the debate concerning nuclear winter from the perspective of two of its proponents. Schneider, Stephen H., and Starley L. Thompson. “Simulating the Climatic Effects of Nuclear War.” Nature 333 (May, 1988): 221-227. Two critics of the nuclear winter scenario argue for nuclear autumn instead.
Turco, R. P., O. B. Toon, T. P. Ackerman, J. B. Pollack, and Carl Sagan. “Nuclear Winter: Global Consequences of Multiple Nuclear Explosions.” Science 222, no. 4630 (1983). The seminal TTAPS article that presented possible negative climatic effects of nuclear war. Turco, Richard P. Earth Under Siege. New York: Oxford University Press, 1997. Long-term evaluation of the nuclear winter scenario with comparisons to the oil fires after the first Gulf War (1991). John M. Theilmann See also Cold War; Reagan, Ronald; Soviet Union and North America; Strategic Defense Initiative (SDI).
O

■ Oates, Joyce Carol

Identification American writer and scholar
Born June 16, 1938; Lockport, New York
As a prolific and diverse writer of novels, short stories, book reviews, essays, poems, and plays, Oates became an established presence in the literary climate of the 1980's.

The 1980's marked an exploratory and experimental period in Joyce Carol Oates's career. Her high productivity as a writer and the various genres in which she wrote earned her a place in the canon of American literature and a distinguished teaching position at Princeton University. Oates's best-selling novel Bellefleur (1980) established her reputation as a gothic writer and impressed readers with its strange combination of realism and fantasy. Less successful was her play Daisy (pb. 1980), based on James Joyce's relationship with his schizophrenic daughter, Lucia. Though disappointed in the play's lack of critical success, Oates quickly recovered and returned to the novels that she preferred to write, such as Victorian-influenced romances and thrillers, including A Bloodsmoor Romance (1982) and Mysteries of Winterthurn (1984). At this time, Oates was also writing short stories that many critics considered superior to her novels, such as "Funland," "The Witness," and "Last Days," eventually collected in her short-story collection Last Days (1984).

In the late 1980's, Oates returned to her realistic style of writing in such novels as Marya: A Life (1986) and You Must Remember This (1987). Both Marya Knauer and Enid Stevick, the protagonists of the novels, experience brutality but manage to transcend it through achievement in academic or literary arenas, much like Oates herself. In doing research for You Must Remember This, whose main male character is a former boxer, she became fascinated with boxing; wrote an essay, "On Boxing," which eventually became a book by the same title (1987); and was recognized as an authority on the subject, meeting and befriending Norman Mailer, also a boxing
fan, and even Muhammad Ali, an experience so moving to her that she was reduced to tears. Oates also spent seven hours interviewing Mike Tyson for a Life magazine article. Despite Oates's newfound celebrity as a boxing expert, she was not lured away from writing fiction or from her continuing interest in poetry, which is known for bearing witness to women's experiences. Although critics and contemporaries have stated preferences for Oates's fiction over her poetry, she has continued to write poems, publishing the book Time Traveler (1989) during the decade; some of her earlier poems were collected in Invisible Woman: New and Selected Poems, 1970-1982 (1982).
Joyce Carol Oates. (© Norman Seeff)
Impact

In the 1980's, Oates built her reputation as one of America's most prolific and influential writers and scholars. By experimenting with different styles and exploring a wide range of subjects in her writing, she proved herself to be a highly versatile author. Although her productivity subjected her to more negative criticism than a writer with a more conservative output would have received, the sheer diversity and range of her work, as well as its consistency, have established her as a major American writer.

Further Reading
Johnson, Greg. Invisible Writer: A Biography of Joyce Carol Oates. New York: Plume Books, 1998. Oates, Joyce Carol. The Faith of a Writer: Life, Craft, Art. New York: HarperCollins, 2003. Holly L. Norton See also Boxing; Feminism; Literature in the United States; Poetry.
■ Ocean Ranger oil rig disaster

The Event The largest floating oil rig at the time capsizes and sinks during a storm
Date February 15, 1982
Place Grand Banks area, about 290 kilometers (180 miles) east of St. John's, Newfoundland, Canada

The disaster demonstrated the vulnerability of technology to cascading problems as well as the inadequate emergency preparedness of the crew.

In the early hours of February 15, 1982, the offshore floating oil drilling rig Ocean Ranger sank in Canadian waters, killing all eighty-four crew members onboard. Owned and operated by the Ocean Drilling and Exploration Company (ODECO), the Ocean Ranger had been regarded as indestructible. Because there were no survivors to provide eyewitness accounts, careful detective work was required to reconstruct the sequence of events that had led to the disaster. At first, the oil company thought that the Ocean Ranger had fallen victim to a "gas-loaded ocean," a blowout caused by the seafloor suddenly erupting in an uncontrollable geyser of gas that would fill the water with so much foam that the rig could no longer maintain buoyancy. As it happened, nothing nearly so exotic had occurred. Instead, a storm-
The Eighties in America
driven wave had broken a glass window, setting off a cascade of failures that ended in the oil rig losing control and tipping over. Although some failures were mechanical, several appear to have been the result of the crew’s losing their emotional equilibrium and becoming unable to respond adequately as the situation spiraled out of control. The broken glass permitted seawater to enter an electronic control room in one of the rig’s eight massive legs, causing short circuits in vital equipment. As a result, the rig could no longer maintain the water level in its ballast tanks for trim control (stability and balance) as heavy equipment was moved to and fro on the deck. At this point, human error entered the equation. Delays in shutting off power to the controls allowed ballast water to continue flowing. Furthermore, once the controls were turned off, the crew no longer had access to the information they provided about the status of the trim control system. In response, the crew tried to repower the inadequately dried control boards, leading to more short circuits and incorrect reballasting that exacerbated the rig’s growing tilt until a wave could pour water into storage compartments built inside one of the legs, overbalancing the rig. Finally, the order was given to abandon the rig. However, the lifeboats that had worked so well during drills on calm days became deathtraps for men trying to escape the listing rig in the violent seas. The boats smashed against the rig’s steel legs or capsized in the rough water, dooming the crew. Impact The loss of the Ocean Ranger led to a reassessment of disaster preparedness programs, which had performed drills under only ideal conditions. In addition, many safety engineers studied the sequence of failures in search of ways to break the chain in future disasters. Further Reading
Chiles, James R. Inviting Disaster: Lessons from the Edge of Technology. New York: HarperCollins, 2001. Places the Ocean Ranger disaster in the larger context of industrial accidents. U.S. National Transportation Safety Board. Capsizing and Sinking of the U.S. Mobile Offshore Drilling Unit Ocean Ranger, February 15, 1982. Washington, D.C.: Government Printing Office, 1983. Official report on the incident and its causes. Leigh Husband Kimmel See also
Natural disasters; Science and technology.
■ O'Connor, Sandra Day

Identification Associate justice of the United States, 1981-2006
Born March 26, 1930; El Paso, Texas

The first female justice of the Supreme Court of the United States, O'Connor was often the swing vote on the Court.

Growing up on a cattle ranch on the New Mexico-Arizona border, Sandra Day O'Connor learned early the necessity of hard work and persistence. A product of Austin High School and Stanford University, she graduated one year early from Stanford Law School, where she served on the Stanford Law Review and finished third in a class of 102. She then learned about discrimination against women firsthand: no law firm would hire her, although one did offer her a position as a legal secretary. While at Stanford, she met her husband, John Jay O'Connor III, with whom she had three children.
■
721
She served briefly as a deputy county attorney in California and a civilian attorney in Germany, and when the couple moved to Phoenix, Arizona, she practiced law until her 1965 appointment as an Arizona assistant attorney general. In 1969, she filled a vacancy in the state senate, subsequently ran successfully for two terms, and became the first woman to serve as senate majority leader in any state. In 1974, she ran successfully for trial judge and in 1979 was appointed to the Arizona Court of Appeals. On August 19, 1981, President Ronald Reagan formally nominated O’Connor to the Supreme Court, where she proved to be a less-than-reliable Republican. Although in her early years she sided with her conservative colleagues, by the mid-1980’s she exhibited more autonomy, filing concurring decisions to majority opinions and becoming known as the “swing vote” on women’s issues and autonomy of states cases. Her record on abortion cases was varied. Impact
As the first (and until 1993 the only) woman Supreme Court justice, O’Connor served as a role model and was closely watched. Her role as a swing voter became increasingly important as the Court became increasingly polarized, giving her a disproportionate level of power on the Court and increasing the importance of her decisions.
Further Reading
Biskupic, Joan. Sandra Day O’Connor: How the First Woman on the Supreme Court Became Its Most Influential Justice. New York: HarperCollins, 2006. O’Connor, Sandra Day, with H. Alan Day. Lazy B: Growing Up on a Cattle Ranch in the American Southwest. New York: Random House, 2002. O’Connor, Sandra Day, with Craig Joyce. The Majesty of the Law: Reflections of a Supreme Court Justice. New York: Random House, 2003. Savage, David G. Eight Men and a Lady: Profiles of the Justices of the Supreme Court. Bethesda, Md.: National Press Books, 1990. Tushnet, Mark. A Court Divided: The Rehnquist Court and the Future of Constitutional Law. New York: W. W. Norton, 2005. Erika E. Pilver See also
Abortion; Conservatism in U.S. politics; Reagan, Ronald; Rehnquist, William H.; Supreme Court decisions.
Associate Justice Sandra Day O’Connor. (Library of Congress)
■ Olson, Clifford

Identification Canadian serial killer
Born January 1, 1940; Vancouver, British Columbia, Canada

Olson murdered eleven people in British Columbia in 1980 and 1981.

Clifford Olson's problems with the law began during his teen years, when he was arrested for theft. This trend continued into adulthood, as he became involved in increasingly violent crimes and spent time in prison. His Canadian infamy, however, began in November, 1980, when, for reasons that remain unclear, he turned to murder. Over the next eight months, he would brutally murder three boys and eight girls, ranging in age from nine to eighteen, in the Canadian province of British Columbia. Several of Olson's victims were sexually assaulted before he killed them. Because the victims did not seem to fit a pattern and because many of the bodies were not immediately discovered, the authorities did not initially realize they were dealing with a serial killer. In August, 1981, Olson was finally arrested by the Royal Canadian Mounted Police.

The Olson case was surrounded by controversy. Some would come to blame pornography for Olson's crimes after he was discovered with such materials at the time of his arrest. The main notoriety, however, related to the deal the Canadian government struck with Olson in order to get him to confess to his crimes and to identify the locations of the bodies of some of the victims. The price of his confession was ten thousand Canadian dollars per victim, to be paid to his wife. The government, recognizing that it lacked evidence of Olson's guilt in all of the murders and that the families of the victims desperately wanted to be able to bury their loved ones (only three bodies had been found at the time of his arrest), agreed to his terms and paid his wife $100,000. (Olson had provided information about one victim for free.) Olson confessed in January, 1982, and was sentenced to eleven concurrent life sentences. A public outcry ensued over a deal in which a murderer appeared to benefit from his crime, but the agreement held. In prison, Olson's notoriety would continue, as he regularly corresponded with the media.

In Canada, a nation that eliminated the death penalty for murder in the 1970's, a conviction for
first-degree murder means twenty-five years in prison. Under the law, however, Olson was allowed to apply for early parole in 1997. He was turned down. Then, after completing his twenty-five-year sentence, Olson was allowed to apply for parole in 2006. He was turned down for parole again, but he was given the option to apply every two years until his death.

Impact

Olson remains the most notorious mass killer in Canadian history, and his crimes have sparked renewed calls for the restoration of the death penalty for murder.

Further Reading
Holmes, W. Leslie, with Bruce L. Northorp. Where Shadows Linger: The Untold Story of the RCMP’s Olson Murders Investigation. Vancouver, B.C.: Heritage House, 2000. Layton, Elliott. Hunting Humans: The Rise of the Modern Multiple Murderer. Toronto: McClelland & Stewart, 1995. Mulgrew, Ian. Final Payoff: The True Price of Convicting Clifford Robert Olson. Toronto: McClelland-Bantam, 1990. Steve Hewitt See also Crime; École Polytechnique massacre; Pornography; San Ysidro McDonald’s massacre.
■ Olympic boycotts

The Event Back-to-back boycotts of the Summer Olympics, one led by the United States and the other by the Soviet Union
Date Summers of 1980 and 1984

Boycotts were an embarrassment to the International Olympic Committee, which could not prevent international politics from interfering with the celebration of the Games. The boycotts also demonstrated the two superpowers' willingness to use athletes as pawns in an effort to build support for their policies.

Soviet troops invaded Afghanistan in late December, 1979. That decision set in motion an American-sponsored boycott of the 1980 Summer Olympics, which were hosted by Moscow. The boycott marked the first time that the United States did not take part in the modern Games. President Jimmy Carter orchestrated the boycott; as early as January 4, 1980, he indicated that any Soviet military presence in Afghanistan guaranteed that his administration would not allow the U.S. Olympic team to participate in the Summer Games, and that he would encourage America's allies to take the same action with their Olympians.
Support and Criticism at Home and Abroad Congress and the American public were among many groups that supported Carter's efforts, especially during the winter of 1980. On January 24, the House of Representatives voted 386 to 12 to support the president's call for either transferring the Games out of Moscow, canceling them, or initiating a boycott. Four days later, the Senate Committee on Foreign Relations unanimously passed a similar resolution. On January 29, the full Senate voted 88 to 4 in favor of the resolution. A national news magazine added that 56 percent of Americans polled favored a boycott, and 68 percent said that the U.S. government should try to have the Games moved from Moscow.

Neither the Soviet Union nor the International Olympic Committee (IOC) believed that Carter had the right to push for a boycott. Olympic officials repeatedly argued that Carter was violating one of the most important tenets of the Olympic movement: the separation of politics and sports. IOC president Lord Killanin said that neither he nor the IOC was condoning what the Soviets had done, "but if we started to make political judgments, it would be the end of the Games."

The United States had until May 24 to officially accept its invitation. However, on March 21, Carter announced that the U.S. team definitely would not attend the Summer Olympics. He said, "I can't say at this moment what other nations will not go to the Summer Olympics in Moscow. Ours will not go. I say that not with any equivocation. The decision has been made."

By early April, Carter's efforts were failing. Western European Olympic athletes refused to follow any calls their governments made to support the boycott, U.S. athletes were divided about its merits, the IOC said the Games would not be moved to another location, and the U.S. Olympic Committee was refusing to help the president impress upon other nations the need for the boycott. However, a majority of its members sided with the president. In a 1,604-to-797 vote, they affirmed that the U.S. Olympic team would not compete in Moscow. Fifty-two countries joined with the United States and did not send their Olympic teams to the Soviet Union. However, many Western countries—Italy, Spain, Australia, and France among them—refused to join the boycott.

Countries Boycotting the 1980 Moscow Olympics

Albania, Antigua and Barbuda, Argentina, Bahamas, Bahrain, Bangladesh, Belize, Bermuda, Bolivia, Canada, Cayman Islands, Chile, China (People's Republic of China), Egypt, El Salvador, Fiji, Gambia, Ghana, Haiti, Honduras, Hong Kong, Indonesia, Iran, Israel, Japan, Kenya, Liechtenstein, Malawi, Malaysia, Mauritania, Mauritius, Monaco, Morocco, Norway, Pakistan, Papua New Guinea, Paraguay, Philippines, Saudi Arabia, Singapore, Somalia, South Korea, Sudan, Swaziland, Taiwan (Republic of China), Thailand, Tunisia, Turkey, United States, Uruguay, Virgin Islands, West Germany, Zaire

"Nyet" to Los Angeles The Soviet Union returned the boycott favor four years later when it refused to send its Olympic team to the 1984 Los Angeles Games. The announcement came on May 8 and blamed the White House for creating an unsafe environment in California for Soviet athletes. The term "boycott" was not used.
Countries Boycotting the 1984 Los Angeles Olympics: Afghanistan, Angola, Bulgaria, Cambodia, Cuba, Czechoslovakia, East Germany, Ethiopia, Hungary, Iran, Libya, Mongolia, North Korea, Poland, Soviet Union, Vietnam
It did not take long for the Soviet Union's allies to join with Moscow. On May 9, Bulgaria said that it would not attend. East Germany's announcement followed one day later. Vietnam and Mongolia joined the boycott list two days later. Czechoslovakia and Laos followed, as did Afghanistan, Hungary, and Poland. Cuba announced on May 23 that it also was staying home; the boycott now stood at eleven nations. Only Romania withstood whatever pressure was placed on the Eastern Bloc nations; it chose to send its Olympic team to Los Angeles.

As was the case in 1980, the IOC faced a terrible reality: It desperately wanted the Olympics to be free of political pressure, but it was powerless to stop such influence. The IOC's actions in 1984 again failed to eliminate what it saw as a poisonous mixing of international politics and athletics. In both years, it criticized the nation that led the boycott and expressed its confidence in the host nation. In both years, it pressed the political leadership of the country leading the boycott to change its mind. In both years, it failed.

Impact
To suggest that the Olympic Games, prior to 1980, had always been free of political pressure would be incorrect. Indeed, the marriage of politics and sports existed from the first modern Games in 1896. However, the actions of the United States in 1980 and the Soviet Union in 1984 marked the first time that nations were willing to use their Olympic athletes to bolster a diplomatic policy. Although the Cold War was coming to an end in the 1980's, the boycotts provided justification for any independent government to make the same decision in the future. Sports had become a tool of international politics, despite the IOC's efforts to separate the two.
Further Reading
Barney, Robert K., Stephen R. Wenn, and Scott G. Martyn. Selling the Five Rings: The International Olympic Committee and the Rise of Olympic Commercialism. Salt Lake City: University of Utah Press, 2002. An excellent text examining the development of political and commercial pressures on the Olympic Games.
Guttmann, Allen. The Olympics: A History of the Modern Games. 2d ed. Urbana: University of Illinois Press, 2002. An easy-to-read, detailed account of the positive and negative events that have affected the growth of the Olympic Games.
Hill, Christopher R. Olympic Politics: From Athens to Atlanta, 1896-1996. Manchester, England: Manchester University Press, 1996. A thoroughly researched discussion of the connection between politics and sports.
Hulme, Derick L., Jr. The Political Olympics: Moscow, Afghanistan, and the 1980 U.S. Boycott. New York: Praeger, 1990. Examines the effectiveness of using international sport as a political instrument. The author leaves no doubt that he holds the 1980 Olympics in low regard because of the actions of the Soviet government.
Anthony Moretti

See also
Cold War; Foreign policy of the United States; Goodwill Games of 1986; Olympic Games of 1980; Olympic Games of 1984; Olympic Games of 1988; Soviet Union and North America; Sports.
■ Olympic Games of 1980
The Event
The staging of winter and summer international athletic competitions, held every four years
Date
Winter Games, February 14-23, 1980; Summer Games, July 19-August 3, 1980
Place
Winter Games, Lake Placid, New York; Summer Games, Moscow, Soviet Union (now in Russia)

The year 1980 marked the quadrennial celebration of the Olympic Games. The Winter Games were hosted by Lake Placid for the second time, while Moscow became the first city governed by a Communist government to host an Olympics.
Leading Medal Winners of the 1980 Winter Olympics

Total Medals | Athlete | Country | Sport | Gold-Silver-Bronze
5 | Eric Heiden | United States | Speed skating | 5-0-0
3 | Hanni Wenzel | Liechtenstein | Alpine skiing | 2-1-0
2 | Ingemar Stenmark | Sweden | Alpine skiing | 2-0-0
2 | Raisa Smetanina | Soviet Union | Cross-country skiing | 1-1-0
Leading Medal Winners of the 1980 Summer Olympics

Total Medals | Athlete | Country | Sport | Gold-Silver-Bronze
8 | Alexander Dityatin | Soviet Union | Artistic gymnastics | 3-4-1
5 | Nikolay Andrianov | Soviet Union | Artistic gymnastics | 2-2-1
4 | Nadia Comaneci | Romania | Artistic gymnastics | 2-2-0
4 | Natalia Shaposhnikova | Soviet Union | Artistic gymnastics | 2-0-2
3 | Yelena Davydova | Soviet Union | Artistic gymnastics | 2-1-0
3 | Uladzimir Parfianovich | Soviet Union | Canoeing | 3-0-0
3 | Vladimir Salnikov | Soviet Union | Swimming | 3-0-0
2 | Sebastian Coe | Great Britain | Track and field | 1-1-0
2 | Steve Ovett | Great Britain | Track and field | 1-0-1
2 | Miruts Yifter | Ethiopia | Track and field | 2-0-0
The stunning gold medal performance of the U.S. men's ice hockey team at the Winter Games provided a morale boost to a country dealing with economic problems at home and political crises around the world, and it helped to cover up an otherwise mediocre performance by the entire U.S. Olympic squad. The absence of more than fifty nations at the Summer Games as the result of a boycott ensured the domination of Soviet and Eastern European athletes.

Successes and Setbacks at Lake Placid
The expectations for the Winter Games, hosted by Lake Placid, New York, were immediately put into doubt because of a chronically failing transportation system. The small roads and inadequate transportation ensured that many spectators either arrived late to their events or never made it to them at all. Attendance at the opening ceremonies, for example, was at less than capacity because many people were stranded
and unable to get to the stadium. Compounding the problems at the beginning of the Olympics was an unusual weather pattern that left almost no snow on the ground. Artificial snow was made in order to cover the numerous bare spots, but the conditions did not negatively affect the athletes' performances. They especially did not bother Swedish skier Ingemar Stenmark, who won gold medals in both the giant slalom and slalom. In the slalom, Stenmark held off American Phil Mahre, winning by half a second.

A total of thirty-seven nations and 1,072 athletes made up the Lake Placid Olympics. When the Games were over, the Soviet Union, as expected, had won more gold medals (ten) than any other nation. Its haul of twenty-two total medals was one fewer than East Germany's. The United States finished third in the medal count, with only six gold and twelve total medals.

At Lake Placid, the People's Republic of China
participated in the Olympic Games for the first time in thirty years. China had been a member of the International Olympic Committee (IOC) prior to the Communist takeover of the country in 1949. Soon thereafter, the Communists formed a new National Olympic Committee, arguing that they were the legitimate and legal authority over all Chinese sports. However, the Nationalist Chinese, who had fled to Taiwan, maintained that they were the leaders of all international sports in China. The IOC and the Communist and Nationalist Chinese debated this issue for almost a decade. Then, in 1958, believing that the IOC was not going to grant it control over the Olympic program in Taiwan, China withdrew from the IOC and the various international federations that governed individual sports. The Chinese did not win any medals at Lake Placid, but their return to the Olympics was important for the IOC because the absence of China—which represented one of the world's largest populations—had many IOC members believing that the Olympics was less than a truly international event.

The United States won six gold medals, but one team and one man were responsible for them. The men's hockey team won one, while speed skater Eric Heiden stood atop the medal podium after all five events in which he participated. What made Heiden's feats even more amazing was that he set Olympic records in all five distances. He capped off his year by being named the winner of the James E. Sullivan Award, given to the top amateur athlete in the United States. The twenty-one-year-old from Wisconsin had ensured his place in the annals of great Olympians. A few years later, Heiden demonstrated his athletic prowess and versatility when he participated in the preeminent event in cycling, the Tour de France.

There seemed to be no question that the American figure skating pair of Tai Babilonia and Randy Gardner should have won a medal, probably gold; they were the reigning world champions in their event, and it seemed that they were the team to end the Soviet Union's domination of the event. However, Gardner suffered a leg injury during a pre-Olympic practice, and then he aggravated it just prior to the Games. As a stunned crowd watched—both in the stadium and on television—he tried to warm up but could not. He fell while attempting to complete a jump; then he almost dropped his partner while trying to lift her. Moments later, an
announcement indicating that Gardner and Babilonia were pulling out of the competition was made. Babilonia cried as she left the ice. A few days later, Irina Rodnina and her husband Alexander Zaitsev won the gold medal for the second straight time. The win gave Rodnina her third consecutive gold medal in that event. Rodnina also won ten World Figure Skating Championships in a row in pairs figure skating.

Another American skater, Linda Fratianne, also entered Lake Placid with a reasonable expectation of gold. However, like Babilonia and Gardner, she came up short on the Olympic stage. Fratianne had won four consecutive U.S. championships in ladies' figure skating; she also had finished first at the World Figure Skating Championships in 1977 and 1979. Her rival, East German Anett Pötzsch, narrowly won the gold medal. Pötzsch had become the world champion in 1978, preventing Fratianne from being a three-time winner at that competition. Perhaps still suffering from a letdown, Fratianne later finished third at the 1980 world championships. She became a professional skater soon after. Another four-time U.S. champion, Charles Tickner, also found little luck in Lake Placid. He finished third in the men's figure skating event. Britain's Robin Cousins won the gold medal.

Soviets and Their Allies Dominate in Moscow
The Olympic spotlight left Lake Placid in late February; its summer showcase began five months later in Moscow. More than five thousand athletes representing eighty nations competed. Because of the U.S.-led boycott, this was the smallest number of nations at a Summer Olympics in twenty-four years. No U.S. athlete took part in the Summer Games, ensuring that the host Soviet Union and its allies would be without serious competition in many of the events in which they took part. The results bore out that fact: Soviet athletes won more medals (80 gold, 195 total) than athletes from any other country. Perhaps the most impressive Soviet Olympian was men's gymnast Alexander Dityatin, who took home eight medals (three gold), more than any other athlete in a single Games. Soviet swimmer Vladimir Salnikov won three gold medals. His best performance came in the 1,500-meter freestyle, in which he became the first person to swim that race in less than fifteen minutes. Soviet canoeist Uladzimir Parfianovich also won three gold medals in his sport. Soviet women also fared well, with gymnasts Natalia
Shaposhnikova (two gold, four total) and Yelena Davydova (two gold, three total) among the most decorated female athletes. Soviet and East German women nearly completely rewrote the track-and-field record book. Nine records were set in Moscow, four by Soviet women and the other five by the East Germans. East German women also set eleven Olympic records in swimming. Rumors that East Germany had used steroids and other performance-enhancing drugs to shape its Olympic champions were proven true after the collapse of the Soviet empire in the early 1990's.

Meanwhile, Cuban boxer Teófilo Stevenson became the first man to win three straight gold medals in the heavyweight division, leaving many members of the world's media (especially in the United States) to wonder what kind of professional Stevenson might have been if he were allowed to leave Cuba.

British runners Steve Ovett and Sebastian Coe engaged in their own kind of Cold War competition during the Moscow Olympics. Despite sharing a homeland, the men were not friendly, and they had not raced in the same event in two years. Coe was the world's best at the 800 meters, and Ovett was the undisputed king at 1,500 meters. Entering the Olympics, Ovett had not lost a race at that distance in almost three years. In the 800 meters final, Ovett trailed for most of the race but used a late charge to overtake Coe and win the gold medal. Coe took home the silver. Six days later, Ovett, who held the world record in the 1,500 meters, was expected to win his second gold. However, Coe chased him down and finished first. Ovett faded and ended up third.

American athletes were not the only ones denied a chance at success in Moscow. So, too, was the National Broadcasting Company (NBC), the network that had paid more than $72 million for the rights to broadcast the Olympics in the United States. Because of the economic sanctions President Jimmy Carter imposed on the Soviet Union after its military invasion of Afghanistan, NBC was not able to meet the terms of its contract and abandoned plans to cover the Games on a daily basis. Estimates of the network's losses because of the boycott ranged from $15 million to $40 million.

Impact
The symbolic power of sports might never have been more evident than during the 1980
Olympic year. At Lake Placid, a group of young men, "college kids" as they were affectionately called in the United States, took down the mighty Soviet hockey team and every other team they faced to win a gold medal. In doing so, they provided a symbolic demonstration of the vitality of the U.S. political and social system. Then, in Moscow, the Soviet government was able to make the same argument about its political and social system because of the record-setting efforts of Soviet athletes. The idea that the Olympics were just games and celebrations of sport appeared hollow. Only the most idealistic of IOC members could make the case that their organization had successfully kept politics from interfering with sports.

Further Reading
Guttmann, Allen. The Olympics: A History of the Modern Games. 2d ed. Urbana: University of Illinois Press, 2002. An easy-to-read, detailed account of the positive and negative events that have affected the growth of the Olympic Games.
Hazan, Baruch. Olympic Sports and Propaganda Games: Moscow 1980. New Brunswick, N.J.: Transaction Books, 1982. A well-researched book that examines multiple facets of the 1980 Summer Games.
Novikov, I. T., ed. Games of the XXII Olympiad, Moscow, 1980: Official Report of the Organizing Committee of the Games of the XXII Olympiad, Moscow, 1980. Moscow: Fizkultura i Sport, 1981. This official report glosses over deficiencies associated with the 1980 Summer Games. However, it is a strong primary source document for anyone studying the Olympics.
Riordan, James. Soviet Sports Background to the Olympics. London: Washington News Books, 1980. Riordan is a prominent researcher and has written substantially about the Soviet Union and its sports programs. This book examines why the Soviet Union placed such importance on succeeding at the Olympics.
Anthony Moretti

See also
Goodwill Games of 1986; Hockey; Miracle on Ice; Olympic boycotts; Olympic Games of 1984; Olympic Games of 1988; Soviet Union and North America; Sports.
■ Olympic Games of 1984
The Event
The 1984 staging of winter and summer international athletic competitions, held every four years
Date
Winter Games, February 8-19, 1984; Summer Games, July 28-August 12, 1984
Place
Winter Games, Sarajevo, Yugoslavia; Summer Games, Los Angeles, California

The Sarajevo Games were the first Winter Olympics held in a Socialist country and the first Olympic Games under International Olympic Committee president Juan Antonio Samaranch. The Summer Olympics were a success despite the boycott of sixteen nations, including the Soviet Union and its allies.

Following the successful U.S.-led boycott of the 1980 Moscow Games to protest the Soviet invasion of Afghanistan, the Soviet Union in turn led a boycott of the 1984 Los Angeles Games. While the Soviet-led boycott had a significant impact, the 1984 Games saw a record turnout of 140 nations. Also, whereas the Montreal Games of 1976 proved financially disastrous for the city, the Los Angeles Games demonstrated that hosting the Olympics could indeed prove lucrative.

Winter Olympic Games
On February 8, 1984, the opening ceremony of the fourteenth Winter Games took place in the host city of Sarajevo, in the Socialist Federal Republic of Yugoslavia. The Games would provide competition until the closing ceremony on February 19. Forty-nine nations were represented, with a total of 1,272 athletes, 274 women and 998 men. Thirty-nine events were on the Olympic program, which included a new event, 20-kilometer women's Nordic skiing. Disabled skiing was added as a demonstration event. The Winter Games received a lucrative television contract of $91.5 million from the American Broadcasting Company (ABC), as compared to $15.5 million for television rights to the 1980 Lake Placid Winter Games. The host city received $60 million of this, thus offsetting any economic burdens of hosting the Games.

Gaétan Boucher of Canada compiled three medals in men's speed skating. He earned a bronze medal in the 500 meter and gold medals in the 1,000 meter and the 1,500 meter. Marja-Liisa Kirvesniemi of Finland won all three events in women's cross-country skiing. She became the only woman to compete in six Olympic Games.
American Carl Lewis participates in the qualifying round of the men’s long jump during the 1984 Olympic trials in Los Angeles. (AP/Wide World Photos)
Perhaps one of the most well-received performances of the Winter Olympics came in the free dance of the ice dancing competition, when Jayne Torvill and Christopher Dean of the United Kingdom achieved a perfect score for artistic impression. American athletes had success in alpine skiing. Twin brothers Phil and Steve Mahre placed first and second in the slalom. Bill Johnson became the first American to win an Olympic downhill event. Scott Hamilton of the United States won a gold medal in men's figure skating. East Germany won the most gold medals (nine) and a total of twenty-four medals. The Soviet Union won six gold medals and the most medals with twenty-five. The United States won a total of
eight medals and was tied with Finland and Sweden in winning four gold medals. Canada won two gold medals and a total of four medals.

Summer Olympic Games
The twenty-third Summer Olympics were held in Los Angeles from July 28 to August 12, 1984. A total of 6,829 athletes, 1,566 women and 5,263 men, were in attendance, representing 140 nations. The Olympic program featured 221 events in twenty-three sports. Seventeen new events were added to the Olympic program, thirteen of them for women. Among the new events for women were the marathon, 3,000-meter run, 400-meter hurdles, synchronized swimming, three rifle competitions, rhythmic gymnastics, and individual road racing in cycling. In addition, tennis
returned for the first time since the 1924 Olympics as a demonstration sport. Baseball was held as an exhibition sport. President Ronald Reagan officially opened the Games.

Los Angeles had been selected to host the 1984 Summer Olympic Games on May 18, 1978, during the eightieth International Olympic Committee (IOC) meeting in Athens, Greece. Los Angeles was the only city to bid for the 1984 Summer Games. The $1 billion debt that Montreal had incurred hosting the 1976 Olympic Games deterred many nations from submitting bids to host the Games, and the $9 billion required to host the 1980 Moscow Games raised concern that the future of the Games might be in danger of economic disaster.
Leading Medal Winners of the 1984 Winter Olympics

Total Medals | Athlete | Country | Sport | Gold-Silver-Bronze
4 | Karin Kania | East Germany | Speed skating | 2-2-0
4 | Marja-Liisa Kirvesniemi | Finland | Cross-country skiing | 3-0-1
4 | Gunde Anders Svan | Sweden | Cross-country skiing | 2-1-1
3 | Gaétan Boucher | Canada | Speed skating | 2-0-1
3 | Andrea Ehrig | East Germany | Speed skating | 1-2-0
2 | Thomas Wassberg | Sweden | Cross-country skiing | 2-0-0
2 | Tomas Gustafson | Sweden | Speed skating | 1-1-0
2 | Matti Nykänen | Finland | Ski jumping | 1-1-0
2 | Jens Weissflog | East Germany | Ski jumping | 1-1-0
Leading Medal Winners of the 1984 Summer Olympics

Total Medals | Athlete | Country | Sport | Gold-Silver-Bronze
4 | Carl Lewis | United States | Track and field | 4-0-0
4 | Michael Gross | West Germany | Swimming | 2-2-0
3 | Valerie Brisco-Hooks | United States | Track and field | 3-0-0
3 | Mary T. Meagher | United States | Swimming | 3-0-0
3 | Reiner Klimke | West Germany | Equestrian/dressage | 3-0-0
2 | Greg Louganis | United States | Diving | 2-0-0
2 | Marco Marin | Italy | Fencing | 1-1-0
2 | Sebastian Coe | Great Britain | Track and field | 1-1-0
2 | Merlene Ottey | Jamaica | Track and field | 0-0-2
Voters in Los Angeles voted by a margin of 5-1 against using municipal funds to support the hosting of the 1984 Games. Peter Ueberroth, president of the Los Angeles Olympic Organizing Committee, was presented with the challenge of organizing the Games with a budget of $500 million. Contributing to the budget was a lucrative television contract bringing in $225 million. The private sector, for the first time in the Olympic movement, provided a significant contribution of $140 million. The McDonald's Corporation financed the construction of a swimming stadium at the University of Southern California (USC). Other companies, such as Coca-Cola, United Airlines, Anheuser-Busch, and Seven-Eleven, provided contributions.

An advantage that Los Angeles had over Montreal was the opportunity to use existing venues. The Los Angeles Coliseum was used for opening and closing ceremonies as well as track-and-field events. Existing stadiums such as the Rose Bowl and Stanford Stadium, as well as stadiums located on the East Coast at Harvard and Annapolis, were utilized for staging soccer events. The Los Angeles Memorial Sports Arena was used for boxing, and the Forum for basketball. Dodger Stadium was used for baseball competition. Aquatic events were conducted at the newly constructed swimming stadium at USC as well as existing venues at Pepperdine University. Equestrian events were held at the Santa Anita racetrack.

The Summer Games were an economic success. The Games were attended by six million spectators and had a record-breaking television audience. The championship soccer game between France and Brazil had an attendance of 101,799 spectators, the largest crowd ever to watch a soccer game in the United States. The championship baseball game between the United States and Japan was held at Dodger Stadium; it attracted just one thousand fewer fans than the all-time record for a single game, set by the Los Angeles Dodgers in a World Series game. The Summer Games resulted in a profit of $225 million.

Political Controversies
On December 28, 1979, the Soviet Union invaded Afghanistan. The United States officially condemned the invasion. As a result, President Jimmy Carter called for all free nations to boycott the 1980 Moscow Olympic Games. More than fifty nations boycotted the Moscow Games, resulting in the lowest athletic attendance of the Olympic Games since 1956. In response to the boycott, the
Soviet Union announced on May 8, 1984, that it, along with fourteen other Eastern Bloc nations, would boycott the Summer Games. Iran and Libya joined the boycott as well. The absence of these nations was significant in that they had accounted for 58 percent of the gold medals won during the 1976 Olympic Games. Romania and Yugoslavia did not join in the boycott, and Romania compiled a national record of fifty-three medals. The People's Republic of China was in attendance for the first time since 1952. After China walked out of the 1976 Winter Olympics because of Taiwan's participation under the name "Republic of China," the IOC later recognized the People's Republic as China and Taiwan as "Chinese Taipei." Moscow responded by hosting the Friendship Games, an alternative to the Los Angeles Olympics, between July and August of 1984.

Performances
The United States was the overwhelming medals leader with a total of 174 medals, including 83 gold medals; Romania had the second-highest accumulation of gold medals with 20 and a total of 53; West Germany came in third with 17 gold medals and a total of 59 medals; Canada came in sixth with 10 gold medals and a total of 44 medals.

Carl Lewis of the United States became the first man to win four gold medals in track and field—100 meter, 200 meter, 4-by-100 relay, and long jump—since Jesse Owens accomplished the feat in 1936. Valerie Brisco-Hooks of the United States became the first woman Olympian to win both the 200 and 400 meters. Joan Benoit of the United States won the first gold medal in the women's marathon. Achievements of American athletes included winning 9 of 12 gold medals in boxing and 21 of 34 gold medals in swimming. Mary Lou Retton became the first gymnast outside Eastern Europe to win the gymnastics all-around competition. Peter Vidmar became the first American to win an individual gold medal in men's gymnastics in fifty-two years. In gymnastics, the men won the team gold medal and the women won the team silver medal. Greg Louganis of the United States became the first athlete in fifty-six years to win both diving events. Michael Jordan and Cheryl Miller led the men's and women's basketball teams to gold medals. Canadian swimmers won three gold medals. Michael Gross of West Germany won two gold medals with world-record times in the 200-meter freestyle and 100-meter butterfly. Nawal El Moutawakel of
Morocco won a gold medal in the women's 400-meter hurdles, becoming the first woman champion from an Islamic country. Ulrike Meyfarth of West Germany became the oldest person ever to win a track-and-field gold medal in the Olympics, in the women's high jump. In 1972, she had also become the youngest.

Impact
The 1984 Winter and Summer Olympics demonstrated that the Games could be staged without economic turmoil. Fueled by lucrative television contracts, the 1984 Games were financially successful, demonstrating that with contributions from the private sector and utilization of existing athletic venues, the hosting of the Games could be economically feasible. As a result, several nations provided bids to hold subsequent Olympic Games. The significant attendance at the soccer games at the Los Angeles Games demonstrated that soccer tournaments held in the United States would be well received and contributed to the United States' hosting the 1994 FIFA World Cup of soccer.

Further Reading
Albertson, Lisa H., ed. Athens to Atlanta: One Hundred Years of Glory. Salt Lake City, Utah: Commemorative Publications, 1995. This text is licensed by the U.S. Olympic Committee to provide an overview of the Summer Olympics from 1896 to 1996.
Espy, Richard. The Politics of the Olympic Games. Berkeley: University of California Press, 1979. Provides a historical account of the political, economic, and social forces that have influenced the conduct of the Olympic Games.
Frommer, Harvey, Myrna Frommer, and Mary Gaddie. Games of the Twenty-Third Olympiad: Los Angeles 1984 Commemorative Book. Salt Lake City, Utah: International Sport Publications, 1984. Commemorative text of the 1984 Summer Olympics that is officially sanctioned by the International Olympic Committee.
Hugman, Barry J., and Peter Arnold. The Olympic Games: Complete Track and Field Results, 1896-1988. New York: Facts On File, 1988. Provides an overview of athletes and the medal winners in track and field.
Ueberroth, Peter. Made in America. New York: William Morrow, 1985. Written by the president of the Los Angeles Olympic Organizing Committee.
Wenn, Stephen R. "Conflicting Agendas: Monique Berlioux, Ahmed Karabegovic, and U.S. Television Rights Negotiations for the 1984 Sarajevo
Olympic Winter Games." Fourth International Symposium for Olympic Research, October, 1998, 115-128. Provides an analysis of major American television stations vying for the television rights to the Sarajevo Winter Games.
Wilson, Harold E., Jr. "The Golden Opportunity: Romania's Political Manipulation of the 1984 Los Angeles Olympic Games." Olympika: The International Journal of Olympic Studies 3 (1994): 83-97. Provides an in-depth analysis of Romania's defiance of the Soviet boycott of the Games.
Alar Lipping

See also
Cold War; Goodwill Games of 1986; Lewis, Carl; Louganis, Greg; Olympic boycotts; Olympic Games of 1980; Olympic Games of 1988; Retton, Mary Lou; Soviet Union and North America; Sports; Ueberroth, Peter.
■ Olympic Games of 1988
The Event
The 1988 staging of winter and summer international athletic competitions, held every four years
Date
Winter Games, February 13-28, 1988; Summer Games, September 17-October 2, 1988
Place
Winter Games, Calgary, Alberta, Canada; Summer Games, Seoul, South Korea

The 1988 Calgary Games witnessed dominating performances by the Soviet Union and East Germany but disappointing results for the United States. The truly competitive Seoul Games were noted for impressive athleticism in multiple events as well as controversies over steroid use.

Unlike the 1980 and 1984 Summer Olympics, there was no widespread boycott in 1988. Athletes from both sides of the Cold War competed at both Calgary (Winter Games) and Seoul (Summer Games). Scandal following the prestigious men's 100-meter sprint in track and field brought front and center the rampant rumors of the illegal use of performance-enhancing drugs.

Winter Games
The Calgary Games were the first to be scheduled over sixteen days, ensuring that viewers worldwide would receive three weekends of Olympic activities. Those watching at home saw 1,423 athletes from fifty-seven nations compete.
Top Standings for the 1988 Winter Olympics

Medals Won | Country | Gold | Silver | Bronze
29 | Soviet Union | 11 | 9 | 9
25 | East Germany | 9 | 10 | 6
15 | Switzerland | 5 | 5 | 5
10 | Austria | 3 | 5 | 2
8 | West Germany | 2 | 4 | 2
7 | Finland | 4 | 1 | 2
7 | Netherlands | 3 | 2 | 2
6 | Sweden | 4 | 0 | 2
6 | United States | 2 | 1 | 3
5 | Italy | 2 | 1 | 2
5 | Norway | 0 | 3 | 2
5 | Canada | 0 | 2 | 3
Top Standings for the 1988 Summer Olympics

Medals Won | Country | Gold | Silver | Bronze
132 | Soviet Union | 55 | 31 | 46
102 | East Germany | 37 | 35 | 30
94 | United States | 36 | 31 | 27
40 | West Germany | 11 | 14 | 15
35 | Bulgaria | 10 | 12 | 13
33 | Korea | 12 | 10 | 11
28 | China | 5 | 11 | 12
24 | Romania | 7 | 11 | 6
24 | Great Britain | 5 | 10 | 9
23 | Hungary | 11 | 6 | 6
16 | France | 6 | 4 | 6
16 | Poland | 2 | 5 | 9

Demonstrating once again (though for the last time) that it was the most dominant nation in the Olympics, the Soviet Union won eleven gold and twenty-nine total medals. Its political ally East Germany came in second in both categories (nine and twenty-five, respectively). The Soviet Union collapsed in 1991, and athletes from the former socialist state competed either for new nations or for the Commonwealth of Independent States in 1992.
Calgary will also be remembered as the Games in which freestyle skiing, curling, and short-track speed skating made their Olympic debuts. It was the city where the charismatic Italian skier Alberto Tomba won the gold medal in both slalom and giant slalom and then announced that he wished to date East German figure skater Katarina Witt. He even offered to give her one of his medals if she did not win her own gold. Finland’s Matti Nykänen won three gold medals in ski jumping, and Dutch speed skater Yvonne van Gennip also took home three gold medals. The host country did not win any gold medals, but perhaps the biggest winners were future Canadian Olympic hopefuls. The Calgary Games turned a profit estimated between $90 and $150 million, and these funds were used to ensure that the Olympic facilities in the city remained at their best when the Olympic athletes went home.
U.S. Shortfalls
Far down the medal table was the United States. Indeed, Calgary was not kind to American athletes, who took home fewer medals (two gold, six total) than had been forecast. Speed skater Bonnie Blair won one of those gold medals, in the 500 meter, and she also took home a bronze medal in the 1,000 meter. Figure skater Brian Boitano earned the other gold medal, narrowly defeating Canadian Brian Orser, the silver medalist. The so-called Battle of the Brians made for great television viewing, with the home crowd booing lustily when the final results were announced. Orser led after the short program, but Boitano delivered a stunning, technically perfect long program that forced Orser to be perfect to have a chance to win. He almost delivered, but a slight error during one of his jumps was critical. Boitano won gold, and Orser, who also won the silver medal in 1984, was left wondering what might have been.
Greg Louganis hits his head on the diving board during the men’s springboard competition at the 1988 Summer Olympics. (Hulton Archive/Getty Images)
Perfection also eluded an American female figure skater, Debi Thomas, who finished without a gold medal. Like her principal competitor, East Germany's Witt, Thomas skated her long program to music from Georges Bizet's Carmen. However, a lackluster effort, which included an inability to complete three triple jumps, left her with the bronze medal. Witt did not dazzle either during her long program, but she still won the gold medal, defending the title that she had won during the 1984 Winter Games. Witt became a four-time world champion in her sport in 1988. Perhaps the most tragic figure of the Games was American speed skater Dan Jansen. He was one of the favorites in the 500 meter, and he also stood a realistic chance of earning a medal in the 1,000 meter. However, on the morning of the first event, he
learned that his sister had died of leukemia. He chose to skate but fell on the first turn. A few days later, he was leading the 1,000 meter when he fell again. However, Jansen never publicly complained about what had happened to him, nor did he seek any special treatment.

Summer Games
The 1988 Summer Games in Seoul, South Korea, marked the first time in twelve years that no organized, widely supported boycott took place. One hundred fifty-nine nations and 8,391 athletes competed in the Games. The Soviet Union once again dominated the competition, winning 55 gold and 132 total medals. The United States finished third, taking home 36 gold and 94 total medals. Those results were considered unacceptable by the American public, which was still
fondly remembering the great performances by U.S. athletes four years earlier in Los Angeles. Forgotten by many was the absence from those Games of Soviet and Eastern European athletes.

Performances and Controversies
The men's 100-meter sprint in track and field is probably the most anticipated event of any Summer Games. Sprinters race to determine who is the world's (unofficial) fastest human being. That title appeared to belong to Canadian Ben Johnson, who set a world record at Seoul when he finished in 9.79 seconds. However, Johnson was later disqualified and stripped of his medal and his record after testing positive for stanozolol, an anabolic steroid. Not long after the Olympics, he lost whatever was left of his reputation by acknowledging that he had taken performance-enhancing drugs for several years. The Jamaican-born athlete went from hero to villain in his adopted Canada.

American Carl Lewis, who finished second to Johnson, was given the gold medal. Lewis also won the long jump, giving him six career Olympic gold medals. He won two additional events during the 1992 Summer Olympics and was considered by many to be the greatest U.S. track-and-field athlete. Lewis was upstaged by the controversy surrounding Johnson. He also was upstaged by another American sprinter, Florence Griffith-Joyner, who won three gold medals and one silver medal at Seoul, and whose flamboyant outfits and nail polish made her a media celebrity. Later that year, she won the James E. Sullivan Award, which recognizes the top amateur athlete in the United States.

Another athlete surrounded by controversy was U.S. diver Greg Louganis. At Seoul, the returning champion competed in the springboard and platform events, securing his place among history's best divers. However, Louganis struck his head against the springboard during one of his dives, requiring stitches. Despite the accident, he went on to win the gold in both events. Years later, it was learned that Louganis had HIV at the time of the accident. Some critics wondered whether he should have revealed his medical condition, although it should be noted that no one who entered the pool after him was put in danger of contracting the virus.

Hungarian swimmer Krisztina Egerszegi burst onto the international scene in Seoul, where she won a gold and a silver medal in the two backstroke events. What made these accomplishments noteworthy was
that Egerszegi was only fourteen years old. Seventeen-year-old American Janet Evans won three gold medals and cemented her reputation as one of the greatest all-time distance swimmers. Despite these impressive performances, it was East Germany's Kristin Otto who dominated the pool, taking home six gold medals—a record for any woman in any sport. Finally, tennis made its return to the Olympic stage in 1988 after a sixty-four-year hiatus. Germany's Steffi Graf, the world's best player in her sport, won the gold medal.

Media Coverage
The American television network the National Broadcasting Company (NBC) once again experienced bad luck with its broadcasting plans. (The network had lost millions of dollars because of the 1980 Olympic boycott.) NBC invested $300 million to gain the broadcast rights to the Summer Games, but a fourteen-hour time difference between Seoul and the eastern United States combined with the uneven performance by American athletes to bring the network lower-than-expected ratings. Meanwhile, American newspapers reported that the South Korean government was angry that NBC provided substantial coverage of a boxing judge being attacked by several South Korean coaches, trainers, and others after one of their boxers lost a match but did not sufficiently discuss the questionable judging associated with the bout. Media reports also indicated that the South Koreans were disappointed at NBC's decision to ignore almost completely the theft of an $860 mask from a Seoul hotel by three American athletes. NBC was also chastised for airing a critical story about North Korea, which was not taking part in the Games.

Impact
The 1988 Winter and Summer Olympics were the last in which athletes from the United States and the Soviet Union competed against each other, as the Soviet Union dissolved just prior to the 1992 Olympic year. At the same time, 1988 marked a pivotal turning point in the long-running but never sufficiently resolved discussion of how to classify an amateur athlete. From this point forward, the Games essentially would be open to all athletes, including openly professional ones.

Further Reading
Buchanan, Ian, and Bill Mallon. Historical Dictionary of the Olympic Movement. Lanham, Md.: Scarecrow
Press, 1995. A good source of general information about the Olympic Games, though it lacks depth on any specific topic.
De Moragas Spa, Miquel, Nancy K. Rivenburgh, and James F. Larson. Television in the Olympics. London: John Libbey, 1995. Examines how television became a critical player in the development and presentation of the Olympics.
Klatell, David A., and Norman Marcus. Sports for Sale: Television, Money, and the Fans. New York: Oxford University Press, 1988. Covers a wide sweep of sports on television and focuses on the difficulty of making a profit at the Olympics and other sports events.
Larson, James F., and Heung-Soo Park. Global Television and the Politics of the Seoul Olympics. Boulder, Colo.: Westview Press, 1993. Excellent book that provides numerous examples and insights into how South Korea's Olympic organizers used the media to advance their agenda for the 1988 Summer Games.
Orser, Brian, and Steve Milton. Orser: A Skater's Life. Toronto: Key Porter Books, 1988. Aimed at a general audience, this autobiography highlights the career of Canadian figure skater Brian Orser.
Ungerleider, Steven. Faust's Gold: Inside the East German Doping Machine. New York: St. Martin's Press, 2001. Examines the extent to which the East Germans were willing to go to become an international power at the Olympics. The consequences of these immoral practices still haunt many former East German Olympians.
Anthony Moretti

See also
Boitano, Brian; Canada and the United States; Goodwill Games of 1986; Griffith-Joyner, Florence; Lewis, Carl; Louganis, Greg; Olympic boycotts; Olympic Games of 1980; Olympic Games of 1984; Sports.
■ On Golden Pond
Identification
American film
Director
Mark Rydell (1934-    )
Date
Released December 4, 1981
One of a few subdued family dramas among the blockbuster films of the 1980's, triple Academy Award-winning On Golden Pond featured legendary screen actors in a
sensitive portrayal of old age—and of an American family attempting to overcome resentments and common intergenerational conflicts.

Adapted for film by playwright Ernest Thompson from his original Broadway play, On Golden Pond was one of the highest-grossing films of 1981, winning three Oscars out of ten Academy Award nominations. The film centers on an older, long-married couple—crotchety retired professor Norman Thayer (played by Henry Fonda) and his steadfast and spirited wife, Ethel (Katharine Hepburn)—summering at their lakeside cottage in New England. Norman, daunted by sight, memory, and heart problems, uses sarcasm to distance others; Ethel, the stabilizing force of the family, loves her husband wholeheartedly but is often exasperated by him and his antagonistic relationship with their adult child. To celebrate Norman's eightieth birthday, the Thayers' estranged daughter, Chelsea (Jane Fonda), a middle-aged divorcée, visits from California with her dentist boyfriend, divorcé Bill Ray (Dabney Coleman), and his thirteen-year-old son, Billy (Doug McKeon). When the two middle-agers depart for a European vacation, Billy stays with the senior Thayers for a month.

Underlying the birthday party, board games, berry picking, swimming, and fishing excursions that constitute the action of this quiet film are key generational and family issues of the 1980's, including the health and quality of life of an aging senior population, widespread divorce, family blending via remarriage, and coming of age in a media-driven decade. While other 1980's films depict families distraught by suicide, fatal illness, or the death of a child, On Golden Pond explores more common domestic problems, offering glimpses into an American family's multifaceted inner workings: The film reveals love and contention within a solid marriage, sparring and reassurance in family celebrations, and intimate conversations wherein family members sort out feelings. Arguments peppered with humor make On Golden Pond notably lighter in tone than many of its 1980's film counterparts, as do awkward interactions that turn unexpectedly frank and insightful.

During and since the 1980's, critics and some audiences have taken issue with the sentimentality and predictable, tidy plot of On Golden Pond. Indeed, by the story's end, Billy and Norman have bonded over fishing and salty language, Chelsea has begun to make peace with her father, Norman and Ethel have
weathered a coronary episode, and a boating accident has made life seem dearer. Nevertheless, to the film's supporters in the 1980's, On Golden Pond suggested that the decade's complicated social and familial problems might find positive resolutions.

Impact
As depicted in On Golden Pond, the 1980's family—for all its friction and strain—remained a viable structure for growth and emotional sustenance. The film is also notable as the last to feature legendary actor Henry Fonda, whose death a year later helped mark the passing of Hollywood's golden age.

Further Reading
Constantakis, Sarah, ed. Drama for Students: Presenting Analysis, Context, and Criticism on Commonly Studied Dramas. Vol. 23. Detroit: Thomson Gale, 2006.
Shale, Richard. The Academy Awards Index: The Complete Categorical and Chronological Record. Westport, Conn.: Greenwood Press, 1993.
Wendy Alison Lamb

See also
Academy Awards; Age discrimination; Feminism; Film in the United States; Marriage and divorce; Ordinary People; Terms of Endearment; Theater.
■ O’Neill, Tip
Identification
Speaker of the U.S. House of Representatives, 1977-1987
Born
December 9, 1912; Cambridge, Massachusetts
Died
January 5, 1994; Boston, Massachusetts

O'Neill was the second-longest-serving Speaker of the House of Representatives, working under Presidents Jimmy Carter and Ronald Reagan.

Thomas Philip "Tip" O'Neill, Jr., was a tough, outspoken politician from a bygone era. A die-hard liberal who led the Democratic Party into the Ronald Reagan era, O'Neill served as Speaker of the House from 1977 until 1987. Although O'Neill had worked to support the Jimmy Carter administration, the party had faltered amid economic, international, and political crises, and much of O'Neill's effort during the Reagan era centered on strengthening his party. The political relationship between O'Neill and Reagan was a rocky one. O'Neill disagreed with Reagan's domestic and defense policies and believed that Reagan was ignorant of the intricacies of government.

Reagan's first act as president was to issue substantial tax cuts. O'Neill believed that this move was a mistake, but he also believed that the American people would not turn their backs on the president's program until they had seen it fail. He felt that, given time, the plan would create larger budget deficits. In a few months, it became apparent that O'Neill had been right: The tax cuts had not worked, and a serious recession developed in 1982. Reagan made another error when he tried to reduce Social Security benefits for people who had chosen to leave the workforce. The public loudly voiced its opposition to this plan, and O'Neill's popularity, and that of the Democratic Party, rose.

In 1984, O'Neill was elected to serve a fifth term as Speaker. During his last two years in office, he experienced success in foreign policy. He led a bipartisan U.S. delegation to the Soviet Union to meet with Soviet leader Mikhail Gorbachev regarding arms control. He also played an important role in the passage of the Immigration Reform and Control Act of 1986, which created sanctions against employers who knowingly hired illegal immigrants and granted amnesty to millions of undocumented workers who had established roots in the United States before 1982.
Speaker of the House Tip O’Neill accuses President Ronald Reagan of ignoring the plight of starving people in Africa during a 1984 press conference. (AP/Wide World Photos)
One of O'Neill's greatest accomplishments was his participation in the Anglo-Irish Agreement between the United Kingdom and the Republic of Ireland, which aimed to bring an end to the fighting in Northern Ireland. O'Neill retired as Speaker of the House in 1987. He was later diagnosed with colon cancer.

Impact
Tip O'Neill had a reputation as an outgoing liberal Democrat who was a master at convincing representatives to pass key legislation. O'Neill enjoyed national recognition yet remained loyal to his constituents, and he played a prominent role in assisting the poor and least-privileged Americans.

Further Reading
Farrell, John A. Tip O'Neill and the Democratic Century. Boston: Little, Brown, 2001.
O'Neill, Thomas P. Man of the House: The Life and Political Memoirs of Speaker Tip O'Neill. New York: Random House, 1987.
Maryanne Barsotti

See also
Dukakis, Michael; Elections in the United States, 1980; Elections in the United States, 1984; Ferraro, Geraldine; Haig, Alexander; Immigration Reform and Control Act of 1986; Meese, Edwin, III; Military spending; Reagan, Ronald; Reagan Democrats; Reagan Revolution; Reaganomics; Social Security reform.
■ Ordinary People
Identification
American film
Director
Robert Redford (1936-    )
Date
Released in 1980
Based on a 1976 novel by Judith Guest, this film depicts an upper-middle-class suburban American family that begins to disintegrate following the death of one son and the attempted suicide of the other.

Set in the affluent Chicago suburb of Lake Forest, the story centers on teenager Conrad Jarrett (played by Timothy Hutton), who has just returned home after spending time in a mental institution, where he was sent following a suicide attempt. Conrad's precarious emotional state has resulted from grief and feelings of guilt occasioned by his surviving the boating accident in which his older brother was killed. Both Conrad and his father, Calvin (Donald Sutherland), must navigate the tense domestic atmosphere created in the wake of the older son's tragic death.
Robert Redford displays the Academy Award he won for his direction of Ordinary People at the annual ceremony in March, 1981. (AP/Wide World Photos)
Both are troubled by the emotional detachment of Beth Jarrett (played by Mary Tyler Moore), the mother and wife, who retreats into the busyness of maintaining bourgeois decorum rather than confront the family's mounting dysfunction. Conrad's return to emotional health is abetted by his therapist, Dr. Berger (Judd Hirsch), but it is hampered by the cold demeanor of his mother.

The film is a portrait of late twentieth century middle-class American attitudes toward grieving, mental illness, adolescence, and divorce. Many critics hailed the film, praising the dialogue and well-developed characters. Some, however, found the film to be sentimental, observing that the Jarrett family were too conventional to be credible and that their mannered, WASP-y affluence was a stereotype and a convenient setup for inevitable breakdown in the face of domestic upheaval. The figure of Beth Jarrett has proved an interesting, if slight, subject for a few feminist critics.
The film, which was actor Robert Redford's directorial debut, proved very successful. It received several awards, including Academy Awards for Best Picture, Best Director, Best Supporting Actor for Hutton, and Best Adapted Screenplay for Alvin Sargent, as well as Oscar nominations for Moore and Hirsch.

Impact
The award-winning film adaptation of Ordinary People generated a renewed and sustained interest in Judith Guest's book, which gained popularity as an assigned reading in high schools and colleges. The novel continues to be read, particularly by young people, probably because of the appeal of its teenage protagonist. The film has also been credited with generating popular interest in the German Baroque composer Johann Pachelbel, whose Kanon in D was included in the sound track.

Further Reading
Aguiar, Sarah Appleton. The Bitch Is Back: Wicked Women in Literature. Carbondale: Southern Illinois University Press, 2001.
Canby, Vincent. Review of Ordinary People. The New York Times, September 19, 1980.
Maddocks, Melvin. "Suburban Furies." Time 108, no. 3 (July 19, 1976): 68, 70.
Szabo, Victoria, and Angela D. Jones. "The Uninvited Guest: Erasure of Women in Ordinary People." In Vision: Re-vision: Adapting Contemporary American Fiction by Women to Film, edited by Barbara Tepa Lupack. Bowling Green, Ohio: Bowling Green State University Popular Press, 1996.
Thad Cockrill

See also
Academy Awards; Feminism; Film in the United States; Literature in the United States; Teen films.
■ Organized crime
Definition
The illegal activities of groups of professional criminals
Shifting socioeconomic patterns and changing penal codes during the 1980’s brought about changes in operations for gangsters and law enforcement officials alike. By the 1980’s, American gangsters had built multimillion-dollar enterprises, and even as the federal government sought to find ways to prevent the
The Eighties in America
growth of organized crime, the depiction of various types of gangsters as hardworking men simply making a living in spite of the system glamorized the image of the American gangster. Modern depictions in film and popular song increased the gangster allure. The Italian American Mafia At the beginning of the decade, probably no faction of organized crime loomed larger in the American imagination than the Italian American Mafia, immortalized on screen in motion pictures such as director Francis Ford Coppola’s The Godfather (1972) and The Godfather Part II (1974).With its focus on honor and loyalty, the cinematic portrait of the Mafia portrayed the Mafia don as a family man and a shrewd businessman with numerous complicated decisions to make—an image to which many mainstream American moviegoers could relate. The Italian American Mafia was based primarily on the East Coast of the United States, particularly the New York City area, although the group’s members did operate in other locations, including Arkansas and Las Vegas. The Mafia created its flow of income from territorial monopolies on both illegal and legal businesses (the latter often funded by illegal operations). Small-business owners were routinely bullied into accepting “protection” from the Mob for a fee, risking their lives and their businesses if they refused. By the late twentieth century, the Mafia was heavily involved in waste management, construction, drug trafficking, racketeering, loan sharking, and murder for hire. By 1980, the American Mafia had undergone obvious changes. The older gangsters were dead, dying, or retired. Younger members were challenging the antiquated means of generating income and were more willing to become involved with the sale and distribution of street drugs, an enterprise previously looked down upon because of its association with African Americans. However, racketeering— obtaining money through the threat of violence— remained a major source of income for organized crime factions in the United States. The most serious threat to organized crime in the United States was the Racketeer Influenced and Corrupt Organizations Act, or RICO. The statute was passed in 1970 by Congress under the Organized Crime Control Act. RICO was intended to prevent any individual or organization from receiving or using income obtained from racketeering. The statute
The Eighties in America
allowed law enforcement agencies to prosecute several members of organized crime factions at once, for crimes committed over several years. Nicodemo "Little Nicky" Scarfo became the second Philadelphia Mob boss sentenced as a result of RICO, proving the vulnerability of organized crime to the new statute. However, in 1985, a Gambino family capo named John Gotti orchestrated the murder of "Big" Paul Castellano, head of the Gambino crime family, outside a New York City steak house. As a result, Gotti became the boss of the Gambino crime family.

Gotti's rise to power symbolized the changing of the American Mafia guard. In previous decades, the Mafia as a whole had striven to maintain a low public profile. With Gotti as leader, the American Mafia gained a positive public image in the American imagination. Nicknamed the "Teflon Don" (because authorities found it difficult to make criminal charges stick to him) and the "Dapper Don" (for his designer suits), Gotti was more of a public figure than almost any don before him.

While the Italian American Mob had garnered much attention from the U.S. federal government, other ethnic groups were not without their organized crime factions. Russian and Asian gangs had tremendous influence in their respective neighborhoods in a number of large American cities. However, particularly on the West Coast, African American gangs were growing in number.

African American Gangs
The primary African American gangs were the Crips and the Bloods. After the decline of the Black Panther Party for Self-Defense (which officially disbanded in 1982), these two gangs emerged as opponents in a war for territory in economically deprived neighborhoods. The groups also included Latino members. The groups were known by their colors: the Bloods for their red bandannas and handkerchiefs (later, T-shirts, shoes, and other articles of clothing could be worn to denote an individual's actual or implied allegiance with the group) and the Crips for their blue bandannas and other clothing items. Chiefly, the turf wars fought between the Crips and the Bloods focused on the sale and distribution of cocaine and crack, with each side claiming certain city blocks as its own locations from which to sell the drugs. The inner-city gangs thrived in the decade's changing socioeconomic conditions.
West Coast gangsters could move around the country, wherever they had friends and family, and start new factions of their groups. The gangs provided pseudofamilies in areas that suffered from large numbers of unemployed black males and fatherless households. Adolescent sons without father figures looked up to gang leaders who took them in. By meeting deprived youngsters' unmet needs, gang leaders created street soldiers committed to maintaining the group's territory. The burgeoning genre of music known as "gangsta rap" depicted inner-city gangsters as fearless men determined to make incredible amounts of money. Groups such as the Los Angeles-based N.W.A. illustrated the unfairness of life for African American inner-city youth, especially in terms of interactions with Los Angeles police officers.
Impact The gangster activities of the 1980's stimulated federal law enforcement efforts, resulting in a reduction in organized criminal activities. However, the romanticization and glorification of gang life in popular culture, from music to film, left a lasting legacy, creating a variety of American gangster images that proved attractive to moviegoers and music fans, especially teenagers.
Further Reading
De Stefano, George. An Offer We Can’t Refuse: The Mafia in the Mind of America. New York: Faber & Faber, 2006. Discusses the decline of the Italian American Mafia and traces the history and significance of the organization in relation to its American incarnation. Also provides coverage of the Italian American gangster in film and television. Gambetta, Diego. The Sicilian Mafia: The Business of Private Protection. Cambridge, Mass.: Harvard University Press, 1993. Details the roots of the American version of the Mafia through examination of the Sicilian original. Explores the reasoning behind the rules and actions of the Sicilian Mafia that influenced the American Mafia. Kelly, Robert J. Encyclopedia of Organized Crime in the United States. Westport, Conn.: Greenwood Press, 2000. Covers not only the Italian American Mafia but also the organized crime activities of other ethnic groups in the United States, such as Asians, Russians, and African Americans. Includes definitions of criminal justice terms. Dodie Marie Miller
See also African Americans; Drug Abuse Resistance Education (D.A.R.E.); Epic films; Film in the United States; Indian Gaming Regulatory Act of 1988; Latinos; Scorsese, Martin.
■ Osbourne, Ozzy Identification Heavy metal rock musician Born December 3, 1948; Birmingham, England
Osbourne’s heavy metal rock music as well as his outrageous, drug-induced antics made him a cult hero to his fans. Parents and Christian groups, however, claimed that he was a negative influence on youths. Born John Michael Osbourne to a working-class family, Ozzy Osbourne suffered a difficult early life. Chronically depressed and an alcoholic, Osbourne made several suicide attempts as early as age fourteen. At fifteen, he dropped out of school; two years later, he served two months in prison for burglary. He had some success as a singer and in 1967, with three other Birmingham youths, formed the Polka
Tulk Blues Band. The band was briefly renamed Earth before settling on Black Sabbath, which became one of the preeminent hard rock bands of the 1970's. Because of his unpredictable behavior, exacerbated by drugs, Osbourne was fired by Black Sabbath in 1979. His solo career took off when his first album, Blizzard of Ozz (1981), was certified platinum. CBS Records was eager to sign him to a long-term contract; however, during a contentious meeting, Osbourne, drunk and carrying two live doves as peace offerings, bit the head off one bird. The story escalated into tales of his eating live animals, and in January, 1982, at a concert in Des Moines, Iowa, fans threw various animals on stage. Mistaking a live bat for a rubber one, Osbourne bit off its head. On the same tour, he urinated on the Alamo. Such bizarre actions fueled his popularity, and the album he was promoting, Diary of a Madman (1981), was also certified platinum. Osbourne's popularity continued to grow. Enhanced by the electric guitar of Randy Rhoads, Osbourne's band competed with the number one heavy metal band, Van Halen.
Ozzy Osbourne. (Paul Natkin)
When Rhoads was killed in a plane crash on March 19, 1982, Osbourne fell deeper into drug and alcohol abuse. His marriage to Sharon Arden in July, 1982, helped him out of his depression and put him back on track, writing music, recording albums, and performing. In 1985, at the historic Live Aid concert in Philadelphia to benefit victims of famine in Africa, he performed once again with Black Sabbath. Throughout the 1980's, Osbourne battled drug and alcohol addiction but continued to produce albums, such as Bark at the Moon (1983), The Ultimate Sin (1986), and No Rest for the Wicked (1988); all reached platinum status. Because of his demeanor and the albums' content, his albums were condemned by the Moral Majority, and he was sued by three sets of parents who blamed his song "Suicide Solution" for the suicides of their sons.
Impact Osbourne became a legend in the 1980's. He was at the forefront of the heavy metal groups that wowed crowds and offended the sensibilities of the establishment. Osbourne continued to tour, organized the annual heavy metal tour Ozzfest beginning in 1996, and starred in MTV's popular television series The Osbournes, which premiered in 2002.
Further Reading
Clerk, Carol. Diary of a Madman: Ozzy Osbourne—The Stories Behind the Classic Songs. New York: Thunder's Mouth Press, 2002.
Fricke, David. "For Ozzy Osbourne, There Is Reality Television—and There Is Real Life." Rolling Stone, July 25, 2002, 62-67.
Marcia B. Dinneen
See also Guns n' Roses; Heavy metal; Live Aid; Mötley Crüe; MTV; Music; Music videos; Parental advisory stickers; Van Halen.
■ Ozone hole Definition
The depletion of stratospheric ozone by artificial chemicals
An international agreement to end the manufacture of chlorofluorocarbons was reached during the 1980's because of the chemicals' role in depleting ozone in the stratosphere. This depletion of ozone, which protects the earth from some types of ultraviolet radiation, was associated with an increased risk of skin cancer. Despite the agreement, the ozone layer was slow to heal.
Chlorofluorocarbons (CFCs) raised no environmental questions when they were first marketed by DuPont during the 1930's under the trade name Freon. CFCs were first manufactured in 1930, having been invented by Thomas Midgley, Jr., a chemist working for General Motors. By the early 1980's, manufacturers in the United States were producing more than 750 million pounds of CFCs a year, having found multiple uses for them: as propellants in aerosol sprays, as solvents used to clean silicon chips, in building and automobile air-conditioning systems, and in the manufacture of polystyrene cups, egg cartons, and containers for fast food. CFCs were cheap to manufacture, nontoxic, and nonflammable. By the time international agreement was reached during the late 1980's to end their legal manufacture, CFCs were a $28-billion-a-year industry and had been used in roughly 90 million U.S. car and truck air conditioners, 100 million refrigerators, 30 million freezers, and 45 million air conditioners in homes and other buildings.
CFCs and Ozone Depletion
During June, 1974, chemists F. Sherwood Rowland and Mario J. Molina reported in Nature that CFCs were migrating into the stratosphere and depleting the ozone column. This research, augmented during the 1980's, led to a Nobel Prize in Chemistry for Rowland and Molina in 1995 and resulted in DuPont's ceasing the manufacture of CFCs. Rowland and Molina's work was theoretical. During the mid-1980's, their theories were confirmed as scientists discovered that CFCs were, in fact, rapidly thinning the ozone layer over the Antarctic. The major human concern with ozone depletion is ozone's role in shielding flesh and blood from several frequencies of ultraviolet radiation. One of these frequencies, ultraviolet B, is strong enough to break the bonds of deoxyribonucleic acid (DNA) molecules, which carry the genetic coding of all living beings, including humans. While plants and animals are generally able to repair damaged DNA, on occasion the damaged DNA molecules can continue to replicate, leading to basal cell carcinoma, squamous cell carcinoma, and dangerous melanoma in humans. The probability that DNA will be damaged by ultraviolet light varies with wavelength, shorter wavelengths being the most dangerous. During 1985, a team of scientists working with the British Antarctic Survey reported a startling decline in "column ozone values" above an observation station near Halley Bay.
During the mid-1980's, the cause of the dramatic declines in ozone density over the Antarctic was open to debate. Some scientists suspected variability in the Sun's radiational output, and others suspected changes in atmospheric circulation. A growing minority began to suspect CFCs. In 1987, dozens of national governments signed the Montreal Protocol, committing to phase out CFCs. Definite proof of the role of CFCs in ozone depletion arrived shortly thereafter, as J. G. Anderson and colleagues implicated the chemistry of chlorine, explaining the chain of chemical reactions (later broadened to include bromine compounds in a lesser role) that constituted the "smoking gun": it accounted for why ozone depletion was so sharp and so limited to specific geographic areas at a specific time of the year.
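The heart of that chlorine chemistry is a catalytic cycle; the reactions below are a standard textbook summary of stratospheric ozone destruction rather than a quotation from Anderson's papers:

$$\mathrm{Cl} + \mathrm{O}_3 \rightarrow \mathrm{ClO} + \mathrm{O}_2$$
$$\mathrm{ClO} + \mathrm{O} \rightarrow \mathrm{Cl} + \mathrm{O}_2$$
$$\text{net: } \mathrm{O}_3 + \mathrm{O} \rightarrow 2\,\mathrm{O}_2$$

Because the chlorine atom emerges from the cycle unchanged, a single atom freed from a CFC molecule can, in principle, destroy many thousands of ozone molecules before being locked away in a less reactive reservoir compound.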
Impact Despite the Montreal Protocol, ozone depletion over the Arctic and Antarctic accelerated during the following years, exacerbated at least in part by rising levels of greenhouse gases (carbon dioxide, methane, and others) near the Earth's surface. In an act of atmospheric irony, warming near the surface causes the ozone-bearing stratosphere to cool significantly. The atmosphere's existing cargo of CFCs (with a lifetime of up to a century) destroys more ozone as the stratosphere becomes colder. An increasing level of carbon dioxide near the Earth's surface acts as a blanket, trapping heat that would otherwise have radiated through the stratosphere into space. The influence of global warming near the surface on declining stratospheric temperatures has continued, as loss of ozone over Antarctica reached new records after 1989. Thus, until humanity reduces its emissions of greenhouse gases, ozone depletion will remain a problem long after most CFC production has ceased.
Subsequent Events By the year 2000, the ozone-depleted area over Antarctica had grown, at its maximum extent, to an area two-thirds the size of Africa. While the polar reaches of the Earth have suffered the most dramatic declines in ozone density, ozone levels over most of the planet have declined roughly 15 percent since the mid-1980's. Late in September, 2006, the World Meteorological Organization reported that the ozone hole had expanded to 10.6 million square miles (28 million square kilometers), larger than the previous record extent, set during 2000. This area of depleted ozone was larger than the surface area of North America.
Further Reading
Aldhous, Peter. "Global Warming Could Be Bad News for Arctic Ozone Layer." Nature 404 (April 6, 2000): 531. Examines the possibility that ozone depletion could increase in the Arctic.
Austin, J., N. Butchart, and K. P. Shine. "Possibility of an Arctic Ozone Hole in a Doubled-CO2 Climate." Nature 414 (November 19, 2001): 221-225. Addresses the relationship of global warming to stratospheric ozone depletion.
Hartmann, Dennis L., et al. "Can Ozone Depletion and Global Warming Interact to Produce Rapid Climate Change?" Proceedings of the National Academy of Sciences of the United States of America 97, no. 4 (February 15, 2000): 1412-1417. Covers the relationship of ozone depletion to global warming.
Rowland, F. Sherwood, and Mario Molina. "Stratospheric Sink for Chlorofluoromethanes: Chlorine Atom-Catalyzed Destruction of Ozone." Nature 249 (June 28, 1974): 810-812. The seminal scientific article establishing, in theory, that CFCs could deplete the ozone layer.
Bruce E. Johansen
See also Air pollution; Environmental movement; Science and technology.
P
■ Pac-Man
Identification Video game and character
Date Released as an arcade game in the United States in 1980
Initially a coin-operated video arcade game, Pac-Man broke the usually violent mold of arcade games, appealing to dedicated and casual gamers alike. The game grew into a licensing franchise, as other games, merchandise, and even a television cartoon series featuring its title character appeared during the early 1980's. Pac-Man was a U.S. variation on a Japanese video game called Puck-Man. It became a classic by stressing nonviolent action, humor, and the "personality" of its main character, a bright yellow, dot-gobbling circle. After it was released in the United States in 1980, Pac-Man quickly became extremely popular. In a triumph of merchandising, the property expanded to include dozens of licensed spin-offs, including nonvideo games. The original Puck-Man was developed by Toru Iwatani for the Japanese firm Namco and released in Japan in 1979. The game was then licensed and distributed in the United States by Bally's Midway division. In the game, Pac-Man had an insatiable hunger for dots and a fear of ghosts, the onscreen enemies Blinky, Inky, Pinky, and Clyde. Players controlled Pac-Man, navigating a maze while munching dots and avoiding ghosts. Pac-Man could also swallow power pills—which would temporarily enable him to eat the ghosts—and fruits, which were worth bonus points. When all the dots and power pills were consumed, the player would progress to the next, more difficult, level.
At the time Pac-Man was introduced, most other arcade games involved either killing enemies or destroying objects with weapons—usually in outer space. Pac-Man was largely nonviolent. Even when the title character ate a ghost, the ghost was not destroyed. Instead, its impervious eyes would float back home, where it would then regrow its body. By inventing a new model for video games, Pac-Man was able to appeal to both women and men, growing the arcade-game market. Pac-Man sold more than 350,000 arcade units in the 1980's, dethroning leading games of the era, such as Space Invaders and Asteroids, and it endured through an industry slump in the middle of the decade. As the best-known arcade game, Pac-Man was ported to many other video-game platforms, including home game consoles, handheld games, and personal computers. Pac-Man also spawned sequels, such as Ms. Pac-Man, Pac-Man Plus, and Baby Pac-Man.
Pac-Man video game. (Ullstein Bild)
While most were not successful, Ms. Pac-Man achieved a level of success and cultural recognition worthy of the original. In addition to board, card, and video games, licensed Pac-Man products included toys, clothes, chalkboards, pillows, erasers, bubble pipes, costumes, shower curtains, pens, jewelry, lunchboxes, bumper stickers, and books. The game also inspired a 1982 hit single (Jerry Buckner and Gary Garcia's "Pac-Man Fever") and a Hanna-Barbera cartoon show, featuring Marty Ingels as the voice of Pac-Man, which ran on the American Broadcasting Company (ABC) from 1982 to 1984. A motion picture was planned but never filmed.
Impact Selling hundreds of thousands of arcade units, Pac-Man injected humor into video games and minimized the violence common to many games of the 1980's. The popularity of the Pac-Man character also proved decisive to the history of video games. In later years, companies would discover that iconic franchise characters—such as Pac-Man, Mario, or Sonic the Hedgehog—were essential, not only for merchandising but also to drive sales of new games and new consoles.
Further Reading
DeMaria, Rusel, and Johnny L. Wilson. High Score! The Illustrated History of Electronic Games. San Francisco: McGraw-Hill Osborne Media, 2003.
Johnson, Rick. "What's Round and Yellow and Laughs All the Way to the Bank?" VIDIOT, February/March, 1983.
Kohler, Chris. Power-Up: How Japanese Video Games Gave the World an Extra Life. Indianapolis: BradyGames/Penguin Group, 2004.
Bill Knight
See also Fads; Japan and North America; Toys and games; Video games and arcades.
■ Pan Am Flight 103 bombing The Event Terrorist attack on a civilian airliner Date December 21, 1988 Place Lockerbie, Scotland
Following a terrorist attack on a civilian jet that killed 259 passengers and crew and 11 people on the ground, airline safety had to be tightened considerably, and a long investigation and legal process ensued.
At 6:25 p.m., Pan Am Flight 103 departed London Heathrow Airport bound for New York. It had on board 243 passengers, 47 of whom had transferred from a feeder flight from Frankfurt, Germany, as well as 16 crew members. It crossed the England-Scotland border at 7:00 p.m. At 7:02 p.m., Scottish Air Traffic Control at Prestwick lost the plane from its radar screens. The airliner had exploded over the small town of Lockerbie in southern Scotland. The explosion was so massive and so sudden that the plane broke apart into thousands of fragments, which scattered over a hundred square miles. The wings fell on a small row of houses, demolishing them and killing eleven people as the aviation fuel went up in a huge fireball.
The Investigation Soldiers were deployed to locate the plane's fragments and search for bodies. After forensic testing at the scene, the bodies were taken to a makeshift mortuary at the town hall before being released to relatives. The pieces of the plane were taken to nearby Longtown, where they were reassembled to reveal a large hole in the forward baggage hold under the cockpit. Forensic tests undertaken at the Royal Armaments Research Establishment at Kew found traces of Semtex, a plastic explosive, as well as a timer packed in a radio inside a suitcase. Thus, within a few days, terrorism was seen as the likely cause. The local Scottish police force was given charge of the investigation, with help from the U.S. Federal Bureau of Investigation (FBI) and other agencies as needed.
The Suspects
Middle Eastern groups were immediately suspected, especially a faction of the Popular Front for the Liberation of Palestine led by Ahmed Jibril and based in Damascus, Syria. Another suspect was the Abu Nidal Organization, based in Libya. A number of revenge motives were proposed, including the 1986 bombing raid on Libya by the U.S. military—which had killed an adopted daughter of the Libyan leader, Colonel Muammar al-Qaddafi—and the accidental downing of an Iranian passenger jet by the USS Vincennes in the Persian Gulf in July, 1988. Frankfurt was where one such terrorist group had a cell that had been responsible for the 1986 bombing of a West Berlin nightclub that killed two Americans. West German police had discovered bomb-making equipment there and arrested a number of suspects.
It was thought that a member of one of the suspected terrorist groups could have checked in an item of baggage at Frankfurt, then left the plane before it took off from London. However, no immediate leads were forthcoming, and the police investigation had to begin sifting through the evidence at the crime scene, including thousands of items of clothing and suitcases.
Impact The majority of the victims were American; thirty-five of them were students from Syracuse University in New York. Many of the victims' relatives came to Lockerbie to attend a memorial service held on January 4, 1989, staying with local townspeople, some of whom had been made homeless themselves. Later, townspeople collected and cleaned the personal items found and returned them to the relatives. Many friendships were formed. It was also immediately clear that airport security must have been lax to allow explosives and unattended baggage to pass checkpoints. A warning message delivered to the U.S. embassy in Helsinki, Finland, had not been circulated as widely as it should have been. In September, 1989, U.S. president George H. W. Bush established a commission to investigate aviation security and its relation to the bombing. Tighter security was ordered immediately.
Subsequent Events The commission reported back on May 15, 1990, detailing many faults in the security system as operated and recommending the tightening of other procedures. The airline and the Federal Aviation Administration (FAA) were blamed; subsequent claims for compensation against the airline were successful. Pan Am filed for bankruptcy in 1991 and ceased operations by the end of that year, partly as a result of the attack. The police investigation led eventually to Libya and the issuing of arrest warrants for two Libyans, both working for Libyan Arab Airlines. One of them, Abdulbaset Ali Mohmed al-Megrahi, a Libyan secret agent, was found guilty in 2001, after an international trial was finally convened at Camp Zeist, the Netherlands. The other, Al Amin Khalifa Fhimah, was found not guilty. For most of the 1990's, Libya had refused to hand the suspects over, and U.N. sanctions were imposed. After Megrahi's appeal was turned down in 2002, Libya paid the victims' families compensation of $8 million for each victim. However, a further appeal is still being considered. Many of the families feel that there was deliberate government obstruction of their efforts for compensation, and some legal experts believe that the
trial was flawed. In such circumstances, conspiracy theories abound, and the case is by no means closed. In 1995, President Bill Clinton dedicated a memorial cairn to the victims in Arlington National Cemetery. Syracuse University set up a scholarship fund for students of Lockerbie Academy and holds an annual service of remembrance. Memorials of various sorts have been created in Lockerbie and at Syracuse.
Further Reading
Cohen, Daniel, and Susan Cohen. Pan Am 103: The Bombing, the Betrayals, and a Bereaved Family's Search for Justice. Rev. ed. New York: New American Library, 2000. The Cohens are one of a number of families who have struggled for justice and compensation. This is their account.
Cox, Matthew, and Tom Foster. Their Darkest Day: The Tragedy of Pan Am 103, and Its Legacy of Hope. Berkeley, Calif.: Grove Press, 1992. Includes a number of profiles of victims and their families, as well as of the investigators. Also examines the airline's failures.
Crawford, John. The Lockerbie Incident: A Detective Tale. Victoria, B.C.: Trafford, 2006. One of the fullest accounts of the investigation of the incident.
David Barratt
See also Air India Flight 182 bombing; Libya bombing; Middle East and North America; Terrorism; USS Vincennes incident; West Berlin discotheque bombing.
■ Panama invasion The Event
The United States invaded Panama and overthrew the Panamanian government
Date December, 1989
Place Panama
The 1989 invasion of Panama by the United States was one of the most significant acts of the George H. W. Bush presidency. This invasion, known as Operation Just Cause, was touched off by the worsening state of relations between the two nations. General Manuel Noriega was the military dictator of Panama in the late 1980's. Often referred to by the American media as a "strongman," he was heavily involved in the drug trade, helping traffickers move drugs through Panama and into the United States, among other nations.
Flames erupt as a result of combat between U.S. and Panamanian troops during the invasion of Panama in December, 1989. (U.S. Department of Defense)
Noriega had retained power despite the results of two presidential elections. When it became clear that his candidate would lose the election of 1984, Noriega halted the count long enough to manipulate the tally. In 1989, unable to rig another election before the accurate results were made public, Noriega invalidated those results and appointed an acting president of his choosing. On the basis of the 1989 election results, however, the United States recognized Guillermo Endara as the president of Panama. Threats were made against American interests in the Panama Canal Zone, and finally one United States soldier was killed in Panama City on December 16, 1989. While the soldier may have been in Panama City on a mission to provoke an incident that would facilitate American military action, his death still caused major problems between the two nations. It was at this point that President George H. W. Bush decided that it was time for action against Panama.
Bush had a variety of reasons for the invasion, including protecting Americans in Panama, halting Panamanian drug trafficking, protecting U.S. positions in the Canal Zone, and helping restore democracy in Panama. In addition, Noriega would later claim that Oliver North had asked him to provide military aid to the Nicaraguan Contras and that his refusal to do so was a precipitating cause of the invasion.
The Invasion Begins On December 20, 1989, the invasion of Panama began. Operation Just Cause was carried out by a variety of units of the U.S. armed forces, including parts of the Eighty-second Airborne Division, the Seventy-fifth Ranger Regiment, the Seventh Infantry Division, the Fifth Infantry Division, and some Marine units. Nearly twenty-eight thousand troops were involved in all, supported by the U.S. Air Force. Their goals were to neutralize the Panamanian Defense Force, secure the country, and capture Manuel Noriega. Most of the military action that took place during the invasion occurred in and around the capital, Panama City.
Quickly, the U.S. forces involved in the operation moved to secure the Punta Paitilla Airport, another airfield at Rio Hato, and the central headquarters of the Panamanian Defense Force, known as La Comandancia. The fighting to secure these sites caused fires throughout Panama City that destroyed some heavily populated sections of the city, leaving many civilians homeless. It took several days for military operations to come to an end. While the Panamanian Defense Force very quickly collapsed, many individual Noriega loyalists continued the fight. Also, because of the breakdown of the Panamanian government and the fires sweeping parts of the city, lawlessness broke out, and there was a great deal of looting and vandalism that needed to be contained.
The Hunt for Noriega While U.S. forces did secure the capital, they failed to capture Noriega. The dictator had been able to avoid capture by the United States, but with military forces trying to track him down and a one-million-dollar reward being offered for his capture, he was running out of options. Finally, he sought refuge in the Vatican embassy in Panama City, hoping to gain asylum from the Vatican or another country that might take him in. The United States had no intention of letting Noriega slip through its fingers; he and his drug-trafficking ties were the main reasons for the invasion. The Army therefore began a vigil outside the Vatican embassy that soon became a media spectacle. The Americans demanded that Noriega surrender, and to keep the pressure up, they blared loud rock music from specially equipped trucks parked outside the embassy day and night. The U.S. government also put a great deal of diplomatic pressure on the Vatican to turn over Noriega. Noriega finally surrendered, turning himself over to U.S. forces on January 3, 1990. He was immediately removed from Panama and brought to the United States to stand trial on a variety of drug-related charges.
Impact The United States achieved all of its goals during Operation Just Cause.
U.S. soldiers removed Noriega as a cog in drug-trafficking operations in Latin America, restored democracy in Panama, and protected American interests in the area. The invasion, however, did nothing to help the U.S. reputation in the region, as many Latin American countries saw the action as one more example in a long history of U.S. interference and domination. The almost two-week-long spree of looting and vandalism that the United States allowed to continue contributed to this negative reaction among the nations of Central and South America. The invasion thus further soured already fraught diplomatic relations between the United States and many Latin American countries.
Subsequent Events Manuel Noriega was convicted of drug trafficking and related offenses in 1992. He was sentenced to forty years' imprisonment (later reduced to thirty) and sent to a U.S. federal prison in Miami, Florida, to serve out his sentence.
Further Reading
Albert, Steve. The Case Against the General: Manuel Noriega and the Politics of American Justice. New York: Scribner, 1993. Fascinating look at the ins and outs of the case against Noriega that formed the basis for the United States' invasion of Panama; gives readers an inside look at the process leading to war.
Gilboa, Eytan. "The Panama Invasion Revisited: Lessons for the Use of Force in the Post Cold War Era." Political Science Quarterly 110 (Winter, 1995-1996): 539-562. Overview of the effects of using military force to achieve foreign policy goals. Allows readers to see both sides to the use of force in Panama.
Musicant, Ivan. The Banana Wars: A History of United States Military Intervention in Latin America from the Spanish-American War to the Invasion of Panama. New York: Macmillan, 1990. Outlines all of the interventions of the U.S. military in Latin America, giving the historical background to the invasion of Panama.
Michael S. Frawley
See also Bush, George H. W.; Foreign policy of the United States; Iran-Contra affair; Latin America.
■ Parental advisory stickers Identification
Warnings of potentially objectionable content, affixed to recorded music
In response to pressure from the Parents' Music Resource Center and Congress, the Recording Industry Association of America agreed to label recorded materials that contained excessive violence, strong language, or sexually explicit lyrics. In 1984, Tipper Gore, the wife of Senator Al Gore of Tennessee, purchased Prince's Purple Rain sound track for the couple's daughter Karenna. Upon listening to the sexually explicit lyrics to "Darling Nikki," which included references to masturbation, Tipper Gore decided that parents should be provided with tools that would help them in making informed decisions about the music they purchased for their children. Gore enlisted the help of Susan Baker (wife of Treasury Secretary James Baker), Pam Howar (wife of Washington Realtor Raymond Howar), and Sally Nevius (wife of Washington City Council Chairman John Nevius). Together, the four women founded the Parents' Music Resource Center (PMRC) in May, 1985. They also became known as the Washington Wives. Believing that lyrics promoting drug abuse, sexual promiscuity, and violence were contributing factors in the violence and drug addiction facing the United States, the PMRC sought to work with the recording industry to create and implement a rating system for music similar to that adopted by the Motion Picture Association of America (MPAA) for feature films. The PMRC wanted any album that contained offensive lyrics or graphic cover art to be labeled as such with a parental advisory warning sticker, so parents would have the tools necessary to make educated choices as to what their children listened to and purchased. The PMRC tried working with the Recording Industry Association of America (RIAA) to establish a voluntary rating system to use for the parental advisory stickers, but at first it received little cooperation. The Washington Wives published a manifesto in The Washington Post listing their demands. In addition to demanding a rating system, the manifesto stated that warnings and song lyrics should be printed directly on album covers, albums with suggestive covers should be kept under counters, television and cable broadcasters should stop airing sexually explicit or violent music videos,
the contracts of artists who were found to be "offensive" should be reviewed by their record labels, and an independent panel should be created to enact and enforce all these requirements.
The Senate Hearing On September 19, 1985, the PMRC and representatives of the music industry testified in front of the Senate Committee on Commerce, Science, and Transportation about the potential benefits and dangers a rating system could have for society and music. Senator Paula Hawkins, Tipper Gore, Senator Al Gore, Susan Baker, Millie Waterman (vice president for legislative activities of the National Parent Teacher Association), Professor Joe Stuessy of the University of Texas at San Antonio, and child and adolescent psychiatrist Paul King urged the committee to take seriously the dangers they believed to be posed by modern rock music, especially rap and heavy metal. They cited explicit covers and lyrics and expanded on the effects that music referring to violence, drug usage, and sex had upon children and society, arguing that the increased occurrence and acceptance of these themes was linked with rises in actual violence. Among those arguing against the PMRC were avant-garde musician Frank Zappa, folk musician John Denver, and heavy metal front man Dee Snider of Twisted Sister. They asserted that the goal of the PMRC was censorship, which they maintained should not be tolerated in the United States or any other country. They questioned the causal link the PMRC asserted between children hearing descriptions of violence and then acting violently. Further, they connected the goals of the PMRC to the passage of other bills, including H.R. 2911, a proposed tax on blank audiocassettes. They also pointed out a conflict of interest, as several PMRC members' husbands sat on the committee overseeing the hearing. Finally, the opposition insisted that the proposed federally mandated parental advisory stickers would infringe upon the civil liberties of Americans who were not minors. They also claimed that any type of censorship or rating system would ultimately have the opposite effect of its intended purpose: warning stickers would increase children's desire to hear or view the forbidden material.
Outcome On November 1, 1985, before the Senate hearings were completed, the RIAA agreed to place voluntary stickers on albums it deemed indecent or inappropriate for minors.
All such stickers—which came to be nicknamed Tipper Stickers—would read simply, "Parental Advisory: Explicit Lyrics." There was to be no indication of why the content was considered to be "explicit." Albums' lyrics, contrary to the original demand, would not be printed on their covers. Major record labels were allowed to determine which albums would receive the sticker, while independent record labels did not have to use the sticker at all.
Impact Many major retailers, including Wal-Mart, Sears, and JCPenney, refused to carry any album that displayed the parental advisory sticker, and other stores refused to sell any recordings bearing the sticker to minors. In reaction to the boycott, many record companies began releasing "clean" versions of labeled albums that did not bear the sticker and could be sold anywhere. Ultimately, the effects of the sticker are debatable. While the parental advisory sticker does alert parents to a potential problem, it does so in such a vague way that further investigation is warranted, and not all parents are willing to expend the time to determine why each album has been labeled. It is unclear how many people actually adhere to the warnings displayed on the sticker. Many critics argue that the sticker has increased albums' popularity precisely by making them taboo.
Further Reading
Carroll, Andrew, R. Torricelli, and D. Goodwin, eds. In Our Words: Extraordinary Speeches of the American Century. New York: Washington Square Press, 2000. Includes excerpts from speeches by PMRC members and their opponents.
Croteau, David R., and William Hoynes. Media/Society: Industries, Images, and Audiences. Thousand Oaks, Calif.: Pine Forge Press, 2002. Thorough look at who creates media, how media products are sold, and their effects upon society.
Grossberg, Lawrence, Ellen Wartella, and D. Charles Whitney. Mediamaking: Mass Media in a Popular Culture. Thousand Oaks, Calif.: Sage, 2006. Includes a chronological history of the founding and lifespan of the PMRC.
Sara Vidar
See also Crime; Drug Abuse Resistance Education (D.A.R.E.); Heavy metal; Hip-hop and rap; MTV; Music; Music videos; Pop music; Pornography.
■ Pauley, Jane Identification American television journalist Born October 31, 1950; Indianapolis, Indiana
A wholesome appearance and earnest, likable demeanor made Pauley an unusually popular broadcast journalist and an inspiration to women in 1980's America. Jane Pauley helped pave the way for the acceptance of diversity in television journalism. Pauley, unlike most of her female predecessors, did not try to emulate the journalistic style of her male colleagues and thus provided an alternative role model for women within television news programs, talk shows, and other media venues. Pauley rose from obscurity in 1976, when she was hired by the National Broadcasting Company (NBC) as cohost with Tom Brokaw of The Today Show, an early-morning talk and news program. Pauley was twenty-five years old. The position had been most recently held by Barbara Walters, a woman so experienced and successful in television news reporting that she was touted as a superstar journalist. NBC's decision to hire Pauley to replace Walters was met with heavy criticism from industry insiders. Critics cited her youth, relative inexperience in broadcast media, cheery personality, and lack of an aggressive style. The hiring decision was crucial, because The Today Show was an important component of NBC's programming schedule. Like all morning talk shows of the 1980's, The Today Show featured both "hard" and "soft" news, combining interviews with world leaders, authors, and other notables with reporting on world and national news and human-interest stories. The show was in intense competition with other morning news shows vying for viewer ratings; chief among these rivals was Good Morning America on the American Broadcasting Company (ABC). Pauley remained a regular personality and cohost of The Today Show for thirteen years, from 1976 to 1989. Although her early years were marked by pressure to measure up to women in broadcast media who were considered to be more "serious" journalists—such as Walters, Linda Ellerbee, Diane Sawyer, and Betty Rollin—Pauley was a huge hit with American viewers. During only her second week on the show, Pauley helped it achieve its best viewer ratings in six months.
Jane Pauley hosts The Today Show with Chris Wallace, left, and Bryant Gumbel in December, 1981. (AP/Wide World Photos)
Her down-to-earth delivery, wholesome good looks, and midwestern values and demeanor were very popular with the public, and the show shot to the top of the ratings. Its ratings dipped when Pauley took time off to give birth to twins with husband Garry Trudeau; they went up again when she returned. Her new cohost was Bryant Gumbel, and the pair inspired a devoted following from their morning audience. In spite of this popularity, in 1989, at the age of thirty-eight, Pauley was replaced by a younger woman, thirty-one-year-old Deborah Norville.
Impact Jane Pauley changed the way the public and the media industry of 1980's America perceived female television journalists. Her professional example and personal style helped women in television journalism and in the general workplace realize that a woman need not define herself or her career by an exclusively male standard in order to achieve success.
Further Reading
Chambers, Deborah, Linda Steiner, and Carole Fleming. Women and Journalism. London: Routledge, 2004.
Pauley, Jane. Skywriting: A Life Out of the Blue. New York: Random House, 2004.
Twyla R. Wells
See also Brokaw, Tom; Jennings, Peter; Journalism; Rather, Dan; Television; Women in the workforce.
■ Pei, I. M.
Identification Chinese American architect Born April 26, 1917; Guangzhou, China
With his clarity of vision, purity of geometric form, and utilization of innovative materials, Pei transformed the skylines of cities across America. The 1980’s provided I. M. Pei with the opportunity to explore exciting design concepts and to utilize new materials. Farsighted American entrepreneurs and civic leaders awarded Pei commissions that gave
free rein to his innovative creativity. Within this atmosphere of big business and big spending, Pei led American architecture out of the rectangular box of the International Style, which had been in place since the 1940's. In the early 1980's, Pei revolutionized the museum experience with his design for the west wing of the Museum of Fine Arts in Boston (1981 and 1986). He connected the existing Beaux Arts building to a new geometric gallery space with a two-hundred-foot-long, barrel-vaulted glass galleria that included restaurants, an auditorium, and a museum store. In the same spirit of social innovation, Pei transformed the aging urban center of Denver, Colorado, with his plan for the Sixteenth Street Mall (1982).
The mall was intended to transform downtown Denver from a nine-to-five business area into a multipurpose, all-hours professional and social gathering place. Rekindling the energy and excitement of the urban centers of America, Pei designed the Jacob K. Javits Convention Center in New York City (1986) and the Morton H. Meyerson Symphony Center in Dallas, Texas (1989). Both structures were exercises in the potential of glass to create seamless sculptural transitions between interior and exterior space. Pei applied his social transformations to office architecture as well. For decades, office buildings had been rectangular structures filled with square offices. Pei broke that mold by introducing new shapes and new materials that invigorated the urban environment. Buildings such as his one-thousand-foot-tall J. P. Morgan Chase Tower in Houston, Texas (1982), and the Energy Plaza in Dallas (1983) broke out of the rectangular box, introducing new shapes and new ways of working within the business environment. Perhaps Pei's boldest design was that of Fountain Place in Dallas (1986), a 720-foot-high, ten-sided, faceted prism of green reflective glass and steel that appeared to slice through the air. Each of these structures was a superb example of the profound power of geometric form; the sensual potential of glass, metal, concrete, and stone; and the intellectual appeal of purely rational design.
Impact I. M. Pei's striking designs and use of innovative materials redefined the skylines of the United States' great cities and served as the model of a new American modernism, stimulating new ways of thinking and reenergizing urban America. As the workplace and urban environment were altered by Pei's revolutionary designs, so were the ways in which people worked and related within them.
Further Reading
Cannell, Michael. I. M. Pei: Mandarin of Modernism. New York: Clarkson Potter, 1995.
Wiseman, Carter. The Architecture of I. M. Pei: With an Illustrated Catalogue of Buildings and Projects. New York: Thames & Hudson, 2001.
_______. I. M. Pei: A Profile in American Architecture. New York: Harry N. Abrams, 2001.
Sonia Sorrell
Architect I. M. Pei poses in his New York office in 1981. (AP/Wide World Photos)
See also Architecture; Art movements; Asian Americans.
■ Peller, Clara Identification
American television-commercial actor
Born August 4, 1902; Chicago, Illinois
Died August 11, 1987; Chicago, Illinois
Peller was featured in a remarkably successful marketing campaign for the Wendy's fast-food franchise in the 1980's. Her trademark line, "Where's the beef?," became a catchphrase and one of the most frequently invoked popular culture slogans of the decade. Dancer, Fitzgerald, Sample—a prestigious Madison Avenue advertising agency—came up with a promotional campaign for the Wendy's fast-food hamburger restaurants that made relatively unknown actress Clara Peller the unlikely media sensation of the year in 1984. Peller was tiny—only four feet, eleven inches tall—and eighty-one years old when the first commercial in the series debuted nationally on January 10, 1984. She was presented as grumpy, angry, and outraged as she kept searching for a hamburger patty big enough to satisfy her and for a fast-food restaurant that delivered the kind of service she expected. Throughout the series, Peller was shown holding a plate with an enormous bun, on which sat an extremely small piece of meat, as she demanded, "Where's the beef?" Americans fell in love with her. The fast-paced, tongue-in-cheek tone of the commercials increased their appeal, as did the opportunities for double entendres and sexual allusions to be made when people throughout the media and across the country began appropriating Peller's query and repeating it themselves. Stand-up comedians, T-shirt manufacturers, and other merchandisers, including one promoting a line of underwear, cashed in on the sudden popularity of Peller, asking, "Where's the beef?" Wendy's also took advantage of Peller's fifteen minutes of fame by quickly cross-marketing a single, released as a 45 rpm (revolutions per minute) record on the Awesome label. The recording featured Nashville country music deejay Coyote McCloud performing his musical composition "Where's the Beef?" with vocals by Peller. Peller also made an appearance on the popular late-night comedy sketch show Saturday Night Live. Her influence extended even to politics: During the 1984 Democratic presidential primary campaign, Walter Mondale asked "Where's the beef?" to dismiss Senator Gary Hart's
The Eighties in America
vague campaign theme of “new ideas,” a strategy that gained plenty of commentary from the media and ultimately worked to Mondale’s advantage in securing the nomination. Peller herself had no success moving beyond being typecast as the “Where’s the beef?” lady. In 1985, she made a commercial for Prego spaghetti sauce in which she seemed to answer her own question as to the beef’s location. Looking at Prego’s meat sauce, Peller exclaimed, “I found it!” As a result, Wendy’s terminated her contract. She had two very minor roles in Moving Violations (1985) and The Stuff (1985), both critically panned films, and in both cases she merely alluded to the persona she had created for Wendy’s. She also made a personal appearance, along with a slew of other major and minor media celebrities, in Vince McMahon’s Wrestlemania 2 in 1986.
Clara Peller stares at a tiny hamburger and wonders, “Where’s the beef?” in 1984. (AP/Wide World Photos)
Impact The playful commercials featuring Peller offered an endearing portrait of a determined octogenarian who, despite her size and gender, was not going to let anyone shortchange her. They combined a pervasive 1980's cultural focus on demanding personal satisfaction with the suggestion that contemporary institutions were not to be trusted.
Further Reading
Sullivan, Luke. Hey, Whipple, Squeeze This: A Guide to Creating Great Ads. 2d ed. Hoboken, N.J.: John Wiley & Sons, 2003.
Twitchell, James B. Adcult USA. New York: Columbia University Press, 1997.
Scot M. Guenter
See also Advertising; Elections in the United States, 1984; Fads; Food trends; Hart, Gary; Mondale, Walter; Slang and slogans; Television.
■ People’s Court, The Identification American television program Date Original run aired from 1981 to 1993
The People’s Court was the first popular court show featuring actual litigants in small-claims court. The People’s Court premiered on September 12, 1981, featuring Judge Joseph A. Wapner as he presided over an actual small-claims civil court case. The show was syndicated to air daily on television stations across the country. The program’s researchers culled interesting cases from actual filings in California’s small-claims court. The litigants agreed, prior to appearing on the show, to drop their lawsuits and abide by the program’s binding arbitration in exchange for their appearance. Their lawsuits were originally limited to $1,500, as this was then the limit of smallclaims decisions in California. However, as the state raised small-claims maximums, so did the show. By the time its initial run ended in 1993, litigants could claim up to $5,000 in damages. Both parties were paid to appear on the program, and when Wapner voted a monetary verdict, the losing party forfeited that amount of the payment. Claimants were shown into the courtroom, and plaintiffs and defendants had a chance to present their cases to the judge. After hearing both sides, Wapner retired to chambers to consider his deci-
People’s Court, The
■
753
He could support the plaintiff, the defendant, or, as he sometimes chose, neither party. Litigants filed suits and countersuits, so that sometimes each party was both defendant and plaintiff. Judge Wapner was joined by a regular crew, including bailiff Rusty Burrell and reporter Doug Llewelyn, who also hosted the show, introducing the cases and providing the wrap-up at the end. Each case generally took about fifteen minutes, or half of the allotted program time. Rare cases took the entire thirty minutes. If there was time at the end, Wapner might address legal questions from the audience, or legal consultant Harvey Levin might offer advice on some legal quagmire.
Impact Melding talk shows with courtroom dramas, The People's Court enjoyed such popularity that it spawned a television subgenre, as the American public's appetite for real-life legal drama proved to be great enough to sustain several shows. Unlike its fictional counterparts, The People's Court did not focus on the judges or the lawyers but instead concentrated on cases and the people presenting them. It was also one of the first programs to institute audience polls. In some controversial cases, Llewelyn would tell viewers how the studio audience would have decided the case after Wapner had rendered his verdict. Llewelyn ended each case by reminding viewers, "Don't take the law into your own hands: You take 'em to court," and viewers tuned in daily for more than a decade to see who was suing whom.
Subsequent Events Four years after the show's original run ended, it was revived in an hourlong format starting in 1997. The first judge to preside in the new show was former mayor of New York City Ed Koch.
Further Reading
Cohn, Marjorie, and David Dow. Cameras in the Courtroom: Television and the Pursuit of Justice. Jefferson, N.C.: McFarland, 1998.
Kammen, Michael. American Culture, American Tastes: Social Change and the Twentieth Century. New York: Basic Books, 2000.
Meyer, Jon'a, and Paul Jesilow. "Doing Justice" in the People's Court: Sentencing by Municipal Court Judges. Albany: State University of New York Press, 1997.
Jessie Bishop Powell
See also Crime; Supreme Court decisions; Television.
■ Performance art Definition
Art form that incorporates live performance alongside other aesthetic modes
Performance art continued some of the practices of earlier generations in the 1980's. However, young artists, reared in a culture saturated with media, adopted new, sometimes marketable forms. As a result, performance art sometimes blurred together with video art, cabaret, and mass cultural forms.
The Eighties in America Performance Art in the 1980’s Unlike the artists of earlier generations, who had created performance art by rejecting the confines of traditional arts, artists of the 1980’s frequently defined themselves as performance artists throughout their careers. They created bodies of work that evolved and developed over time. They also sought recognition and a livelihood from this work. Many of the artists coming of age in the decade were thoroughly familiar with mass cultural forms such as television and rock music, and many were comfortable with evolving computer and video technologies. They used these mass culture elements, deliberately blurring the lines between high art and popular culture. As a result, some artists moved into film, and some performance art developed into “standard entertainment.” Still other work retained its political and outrageous edginess. The decade culminated in the 1989 congressional debate over censorship and funding for the National Endowment for the Arts (NEA). Among the artists precipitating the NEA crisis were two performance artists, Karen Finley and Tim Miller. The performance art of such controversial artists was often confrontational, responding to and targeting public ignorance and entrenched institutions, including the government that sometimes funded the work. As engaged in by significant artists of color, performance art of the 1980’s acted out multiculturalism, joining in a broader cultural conversation taking place in literature and the academy as well. In addition to taking on institutions, performance artists of the 1980’s found themselves negotiating their own institutionalization. On one hand, performance art was recognized as a valid art form by museums, resulting in numerous shows and even retrospective exhibitions. Specialized performance publications were created, art critics analyzed performance, and art schools incorporated it into their curricula. Artists in the process of denying the separation of high art from mass culture had to determine how to respond to the decision of artistic institutions to label their work as art. On the other hand, performance art also remained in the sphere of mass culture, as it was institutionalized through the market as well. New performance venues were created to showcase the form and to make its production lucrative. Special performance art galleries appeared; performance clubs or cabarets presented monologue artists; and largescale operatic performances and works combining
sound, video, and live performers played on traditional stages. Symptomatically, at the beginning of the decade, Laurie Anderson was signed to a recording contract with Warner Bros., and her performance song "O Superman" rose to number two on the British pop charts. The Blue Man Group, which in the 1980's began as three street performers, developed into a worldwide enterprise with a long-term contract to appear at the Venetian Hotel and Casino in Las Vegas.

Impact
Performance art both expressed and rejected the culture of the 1980's. It addressed topical social and political issues, exploited new electronic technologies, and participated in the exuberant art market of the decade. It also blurred the distinctions between high art and popular culture, at a time when the broader movement of postmodernism was beginning to reject those very distinctions.

Further Reading
Goldberg, RoseLee. Performance Art: From Futurism to the Present. Rev. ed. New York: Harry N. Abrams, 1988. First attempt to place the contemporary practice of performance art in historical context; significant bias toward New York-based performances.
_______. Performance: Live Art Since 1960. New York: Harry N. Abrams, 1998. Goldberg concentrates on modern developments. Unlike in her earlier work, she is comfortable here with theorizing about and analyzing recent developments. Useful appendixes include a chronology of performance events and artists' biographies.
Roth, Moira. "A History of Performance." Art Journal 56 (Winter, 1997): 73-83. Included in a special issue on performance art, this is not actually an article but a syllabus for Roth's exhaustive course on modern and contemporary performance art. Valuable for its many sources for additional exploration.
Jean Owens Schaefer
See also
ACT UP; Art movements; Camcorders; Feminism; Glass, Philip; Homelessness; Homosexuality and gay rights; Music; Theater.
■ PG-13 rating
Definition Film rating
Date Introduced on July 1, 1984
The first adjustment since 1972 to the MPAA film rating system, the PG-13 rating filled the void that had developed between the ratings PG and R. This gap reflected shifting public standards regarding acceptable levels of violence and adult language in movies aimed at children in their mid-teens.

In 1968, the Motion Picture Association of America (MPAA) adopted a rating system for commercially released films. The purpose of the system was to stave off government censorship by instead adopting a means for the film industry to regulate itself. By 1984, there were four possible ratings: G (general audiences), PG (parental guidance suggested), R (restricted), and X (persons under seventeen not admitted). Films receiving an R rating under this system could not expect to generate significant revenue from the teen market, because children under seventeen years of age were allowed to see such films only in the presence of a parent or adult guardian. Thus, directors of films that were geared toward teens were often contractually obligated to ensure that those films received PG ratings, re-editing R-rated films as necessary to gain the lower rating. Several incidents revealed that the PG rating covered too wide an age range of viewers and was being inconsistently applied. In 1975, the MPAA ratings board wanted to give Jaws (1975) an R rating for strong violence, but Universal Pictures, fearing the consequent loss of revenue from teenage viewers, slightly edited the film and then successfully lobbied to receive a PG rating with the warning "May be too intense for younger viewers." Such movies as Poltergeist (1982) and Disney's Dragonslayer (1981) received PG ratings, despite containing what many considered to be R levels of violence. The same ambiguities existed concerning adult language. The Academy Award winner for Best Picture, Ordinary People (1980), received an R rating solely because a character uttered a strong expletive once during the film. Some critics protested that it should have been rated PG. Another Best Picture winner, Terms of Endearment (1983), received a PG despite its characters' use of adult language in several scenes. In each case, the MPAA received angry letters from parents protesting the assigned ratings.
Matters came to a head in 1984, when the violence in two Steven Spielberg films sparked national outcries. Both Indiana Jones and the Temple of Doom and the Spielberg-produced Gremlins contained graphic sequences of hearts being ripped from victims' chests and animals exploding. Faced with protests from both parents and the press, MPAA president Jack Valenti announced on July 1 the creation of a new PG-13 rating, with the explanatory language "Parents Strongly Cautioned. Some material may be inappropriate for children under 13." The first film to receive the new rating was the science-fiction thriller Dreamscape (1984). The first film actually released bearing a PG-13 rating was the Cold War action film Red Dawn (1984).

Impact
The creation of the new rating reflected audiences' growing tolerance of strong language while highlighting parents' desire to prevent exposing pre-teen minors to strong violence. It also reflected the perception in Hollywood that teenagers would shun films they regarded as innocuous because they contained insufficient sexual or violent content. The rating thus allowed the studios to cater to teenagers' tastes without alienating their parents. During the 1980's in particular, the new rating allowed films to incorporate adult language without being rated R. As a result, a whole subgenre of teen films, including Sixteen Candles (1984) and The Breakfast Club (1985), appeared that more accurately portrayed the language and lifestyles of 1980's teenagers.

Further Reading
Keough, Peter, ed. Flesh and Blood: The National Society of Film Critics on Sex, Violence, and Censorship. San Francisco: Mercury House, 1995.
Vaughn, Stephen. Freedom and Entertainment: Rating the Movies in an Age of New Media. Cambridge, England: Cambridge University Press, 2005.
Richard Rothrock
See also
Academy Awards; Action films; Brat Pack in acting; Film in the United States; Horror films; Hughes, John; Ordinary People; Raiders of the Lost Ark; Science-fiction films; Spielberg, Steven; Teen films; Terms of Endearment.
■ Phantom of the Opera, The
Identification Hit musical based upon Gaston Leroux's classic horror novel
Director Hal Prince (b. 1928)
Authors Music by Andrew Lloyd Webber (b. 1948); lyrics by Charles Hart (b. 1962), with Richard Stilgoe (b. 1943) and Mike Batt (b. 1949); book by Stilgoe, Hart, and Lloyd Webber
Date Opened on Broadway on January 26, 1988

Among the most successful Broadway musicals in history, The Phantom of the Opera was a national phenomenon in the late 1980's.

When Andrew Lloyd Webber decided to create a musical based on Gaston Leroux's 1910 novel Le Fantôme de l'opéra (The Phantom of the Opera, 1911), the book was largely forgotten, and the story was known primarily through campy horror-film adaptations. While Lloyd Webber emphasized the romantic aspects of the book, the horror and mystery elements provided opportunities for him to indulge his love of spectacle: In the play, lighted candles mysteriously rise from a fictional lake, and a chandelier falls to the stage. As he had in Cats (pr. 1982), Lloyd Webber brought the audience into the spectacle. The set was designed to merge the fictional opera house with the actual theater, so that, when the characters in the musical were performing operas, the play's audience would become the fictional audience within those operas as well. As a result, when the Phantom's face was dramatically revealed in the show's climax, the audience participated as part of the drama. Like many of Lloyd Webber's works, The Phantom of the Opera combines older musical and theatrical elements with modern music and effects. Elements of twentieth century music, including rock, are used to symbolize the Phantom's incompatibility with society. The story is set in the 1880's, and Lloyd Webber wrote the Phantom's opera, Don Juan Triumphant, as a twentieth century opera. The cast members scoff at it, but the Phantom is merely ahead of his time.

Impact
In 1988, The Phantom of the Opera won seven Tony Awards. The Phantom's mask joined the Statue of Liberty and the Empire State Building as symbols of New York City. Many theatergoers saw the show more than once, and its success was driven in part by
repeat business, a rarity on Broadway. The show spawned three national tours and a long-running Los Angeles production. Two of its ballads, "All I Ask of You" and "Music of the Night," became instant standards, recorded by numerous artists. At the same time, its title song, originally recorded as a heavy metal single, became quite popular for rock groups to cover. In the late 1980's, even young people who otherwise had no interest in musicals were excited about The Phantom of the Opera. Its ominous theme was recognized all over the country.

Sarah Brightman, star of The Phantom of the Opera and wife of composer Andrew Lloyd Webber, takes her curtain call after the musical's Broadway premiere on January 26, 1988. (AP/Wide World Photos)

Further Reading
Leroux, Gaston. The Phantom of the Opera: The Original Novel. New York: HarperPerennial, 1988.
Perry, George. The Complete Phantom of the Opera. New York: Owl Books, 1991.
John C. Hathaway
See also
Broadway musicals; Cats; Heavy metal; Music; Theater.
■ Photography
Definition Artistic production or reuse of photos
During the 1980’s, photography rivaled sculpture and painting for prominence and critical importance in the art world. Heavily influenced by postmodernist and poststructuralist theory, as well as by the pop aesthetic of Andy Warhol, photography in the 1980’s focused on themes of representation, gender, race, and consumerism. Throughout the decade, photography was often at the center of heated debates over free-
758
■
Photography
dom of expression and the public funding of contemporary art. Philosophical Underpinnings
Much of the photography of the 1980’s found its ideological roots in poststructuralism, a philosophy that first emerged in the mid-1960’s. The work of such French poststructural theorists as Jacques Derrida and Michel Foucault focused on the process of deconstructing or historicizing the complex relationship between institutions and power by exploring the ways in which signs (including images) could be used as instruments of control. One French poststructuralist, Roland Barthes, frequently discussed photography in his writings. In 1980, he published Camera Lucida, which comments on the nature of representation and truth in the photographic image. Poststructuralist theory is closely linked to postmodernism, a late twentieth century movement that rejected many of the tenets of early twentieth century modernism. According to French critical theorist Jean Baudrillard in his influential treatise Simulacres et simulation (1981; Simulations, 1983), the postmodern world depends so heavily on mechanical reproduction and mass media that “the real no longer exists.”
Appropriation Art
Appropriation art uses borrowed elements in the creation of a new work. The photography-based appropriation art of the 1980's reflected the spirit of postmodernism through its mass-media influence and its ability to call into question concepts of originality and authenticity. Many appropriation artists began their careers as designers and commercial artists for advertising agencies and popular magazines. In the late 1970's and 1980's, American artist Richard Prince rephotographed photographs from popular culture, including images of a preteen Brooke Shields, and everyday advertisements, including ads featuring the Marlboro Man. He then placed these photographs of photographs within his own photo-collages, without providing any copyright information on the appropriated images. American photographer and conceptual artist Sherrie Levine first gained critical attention with her 1981 solo exhibition, After Walker Evans. In her work, she rephotographed photographs by Walker Evans, an American photographer best known for his work documenting the effects of the Great Depression, and presented the images as her own.
Feminist Photography
Although the 1980's saw a backlash against the feminist movement that had emerged in the 1970's, several prominent women photographic artists examined the social construction of gender and sexuality in their work. Barbara Kruger worked as a designer for such popular magazines as Mademoiselle before becoming an appropriation artist. She used commercial photographs and font types in her feminist-charged photomontages. Her photo-silkscreen from 1989, Your Body is a Battleground, not only interrogates mass-media representations of femininity but also evokes battles over reproductive rights that were prevalent throughout the decade. In 1980, American photographer Cindy Sherman completed a series of sixty-nine photographs entitled the Untitled Film Stills. In these photographs, Sherman placed herself in different costumes and contexts that evoke B-movies from the 1950's and 1960's. Like Kruger's photomontages, these images comment on the powerful influence of the media on identity. In addition, because Sherman is identifiable in each role, they also explore the performative aspects of gender and sexuality. The women's movement of the 1970's was largely white and middle class. However, feminists in the 1980's increasingly focused on women's difference. Such African American photographers as Lorna Simpson and Carrie Mae Weems frequently examined racist stereotypes pervasive in Western culture, including mass-media images, in their work.

Cultural Controversy
Controversy dominated the art headlines in the 1980's, and many of the most heated debates centered on photography. Conservative Republican Ronald Reagan was president, and the Moral Majority arose and gained political power. Much of the furor focused on whether national funding agencies like the National Endowment for the Arts (NEA) should support art that some deemed to be obscene. American photographer Robert Mapplethorpe's work was often at the center of these storms. While his strongly lit, carefully composed, black-and-white photographs evoke a classical, nineteenth century aesthetic, their homoerotic content was labeled pornographic by some. In 1989, the NEA-funded Corcoran Gallery of Art in Washington, D.C., canceled an exhibition of Mapplethorpe's photographs entitled The Perfect Moment, because the gallery feared that Congress would object to the exhibition's content. Another controversial photographer of the decade was Andrés Serrano. In "Piss Christ," Serrano, an NEA grant recipient, photographed a plastic crucifix submerged in three gallons of urine. Many of his other works also featured icons of Catholicism submerged in bodily fluids, including blood and semen.

Impact
The photography of the 1980’s explored issues of ethnicity, gender, and sexuality that were important to art and literature of the decade generally. Because photographs were at once more visceral and more easily reproducible than some other forms, however, photographers were often magnets for more widespread cultural controversy. Both the photographic interests of the decade and their controversial nature spilled over into the next decade. In 1990, NEA grant recipients were forced to sign decency pledges, vowing that they would not produce obscene art. Further Reading
Cruz, Amanda. "Movies, Monstrosities, and Masks: Twenty Years of Cindy Sherman." In Cindy Sherman: Retrospective. Chicago: Thames and Hudson, 1997. Provides a concise biography of Sherman, as well as a thorough overview of the major series in Sherman's body of work.
Danto, Arthur. "Playing with the Edge: The Photographic Achievement of Robert Mapplethorpe." In Mapplethorpe. New York: Random House, 1992. Contextualizes Mapplethorpe's work within the history of photography and the art scene of the 1980's. Provides a close, personalized reading of Mapplethorpe's photography and a provocative interpretation of its more controversial elements.
Doss, Erika. "Culture Wars: The 1980's." In Twentieth Century American Art. Oxford, England: Oxford University Press, 2002. In this richly illustrated chapter, Doss chronicles the major trends and controversies in the art world of the 1980's.
Smith, Joshua P. Photography of Invention: American Pictures of the 1980's. Boston: MIT Press, 1989. Covers the work of ninety American artists, many of whom used appropriation and computer technology in their experimental photography.
Corinne Andersen
See also Abortion; Advertising; African Americans; AIDS epidemic; Art movements; Basquiat, Jean-Michel; Feminism; Homosexuality and gay rights; Moral Majority; Neoexpressionism in painting; Performance art; Pornography; Reagan Revolution; Schnabel, Julian; Shields, Brooke.
■ Plastic surgery
Definition Surgical intervention for reconstructive or cosmetic purposes
Plastic surgery procedures increased in frequency during the 1980's, and the corpus of medical techniques expanded, as did the invasiveness of surgical intervention. Popular acceptance and encouragement of plastic surgery for women accompanied these methodological developments, although awareness of surgical mishaps associated with breast augmentation increased during the decade.

Histories of medicine describe the development of plastic surgery as a response to the horrific injuries experienced by soldiers during World War I. Surgeons attempted to repair these men and to return them to civilian life without massive disfigurement, and necessity drove them to invent new procedures to accomplish these goals. Broader cultural histories add to this account the emphasis in American popular culture on appearance, and specifically on ideas about American beauty that diverged from actual American bodies. These broader historical conceptualizations assist in understanding the progression of a medical specialization from its male, military origins to its 1980's focus on correcting minor flaws of appearance in women. The 1980's cultural emphasis on the need for women to be both beautiful and youthful was the result of several social trends. Many decades of American cultural connections between women and the pursuit of domestic harmony encouraged an ideal in which the beautiful home was inhabited by the beautiful woman. Glossy magazines focusing on interior decoration fostered this linkage, which seldom included similarly objectified men. During the same decade, women's youthful, unflawed bodies were displayed in tabloids, in advertisements, on television, on highway billboards, and in other media. The featured women were those deemed beautiful according to American cultural norms, which did not include larger women, elderly women, much ethnic diversity, or, indeed, the majority of women in the United States. Cultural emphasis was therefore
placed on the small percentage of women who fit within these narrow parameters of American beauty—and even images of those women were often airbrushed and otherwise altered to remove perceived imperfections. The resulting images placed increasing pressure on the remaining members of the population to make themselves fit the standard, using such available technologies as uncomfortable fashions, disguising makeup, diets, or painfully invasive plastic surgery. Cosmetic surgery became increasingly popular during the 1980's, and the options for facial and bodily transformation expanded exponentially, with increasingly complicated interventions being performed successfully. During the decade, more women than men underwent the procedures, with some individuals radically remaking their appearance to correspond more exactly to culturally supported ideals of beauty. Popular personages also played a role in promoting cosmetic surgery for the masses. During the 1980's, readers of gossip and fashion tabloids, targeted predominantly to a female readership, learned about the plastic surgeries of Hollywood stars, aspiring models, and performance artists. It was a man, Michael Jackson, who was the most famous example of extreme plastic surgery during the decade. Tellingly, Jackson's surgeries, while part of a trend toward surgical minimization of non-Caucasian ethnic traits, were presented as "cautionary tales" rather than as an example for other men to emulate. Cultural icons including Jackson normalized plastic surgery, with the result that many women believed that they could also participate in the glamorization of America. In contrast, some 1980's role models, such as Barbra Streisand, refused to change their appearance. Streisand received positive and negative press about her decision to retain her prominent nose. She proved herself to be unusually strong in withstanding the mainstream American preoccupation with standardizing female beauty, however, as many other actresses did submit to the knife, the needle, and the breast implant.

Breast Implants
The implanting or injection of materials designed to enhance women's breast size was a procedure first developed in the 1950's, often by "surgeons" with questionable credentials. In the following decades, various factors, including
substandard operations—often performed by unlicensed providers—and poor-quality implant materials combined with women's culturally induced desire for large breasts to culminate in medical disaster for many individuals. The results included hard lumps, decreased sensation, pain, and gangrene, sometimes necessitating implant removal and even breast amputation, among other harmful outcomes. By the 1980's, various types of bust enlargement procedures had been performed on large numbers of American women, both by legitimate medical practitioners and by individuals with limited surgical qualifications, but there had been little public questioning of the need for these procedures or consideration of their impact on women's health. The lack of concern about unnecessary surgical intervention on women's bodies can be understood only in the context of American beliefs about beauty and sexual attractiveness that often focused on the size of women's breasts. Magazines such as Playboy became incredibly successful and generated huge economic returns based on their images of female attributes. Well-endowed women were also featured in 1980's television dramas, such as Dallas and Dynasty, popular films, and music videos. Amid the pro-implant hyperbole, however, there were also some hints of concern from the medical establishment and from observers of popular culture. For example, in 1978, an article appeared in Ms. reporting on complications experienced by thirty women with breast implants. During the 1980's, debate continued, with critiques becoming increasingly vocal after scientific study showed the development of cancer in rats exposed to silicone gel (a main ingredient in silicone breast implants). However, it was not until late 1991 that the U.S. Food and Drug Administration (FDA) began public hearings to examine the safety of breast implant procedures.

Impact
Plastic surgery during the 1980's provided women with a socially approved mechanism for improving their physical attributes, including breast size, to fit within American standards of beauty. This emphasis on self-improvement also meant that cosmetic procedures received less scientific scrutiny, resulting in medical harm to an unknown number of women.

Further Reading
Banner, Lois W. American Beauty. New York: Alfred A. Knopf, 1983. Historical study of the trend toward
increased objectification and the idealization of women's bodies and faces, including the commercial applications of the beauty industry.
Cole, Thomas R. The Journey of Life: A Cultural History of Aging in America. New York: Cambridge University Press, 1992. Analysis of American cultural beliefs about aging, including Western philosophical antecedents to these beliefs.
Haiken, Elizabeth. Venus Envy: A History of Cosmetic Surgery. Baltimore: Johns Hopkins University Press, 1997. History of plastic surgery grounded in American social history and evolving popular perceptions of beauty.
Nelson, Adie, and Barrie W. Robinson. "Gender and Aging." In Gender in Canada, edited by Adie Nelson and Barrie W. Robinson. 2d ed. Toronto: Prentice Hall, 2002. Study of the differing experiences and representations of aging in men and women.
Susan J. Wurtzburg
See also
Cher; Consumerism; Dallas; Diets; Dynasty; Fashions and clothing; Feminism; Film in Canada; Film in the United States; Jackson, Michael; Medicine; MTV; Music videos.
■ Platoon
Identification American film
Director Oliver Stone (b. 1946)
Date Released December 19, 1986
Platoon was the breakout film of director Oliver Stone, who went on to make several of the most famous and controversial films of the next few decades. The film chronicled an American soldier's experiences in Vietnam in 1968. It resulted in a wider appreciation for the grunts of Vietnam, if not for the war itself.

Until Platoon (1986) was released, American films about the Vietnam War usually focused upon the wrongness of the war, upon its atrocities, or upon the psychological devastation wrought upon veterans who returned from it. Routinely castigated in popular culture representations, few veterans felt free to discuss their experiences openly or to exorcise their own personal demons. Two things made Platoon not only possible but also necessary: the rebirth of American national pride and the opening of the Vietnam Veterans Memorial.
For a decade following the end of American involvement in Vietnam, the United States appeared to suffer from what pundits referred to as "Vietnam syndrome," a hesitance to become embroiled in major hostilities, lest earlier mistakes be repeated. The Iranian hostage crisis (1979-1981) and the Soviet invasion of Afghanistan (1979) showed a United States incapable of effective response. Ronald Reagan won the 1980 presidential election in part through his platform of renewed American pride and a promise to roll back Soviet adventurism. Meanwhile, without any federal money, the Vietnam Veterans Memorial Fund raised $8.2 million to build the Vietnam Veterans Memorial, dedicated in 1982. A new appreciation for the military, an awareness of the sacrifice made by so many young Americans, and a growing realization of the unrecognized service of American veterans meant that the time was right for a reinterpretation of the image of the American soldier who served in Vietnam. Oliver Stone had enlisted in the military and had seen extensive combat in Vietnam in 1968. He both wrote and directed Platoon in order to reinterpret the war for American audiences. The film's story itself is not unusual: A platoon patrols, a village is demolished, a firefight erupts, and soldiers die in the jungle. What made Platoon important was Stone's assertion throughout that these were ordinary American kids inserted into a situation for which they were ill-prepared. If they survived, they were changed—but they were still Americans doing a job that their country ordered them to do. Stone intentionally mixed this literal message with a more thematic presentation, as the protagonist of the film finds himself caught between two sergeants, who come to symbolize a good father and a bad father as the narrative progresses. Platoon was the film of the year in 1986; at the annual ceremony in 1987, it won the Academy Awards for Best Picture and Best Director, as well as Best Film Editing and Best Sound.

Impact
It was not uncommon to find Vietnam veterans weeping at the conclusion of a viewing of Platoon. More important, it was common to find them comforted by those in attendance who had simply not known what combat troops in Vietnam had experienced. Platoon allowed Vietnam veterans to articulate their feelings without remorse or fear of being called "baby killers," and it portrayed the horrible complexities of a war in which one could be murdered by one's fellow soldiers, as well as by the enemy. In some ways, Platoon might be considered one of the most important salves that allowed Americans to heal after one of their most divisive wars. In 1987, hundreds of thousands of Vietnam veterans came together to parade through Houston, Texas, finally staging the welcome-home parade they had never received on their return from combat.

Tom Berenger, center right, as Sergeant Barnes, in Platoon. (dpa/Landov)
Further Reading
Appy, Christian G. Working-Class War: American Combat Soldiers and Vietnam. Chapel Hill: University of North Carolina Press, 1993.
Lavington, Stephen. Oliver Stone. London: Virgin, 2004.
Stone, Oliver, and Charles L. P. Silet. Oliver Stone: Interviews. Oxford: University Press of Mississippi, 2001.
Toplin, Robert Brent. Oliver Stone's USA: Film, History, and Controversy. New ed. Topeka: University Press of Kansas, 2003.
William S. Brockington, Jr.
See also Academy Awards; Film in the United States; Full Metal Jacket; Iranian hostage crisis; Rambo; Reagan Doctrine; Stone, Oliver; Vietnam Veterans Memorial; Wall Street.
■ Play, the
The Event The University of California defeats Stanford University with a bizarre, last-second kickoff return
Date November 20, 1982
Place Memorial Stadium in Berkeley, California
The 1982 NCAA football game between California and Stanford had little meaning apart from the intense rivalry between its participants, since neither team was ranked that season. However, the wild and unorthodox nature of California's last-second kickoff return, the heated controversy it caused, and the appearance of the Stanford band on the field while the play was still ongoing combined to produce what is often considered American football's most memorable play.

The football rivalry between the University of California Golden Bears and the Stanford University Cardinal is one of National Collegiate Athletic Association (NCAA) football's longest and most intense. The 1982 game between the two schools, the eighty-fifth in the rivalry, featured two teams that had experienced disappointing seasons. They were tied for sixth place in the Pac-10 conference, and each had
suffered several lopsided defeats. The game proved to be as close as their similar records would have suggested. With little time remaining, Stanford trailed 19 to 17 and faced a fourth down with seventeen yards to go. The team's star quarterback, John Elway, completed a pass for a first down and then drove the Cardinal to within field-goal range. Kicker Mark Harmon then kicked a successful field goal, putting Stanford ahead, 20 to 19, with only four seconds remaining. What happened next left everyone in shock. Harmon kicked a short rolling kickoff that was fielded by California's Kevin Moen on the Golden Bears' forty-three-yard line. As the Stanford players approached, he lateraled the ball to Richard Rodgers along the left sideline. Rodgers quickly pitched the ball to Dwight Garner, who ran straight into a crowd of tacklers and was swarmed. As several Cardinal players began to celebrate, the ball flew out of the pile back to Rodgers, who raced across the field with a pack of teammates close behind. On the Stanford forty-five-yard line, he confronted a defender and pitched the ball to Mariet Ford. Ford took the ball another twenty yards where, as he was being wrapped up by a trio of tacklers, he blindly threw it over his shoulder to Moen. Moen had only one Cardinal between him and the goal, with a teammate already blocking the defender, but surprisingly, the Stanford band had wandered onto the field, assuming the game to be over, so Moen had to run the final twenty yards through a sea of red-coated students racing wildly to get out of his way. As he crossed the goal line, he leaped into the air to celebrate and fell directly onto trombone player Gary Tyrrell, who was oblivious to the chaos occurring behind him. Though penalty flags had been thrown all across the field and Stanford insisted that Garner had been tackled before lateraling the ball, officials decided that the play was legal and that California had won the game, with a final score of 25 to 20.

At the end of the Play, California Golden Bear Kevin Moen leaps into the air after scoring a touchdown while surrounded by fleeing Stanford band members. (AP/Wide World Photos)

Impact
The incident—which became known as the Play—quickly became legendary. Sports Illustrated published an extensive analysis of the play in a feature article less than a year later, and the play regularly appeared in polls and retrospectives highlighting the most memorable moments in sports history.

Further Reading
Bradley, Michael. Big Games: College Football's Greatest Rivalries. Dulles, Va.: Potomac, 2006.
Mandell, Ted. Heart Stoppers and Hail Marys: The Greatest College Football Finishes, Since 1970. South Bend, Ind.: Hardwood, 2006.
Devon Boan
Elway, John; Football; Sports.
■ Poetry
Definition A form of concentrated expression through meaning, sound, and rhythm

Several factors combined to create a break with the traditional place of poetry in American life, causing the form to undergo many changes in the 1980's.

After World War II, traditional forms and concepts began to break down among American poets.
social protests of the 1960’s, the growing popularity of consumer electronics, and especially the rise of mass media transformed the United States from a private, literate, book-based culture into a media culture. American poetry was directly influenced by these changes. Films, videos, tapes of poetry readings, and interviews with poets became widely available, while new, cheap methods of printing encouraged young editors to start literary magazines, which by the end of the 1980’s numbered over two thousand. In the 1980’s, there was a great proliferation of poetic styles and poems, which were presented in new ways by new media. Ethnic and Women’s Poetry Following the establishment of university ethnic studies programs in the 1970’s, by the 1980’s many academic journals, professional groups, and magazines intended for specific ethnic groups were founded. The work of minority poets became widely published, heard, and read. There was a great flowering of such poets as Gary Soto, Leslie Marmon Silko, Louise Erdrich, Rita Dove, Maya Angelou, Cathy Song, and Simon Ortiz. Chicano, or Mexican American, poetr y achieved a new prominence in the 1980’s, and the works of Soto and Ortiz were often anthologized. Erdrich, a Native American poet, wrote of families coping with poverty, unemployment, and a diminishing of their culture on the Chippewa reservation. Her long poem “Family Reunion” (1984) highlighted some of these problems. African American poetry was influenced by such phenomena as jazz, history, popular culture, and Afrocentrism. Dove, an African American poet, won the Pulitzer Prize in 1987 for Thomas and Beulah: Poems (1986), which celebrated her grandparents and illustrated the rich inner lives of the poor and uneducated. She was appointed poet laureate of the United States in 1993. Asian American poets, a group consisting of descendants of Japanese, Chinese, Filipino, Korean, Thai, and Vietnamese people, among others, also grew in prominence during the 1980’s. Among the notable Asian American poets were the Chinese American Song, whose 1983 collection, Picture Bride, dramatized the life of her family and won a Yale Younger Poet Award. Li-Young Lee, a first-generation native-born Chinese American, won New York University’s Delmore Schwartz Memorial Poetry Award in 1986 for his first collection, Rose. Japanese American David Mura saw his
collection After We Lost Our Way selected for the tenth annual National Poetry Series in 1989. Women in the 1960's, many caught up in the new feminism or influenced by the counterculture, had realized their own marginalization in the literary world. In the following decade, a new form of feminist criticism developed that challenged the academic canon and argued that literary standards were culturally formed, not universal and timeless. By the time Sandra M. Gilbert and Susan Gubar's Norton Anthology of Literature by Women came out in 1985, a new tradition of women's literature had been established. Among prominent women poets of the decade were Silko, Erdrich, Dove, Angelou, Song, Mona Van Duyn, Adrienne Rich, Louise Glück, May Swenson, Sharon Olds, Jorie Graham, Amy Clampitt, and Maxine Kumin. The appointment of Gwendolyn Brooks, who had been the first African American to win the Pulitzer Prize, as U.S. poet laureate for 1985-1986 advanced the prominence of both women and minorities. Brooks advocated the cause of bringing poetry to the inner city by promoting poetry classes and competitions for young people.

New Styles
Rap poetry grew from the African tradition of the storyteller in tribal society, who memorized the history of his tribe and recited it to music. When Africans came to the Western Hemisphere as slaves beginning in the 1500's, they brought this tradition of talking to a beat with them, and it eventually grew into talking blues and then to hip-hop and rap. In the 1970's, rap was associated with protest, drugs, and life on the street, but by the 1980's, many rap artists sought mainstream success by veering away from political rhetoric to light, entertaining, and more universal themes. The decade also saw the emergence of intellectual, political rap as a viable commodity, however, especially in the work of Public Enemy. The 1980's saw female rappers rise to answer their male counterparts. Rap became a successful movement by melding poetry with music even more obviously than other popular musical genres. Poetry oriented toward performance proliferated in the 1980's with the rising popularity of mixed-media performances that included choreography, video, and space-age technology. Prominent among these poetry performances was the poetry slam, a competitive event in which poets gathered to perform in such venues as alternative art galleries, cafés,
and bookstores. Their performances were scored by members of the audience, and the highest-scoring poet or team of poets was declared the winner. Some of the poets read their work, some recited from memory, and some were accompanied by music. Rap was sometimes featured, but poets also performed many different styles of poetry in slams, including such traditional forms as sonnets and haiku. Slams were usually marked by audience enthusiasm and participation.

Impact
Poetry was transformed during the 1980's. By the end of the decade, traditional forms and ideas were no longer as important to poets as they had been, and poetry seemed more relevant than it had been previously in the century. Many believed writing poetry was a way to assert one's individualism in the face of an increasingly uniform culture. By the 1980's, poetry was decentralized, idiosyncratic, and often highly experimental. At the same time, an active school of new formalism featured a return to form, rhyme, and meter, restoring the lapidary effect of poetry. Although the new formalist poets worked in traditional forms, they experimented with these forms to make them new and relevant to contemporary society.

Further Reading
Dacey, Philip, and David Jauss, eds. Strong Measures: Contemporary American Poetry in Traditional Forms. New York: Harper, 1986. Showcases the new formalism, including examples of seventy-five traditional forms written by nearly two hundred poets.
Myers, Jack, and Roger Weingarten, eds. New American Poets of the 80's. Green Harbor, Mass.: Wampeter, 1984. Anthology of work by sixty-five poets, including some promising poets whom the editors aim to introduce and some good poets who have not been included in previous anthologies.
Ostriker, Alicia Suskin. Stealing the Language: The Emergence of Women's Poetry in America. Boston: Beacon Press, 1986. Extensive, feminist analysis of the way language and themes developed in women's poetry.
Sheila Golburgh Johnson
See also Book publishing; Children's literature; Erdrich, Louise; Hip-hop and rap; Keillor, Garrison; Lennon, John; Literature in Canada; Literature in the United States; Performance art; Public Enemy.
■ Poindexter, John
Identification U.S. Navy admiral and Iran-Contra defendant
Born August 12, 1936; Odon, Indiana

Poindexter was the highest-ranking defendant in the Iran-Contra scandal, which injured the reputation of Ronald Reagan's presidential administration.

When John Poindexter joined the Ronald Reagan administration, he had a long career of outstanding naval service behind him. He had graduated first in his class from the United States Naval Academy and subsequently earned master's and doctoral degrees in nuclear physics from the California Institute of Technology. He held numerous prestigious commands in the Navy, specializing in destroyers when at sea but also holding significant staff appointments. In 1981, Poindexter joined President Reagan's National Security Council staff; he became deputy national security adviser in 1983 and national security adviser in 1985. He played a significant role
in the development of the Strategic Defense Initiative (SDI), sometimes called the "Star Wars" defense, which was intended to create a space-based defense against ballistic missiles. He was also involved in the Achille Lauro incident, in which American Leon Klinghoffer was murdered, and the Reykjavik Summit with the Soviet Union. However, it was his involvement with the Iran-Contra affair that brought him into public view. As national security adviser, Poindexter was deeply involved in the covert operation to circumvent the will of Congress by continuing to aid the Nicaraguan Contras. He helped organize the clandestine program to sell weapons to Iran and to funnel the money from those sales to the Contras. By 1986, the scheme was beginning to unravel, and Congress summoned Poindexter, along with several other key Reagan administration figures, to testify about their actions. Poindexter repeatedly asserted his Fifth Amendment right against self-incrimination. Although this was legally correct, and probably done on the advice of his lawyers, it created an impression that he was both guilty and willfully obstructing the constitutionally mandated checks and balances upon the executive branch by the legislative branch. His actions, along with those of coconspirator Oliver North, were also suspected of being intended to cover up the personal culpability of the president. In March, 1988, a criminal indictment was leveled against Poindexter, charging that he had acted deviously, not for the benefit of the president or American interests, but for his own selfish profit. The trial quickly devolved into a circus of finger-pointing and blame-shifting, but the naturally self-effacing Poindexter was found guilty on all five counts.

Impact
Poindexter never received the level of public acclamation that North received, largely because of his self-effacing nature and the sense that he had acted selfishly rather than patriotically. His name did remain associated with the Iran-Contra affair throughout his later career, however.
John Poindexter in 1985. (U.S. Department of Defense)
Subsequent Events
Poindexter had been compelled to testify before Congress on the condition that his statements made at the hearings could not be used against him. In 1991, a court concluded that it was possible the evidence in his later criminal trial had been tainted by his public testimony, and his conviction was therefore overturned.
In the early twenty-first century, Poindexter was employed in the George W. Bush administration to design and oversee computer systems that would ferret out signs of terrorist activity by analyzing the purchasing and communication patterns of all Americans. The program was ended by Congress, in large part because Poindexter’s prior involvement in the Iran-Contra affair called into question his claims that the government could be trusted not to abuse such a massive surveillance system. Further Reading
Walsh, Lawrence E. Firewall: The Iran-Contra Conspiracy and Cover-Up. New York: W. W. Norton, 1997.
Wroe, Ann. Lives, Lies, and the Iran-Contra Affair. New York: I. B. Tauris, 1992.
Leigh Husband Kimmel
See also
Iran-Contra affair; Iranian hostage crisis; Klinghoffer, Leon; Latin America; North, Oliver; Reagan, Ronald; Reagan Doctrine.
■ Political correctness
Definition A term for the use of words or behavior intended to be inoffensive to identity groups or social minorities

The late 1980's witnessed language wars about proper terminology for social groups, raising free speech issues.

Although the term "political correctness" did not come into widespread use in the United States until the late 1980's, it had its modern roots in rapid societal changes taking place in the 1970's related to ethnicity, feminism, multiculturalism, and the disabled. The term "black"—which had formerly progressed from "Negro" and "colored," terms used until the 1960's—was replaced by "Afro-American" and finally "African American." The term "handicap" was replaced by "disability" in the early 1980's. In schools, problem learners were termed students with special education needs. On college campuses, ostensibly bastions of free speech, "politically incorrect" points of view were denounced as hateful, sexist, racist, or Eurocentric. "Political correctness," once a term used only in a communist context (to be within the party line, the correct line), took on new shades of meaning within a culturally, ethnically, and politically diverse American society.
The large number of discrimination and sexual harassment suits in the 1970's and 1980's further served to make a number of terms taboo both in schools and in the workplace. Language and terminology were viewed as powerful tools for labeling people in a group in positive, neutral, or negative terms. Such language affected not only how others viewed an individual but also how individuals viewed themselves, and it made an impact on everything from social inclusiveness to career opportunities. Logos also underwent scrutiny. Major League Baseball's Cleveland Indians' logo, for example, was redrawn in 1973, and many school teams dropped their "Indian" logos altogether. Meanwhile, the term "Indians" underwent scrutiny and was replaced by more accurate terms such as "American Indians," "Amerindians," "Amerinds," "Indigenous," "Aboriginal," "Original Americans," and "Native Americans."

From Campus to Society
During the 1980's, new affirmative action programs were implemented on college campuses; non-Western studies programs rapidly expanded along with women's studies and ethnic studies. The emergence of radical feminists, gay rights activists, globalists, and ethnic militants was a natural and perhaps needed outcome of the effort to generate a wider variety of educational viewpoints. However, at the same time, it fostered cultural separatism and intolerance for other values and beliefs. By the late 1980's, political correctness moved from the campus to the larger society. Americans using politically incorrect terms could be denounced as racist, sexist, homophobic, or chauvinistic, or, in a kinder vein, labeled prejudiced or insensitive. As victims, identity groups became empowered to denounce critics. By the end of the decade, a backlash had occurred; "politically correct" (PC) had become a pejorative term used to describe an intellectual straitjacket. Many critics of political correctness were conservatives who used the term as a vehicle for attacking what they viewed as left-wing college curricula, liberal educational reform in public schools, and rapidly changing social values. Philosopher Allan Bloom's The Closing of the American Mind (1987) became a best-seller because of its powerful denunciation of "thought control" in academia. Liberals also denounced the censorship and freedom-limiting aspects of political correctness. Professional organizations such as the American Historical Association
warned of substituting subjectivity for historical objectivity and of the lack of intellectual freedom imposed by political correctness. In short, it was believed that the true purpose of higher education, namely the search for truth and open dialogue, had been terribly twisted.

Impact
The late 1980's saw the beginning of a battle between advocates of new terminology and increased sensitivity toward the multifold identity groupings constituting a pluralistic society and those who viewed such changes as a dictatorial restraint of freedom of expression and thought. The battle continued well into the twenty-first century. Symbolically, the battle pitted the political left as proponents and the political right as opponents. In reality, "political correctness" was more of an artificial concept than an actual movement; proponents and opponents depended on the particular issue at hand and cut across conservative and liberal ideologies. By the 1990's, political correctness had become grist for a wide variety of satirical works and media comedy sketches. The concept formed the basis of Bill Maher's television program Politically Incorrect, which ran on Comedy Central from 1993 to 1996 and on the American Broadcasting Company (ABC) from 1997 to 2002, as well as of an Internet e-magazine entitled Politically Incorrect, published from 1997 to 2000. However, in schools and in the workplace, one had to be far more cautious about the words one used. Textbook publishers had to sanitize their publications repeatedly so as to avoid controversy of any kind that might reduce sales, while advertisers had to carefully select words and images so as to avoid product boycotts. The irony of political correctness, critics might point out, is that the fear of upsetting anyone upsets a great many.

Further Reading
Allan, Keith, and Kate Burridge. Forbidden Words: Taboo and the Censoring of Language. New York: Cambridge University Press, 2006. A linguistic analysis of the ways people use language and how individuals and institutions censor language. Includes bibliographic references and index.
Friedman, Marilyn, and Jan Narveson. Political Correctness: For and Against. Lanham, Md.: Rowman & Littlefield, 1995. A debate between two prominent philosophers on the pros and cons of political correctness and the larger issues involved.
Levine, Lawrence W. The Opening of the American Mind. Boston: Beacon Press, 1997. A scholarly defense of multicultural studies and diversity from a historical perspective. The work was written as a rebuttal to conservative critics of political correctness. Endnotes and index.
Ravitch, Diane. The Language Police: How Pressure Groups Restrict What Students Learn. New York: Alfred A. Knopf, 2003. A leading educator's indictment of the extent to which the misuse of bias guidelines by government and textbook publishers has damaged education in the United States. Bibliography, index, endnotes, and useful appendixes.
Irwin Halfond
See also
Affirmative action; African Americans; Conservatism in U.S. politics; Disability rights movement; Education in the United States; Feminism; Women’s rights.
■ Pop music
Identification Mainstream, mass-marketed popular music
In their common ability to cross audience boundaries despite their stylistic and demographic diversity, the most popular musicians of the 1980's represented what may have been the last generation of performers capable of creating a generation-unifying pop-cultural sound track.

Although the best-selling musicians of every generation since the explosion of rock and roll have constituted a heterogeneous group, no decade saw so many disparate performers appealing to so broad an audience as did the 1980's. Unlike the musicians of the 1990's, whose popularity in many cases had the unintended effect of splintering a once-mass music-buying audience into smaller and smaller enclaves that had less and less to do with each other, the musicians of the 1980's indulged in a musical cross-fertilization that was eventually mirrored in the unifying effect that their music had on their audiences. If Prince and Michael Jackson, for instance—two of the decade's most ubiquitous superstars—were merely the latest links in a chain of African American musical innovators stretching back to Little Richard, James Brown, Jimi Hendrix, George Clinton, and Sly Stone, their blending of such influences into
new, race-transcending syntheses gave voice to such nonmusical ideals as the judging of people by the content of their characters (or musical talents) rather than by the color of their skin. To put it another way, if Prince's flamboyantly aggressive eroticism was every bit as threatening to parents as Elvis Presley's or Little Richard's had once been (Prince's song "Darling Nikki," from his 1984 chart-topping album Purple Rain, single-handedly gave rise to the censorious Parents' Music Resource Center), the fact that he was an African American had relatively little to do with his menace. Prince exercised a greater influence over the content of the pop-music airwaves than had any other single act since the Bee Gees in the late 1970's. Like the Bee Gees, whose disco-era hits often sat side by side on the charts with hits that they had written or produced for other acts, Prince was represented on the charts by others as frequently as by himself: Prince's "Take Me with U" shared chart space with Sheena Easton's "Sugar Walls" (which Prince wrote and coproduced), as did his "Kiss" with the Bangles' "Manic Monday" (which he wrote) and his "Pop Life" with Sheila E.'s "A Love Bizarre" (on which he sang). "Kiss" even provided middle-aged lounge singer Tom Jones with his first hit in more than a decade, when Jones teamed with the deconstructionist pop group the Art of Noise to cover the song in 1988.

New Wave, Old Wave
Despite the popularity during the early 1980's of the punk-derived style known as New Wave music, the decade's most consistently lucrative stars took little if any of their inspiration from social discontent. Prince, for all of his raunchy notoriety, scored his biggest hits the way performers had for decades, with songs that redirected his obsession with sex until it was possible (if not always easy) to overlook. Michael Jackson, on the other hand, deemphasized erotic content altogether, focusing instead on a combination of rock, funk, and soul that not only highlighted his dancing but also for a time made his 1982 album Thriller the best-selling album of all time. Meanwhile, although Michael's sister Janet Jackson would acquire a very erotic image in the 1990's, her 1980's output mirrored her brother's in its emphasis on danceability, as was highlighted by her decision to title her sextuple-platinum 1989 album Rhythm Nation, 1814. The socially conscious music of Bruce Springsteen and Irish quartet U2 did not so much downplay hedonism as eschew it.
Springsteen became a symbol, at times almost a caricature, of the earnest, hardworking, blue-collar common man, whether sketching solo, stark, acoustic parables on Nebraska (1982); creating celebratory, full-band anthems on Born in the U.S.A. (1984); or dissecting the difficulties of marriage on Tunnel of Love (1987). So wholesome was his image that in 1984 President Ronald Reagan cited him as a positive role model for American youth. (It was an endorsement that Springsteen, who donated generously to Democratic Party-supporting unions, vigorously repudiated.) To a large extent, Bon Jovi, John Cougar Mellencamp, Bob Seger, Bryan Adams, and Tom Petty and the Heartbreakers plowed similar terrain. U2 added a boldly religious twist to their social conscience, which was every bit as sensitive as Springsteen's. The group's first platinum album, War (1983), not only mentioned Jesus (in "Sunday Bloody Sunday") but also contained a song taken directly from Psalm 40 ("40"). As if to drive home the Gospel element of "I Still Haven't Found What I'm Looking For," one of two number-one singles from the Grammy-winning 1987 album The Joshua Tree, the group rerecorded the song with a gospel choir for the sound track of its 1988 documentary film Rattle and Hum. If the rest of the decade's superstars dealt mainly with less weighty matters, they did so in a distinctively inoffensive way. The music of Billy Joel's most popular 1980's album, An Innocent Man (1983), was rooted firmly in street-corner doo-wop, and its lyrics were generally interpreted to have been inspired by his marriage to supermodel Christie Brinkley. In 1984, the hard-rock veterans of the band Foreigner abandoned the image of macho aggression that they had cultivated for years and scored the biggest (and only number-one) hit of their career with "I Want to Know What Love Is." Phil Collins, who had spent the 1970's embodying somber pretension as the drummer and eventual lead singer of Genesis, reinvented himself, both solo and with Genesis, as a purveyor of upbeat pop ("Sussudio" and "Don't Lose My Number") and romantic ballads ("Against All Odds" and "Groovy Kind of Love"). Journey and its lead singer Steve Perry, who like Genesis entered the decade saddled with a 1970's progressive-rock reputation, shed their past in a similar fashion. When Huey Lewis and the News declared it "Hip to Be Square" in 1986,
they summarized a great deal of the pop-music sentiment that had by then taken hold. One notable exception to the decade's overridingly proper decorum was the Rolling Stones, whose members maintained their image as lecherous bad boys even as they entered middle age, most notably by prominently displaying a nude woman's body on the cover of their 1983 album, Undercover. Considered in the light of the Rolling Stones' debonair jadedness, the on- and offstage debauchery of the decade's two biggest hard-rock acts, Van Halen and Def Leppard, came off as reckless, if not downright juvenile, delinquency.
Material Girls
Another exception to the prevailing effort to cultivate a sense of good taste was Madonna, the most successful and controversial female performer of the decade. So routinely did she flout traditional morality and confound the stereotypes associated with women in pop music that she sometimes attracted attention just by recording music that did neither (such as her implicitly pro-life "Papa Don't Preach," her introspective ballad "Live to Tell," or her carefree love song "Cherish"). Although frequently criticized for her blatant sexuality (in the video for her 1986 hit "Open Your Heart," she portrayed a peep-show stripper whose onlookers included a boy who was clearly a minor), Madonna provoked the most outrage when she blended eroticism with an iconoclasm that many regarded as anti-Catholic. From her statement that she liked crucifixes because they had a "naked man" on them to her unorthodox deployment of Catholic imagery in the video for "Like a Prayer" (1989), she seemed intent not only on expressing her own sensuality but also on criticizing the root of any authority that would condemn her for doing so. Compared to Madonna, the decade's other top-selling women seemed tame. It would be many years before Whitney Houston became more notorious for drug abuse and Barbra Streisand for liberal political activism than for their wholesome, resolutely middle-of-the-road music. Olivia Newton-John scored her biggest 1980's hit with the erotically charged "Physical," but the main difference between her edgy technopop 1980's hits and her soft country-rock 1970's hits was stylistic. Pat Benatar, who had begun the decade as a tough hard-rocker, seldom played up her considerable sensual appeal, eventually easing so smoothly into soft pop with "We Belong" in 1984 that the song
hit number five, becoming one of her biggest hits. Cyndi Lauper became more identified with the sentimentality of her number-one hit "True Colors" (partly because it was used in a high-profile Kodak advertising campaign) than with her implicitly anticapitalist "Money Changes Everything" or her explicitly pro-onanism "She Bop." Donna Summer, whose orgasmic disco song "Love to Love You Baby" had made her the quintessential "bad girl" of the 1970's, became a born-again Christian, devoting songs on The Wanderer (1980) and She Works Hard for the Money (1983) to her newfound faith. The Revolution Televised Unquestionably, the popularity of music videos in general and MTV in particular played a very important role in establishing common ground for an otherwise fragmented audience. Radio stations remained local in their outreach, but MTV was viewed nationwide and broadcast its content twenty-four hours a day. Thus, whereas acts such as Whitney Houston and Wham! would once have had to build their fan bases incrementally, MTV enabled them to implant their sound and image simultaneously in the minds of millions, much as Elvis Presley and the Beatles had done on The Ed Sullivan Show in the 1950's and 1960's, respectively. The video phenomenon was not invulnerable to criticism. The accusation that bands and singers were becoming popular for their taste in hairstyles, fashions, and video directors—for reasons, in other words, that had nothing to do with their music—was often made to account for the popularity (and to dismiss the musical merit) of such visually provocative or cinematically creative performers as Boy George and Culture Club, Duran Duran, Thompson Twins, Thomas Dolby, A-Ha, and any number of latter-day heavy-metal acts (Twisted Sister chief among them). Implied in such criticism was the idea that MTV was devaluing the quality of popular music by making the music itself superfluous. In one sense, such fears proved unfounded. Of the decade's thirty-seven best-selling acts, the vast majority had either already established themselves as musicians or otherwise demonstrated their ability to captivate audiences without television. Like every major-label act of the 1980's (and like many of the decade's minor-label acts as well), Daryl Hall and John Oates, Stevie Wonder, Paul McCartney, REO Speedwagon, and Aerosmith produced videos, but most of them seemed like the glorified promotional
spots that they were, rather than becoming the driving force behind their careers. The well-established acts that did develop a reputation for uncommonly imaginative videos (including the Police, the Cars, David Bowie, and Dire Straits) ultimately proved no more durable or lucrative than those that did not. On the other hand, performers who began the 1980's with considerable commercial momentum but who had little visual appeal (such as Christopher Cross, Toto, Dan Fogelberg, Air Supply, and most of the 1980's country stars) found it difficult as MTV took hold to keep pace with their more photogenic competitors. Certainly, clever videos helped visually drab acts such as the Moody Blues and the Grateful Dead remain afloat despite both groups' consisting of members who were as old as the parents of MTV's target demographic. Charity Begins at Home Perhaps the best example of the boundary-dissolving nature of 1980's music was the ensemble that gathered in Hollywood's A&M Recording Studios on January 28, 1985, to record the anthem "We Are the World," written by Michael Jackson and Lionel Richie. Like "Do They Know It's Christmas?"—recorded several months earlier by an all-star cast of British pop stars calling itself Band Aid—"We Are the World" was intended to raise money to feed the famine victims of Ethiopia. Its roster of featured vocalists—who billed themselves as USA for Africa—constituted a Who's Who of the decade's most successful performers, including Jackson, Joel, Lewis, Perry, Springsteen, Diana Ross, and Tina Turner. It also included superstars of previous decades (Bob Dylan, Paul Simon, Dionne Warwick) and of other genres (Ray Charles, Willie Nelson, Al Jarreau, Kenny Rogers). That the racially integrated nature of the ensemble generated no commentary showed how far pop music and the culture it had helped shape had come in transforming themselves from an outlet for youth rebellion into a force for social good. "We Are the World" eventually sold seven million copies (four million as a single and three million more on the We Are the World album), and its video remained on MTV's heavy-rotation roster for weeks. By the summer of 1985, the popularity of both the Band Aid and the USA for Africa songs led to the bicontinental fund-raising concert Live Aid, which was broadcast live by MTV and generated nearly two million dollars in donations. If the question of whether the money actually alleviated any suffering
remains debated, the fact that the publicity directly benefited the performers who were involved has never been in doubt. At the very least, Live Aid illustrated the unprecedented extent to which the pop-music world was intent on establishing a morally impeccable beneficent image.
Hair Today, Gone Tomorrow
Like any decade, the 1980's had its share of one-hit wonders, short-lived successes, and fads. Despite dominating the radio and MTV airwaves in 1983 and 1984 with hits like "Karma Chameleon" and "Do You Really Want to Hurt Me," Culture Club and its flamboyant, cross-dressing frontman Boy George were commercially passé by mid-decade. The brevity of their popularity spelled failure for several acts that had tried to capitalize on the androgyny trend that Culture Club had popularized. To some extent, fashionable androgyny resurfaced in the heavy-metal bands that became briefly popular during the decade's second half and were labeled "glam rock" groups. Mötley Crüe and Poison, heavily made-up acts that in their more flattering photos could have been mistaken for women, released nine Top 40 hits between them, giving rise to a spate of major-label signings of other "hair bands" that for the most part proved less popular. By 1988, however, Mötley Crüe and Poison were eclipsed in sales, critical acclaim, and notoriety by Guns n' Roses, a determinedly non-effeminate band whose album Appetite for Destruction sold more than thirteen million copies and yielded the hit singles "Sweet Child o' Mine," "Welcome to the Jungle," and "Paradise City." Even Guns n' Roses turned out not to be immune to the drawbacks of overnight fame. By the mid-1990's—in keeping with one of rock and roll's hoariest traditions—the group had collapsed beneath the weight of drug abuse and internecine acrimony. Female groups, too, sometimes fell victim to the nation's fickle tastes. No sooner had the Go-Go's, who from 1981 to 1984 placed five hits in the Top 40, demonstrated the ability of an all-women band to hold its own than they were superseded by the Bangles. It would be well into the 1990's, however, before another all-female band followed in either group's footsteps, and even then the band that did, the Donnas, made far less of their moment than either the Bangles or the Go-Go's had. Although the consecutive late-1980's hit streaks first of Tiffany, then of
Debbie Gibson, seemed to portend a never-ending parade of chart-topping teenage females, both Tiffany and Gibson were long forgotten when Britney Spears and Christina Aguilera began their careers a decade later. One apparent fad, however, turned out to have staying power. When the old-school-rap classic "Rapper's Delight" by the Sugarhill Gang peaked at number thirty-six on the Billboard singles chart in 1979, few observers guessed that it marked the commencement of what would turn out to be the furthest-reaching black-pop revolution since Motown. By the time Tone-Loc reached number two on Billboard's singles chart with "Wild Thing" in 1988, singles and albums by rappers both African American (Run-D.M.C.) and white (Beastie Boys) were selling millions of copies, with subgenres within hip-hop itself (including the beginnings of "gangsta" rap in the recordings of Schoolly D and N.W.A.) rapidly proliferating. When Public Enemy's militantly political It Takes a Nation of Millions to Hold Us Back was voted the best album of 1988 in the Village Voice's influential "Pazz and Jop" critics poll, rap's transformation from an urban novelty to a cultural force became undeniable. Impact In establishing the commercial viability of numerous and ever-proliferating styles within the increasingly image-conscious context of MTV, the most influential musicians of the 1980's brought together a greater variety of musical and theatrical influences than the musicians of any previous decade. They thus suggested creative possibilities that subsequent generations of performers have continued to explore. If the 1980's marked the peak of a relatively homogeneous mass market to which diverse groups could hope to appeal across the board, it also marked the beginning of a truly viable multiplicity of markets, among which almost any group could find a comfortable—if no longer blockbusting—niche.
Further Reading
Banks, Jack. Monopoly Television: MTV’s Quest to Control the Music. New York: Perseus, 1996. Critical examination of MTV, its origins and development, and its effects on the music industry and popular culture. Christgau, Robert. Christgau’s Record Guide: The ’80’s. New York: Pantheon, 1990. Authoritative and exhaustive collection of brief but informative and opinionated reviews of more than two thousand of the decade’s albums.
Hahn, Alex. Possessed: The Rise and Fall of Prince. New York: Watson-Guptill, 2003. Detailed biography that relies heavily on input from those who have worked on the periphery of Prince's music. Jones, Jel D. Lewis. Michael Jackson: The King of Pop. Phoenix, Ariz.: Amber Communications, 2005. Biography that attempts to explain and, at least implicitly, excuse the controversial eccentricity of Jackson as a superstar adult by focusing on his childhood experiences. Marcus, Greil. In the Fascist Bathroom: Punk in Pop Music, 1977-1992. Cambridge, Mass.: Harvard University Press, 1999. Postmodernist examination of the effects of the 1970's punk-rock explosion on the music of the subsequent decade. Marsh, Dave. Bruce Springsteen, Two Hearts: The Definitive Biography, 1972-2003. New York: Taylor and Francis, 2003. Compilation of critic Marsh's two previous Springsteen biographies (Born to Run and Glory Days) with additional material focusing on the years after their publication. Summers, Andy. One Train Later: A Memoir. New York: Thomas Dunne Books, 2006. Autobiography of the guitarist of the Police that is rich in behind-the-scenes detail. Taraborrelli, J. Randy. Madonna: An Intimate Biography. New York: Simon & Schuster, 2001. Unauthorized biography of the most successful female performer of the 1980's. Arsenio Orteza See also Boy George and Culture Club; Duran Duran; Heavy metal; Hip-hop and rap; Jackson, Michael; Journey; Live Aid; Madonna; Mellencamp, John Cougar; Michael, George; MTV; Music; Prince; Richie, Lionel; Springsteen, Bruce; Sting; USA for Africa; U2; Van Halen.
■ Pornography Definition
Sexually explicit material intended for the purpose of sexual arousal
As video and other forms of new media began to enter society in the 1980’s, pornography entered an era of unprecedented accessibility, even as it became substantially easier to produce. The proliferation of pornography increased its controversial nature, as advocates of free expression collided with champions of decency and feminist activists.
During the 1980's, pornography came in a variety of forms such as movies, videos, books, magazines, paintings, audio recordings, and photographs. First Amendment advocates, such as pornography publisher Larry Flynt, argued that protecting pornography was vital to democracy. Pornographers and their supporters believed pornography was much more than just sexually explicit material and could be used to express discontent and to convey unpopular ideas, forms of speech that were protected by the First Amendment. Critics of pornography, however, did not share this view. Religious groups and some feminists opposed pornography and supported laws that restricted or banned it. Religious groups believed pornography was immoral, and they discouraged their members from viewing or reading pornography. Some feminists, such as Andrea Dworkin and Catharine MacKinnon, argued that pornography violated the civil rights of women, because it dehumanized them and could be linked to sex discrimination, rape, and abuse. Regulation of Pornography Dworkin and MacKinnon helped cities, most notably Minneapolis and Indianapolis, draft antipornography ordinances. These laws would allow women to sue producers and distributors of pornographic material if the women could prove that they had been harmed by that material. In 1986, the U.S. Supreme Court ruled that the Indianapolis law and others like it were unconstitutional. Most pornography was defined by the Court as "indecent," rather than "obscene." Obscene materials had no First Amendment protections, but only the most extreme pornography fell into that category. Indecent material, on the other hand, could not be banned—although it could be regulated. In another 1986 decision, the Court upheld laws that allowed cities and towns to use zoning laws to keep adult movie theaters away from homes, schools, churches, and parks. In 1988, pornographers began providing sexually oriented telephone services known as "dial-a-porn." Congress immediately passed a law making "dial-a-porn" illegal. In 1989, the U.S. Supreme Court overturned the federal law, because it violated the free speech rights of pornographers. The Court said that the "dial-a-porn" industry should be regulated to protect minors, but it could not be outlawed altogether, because banning "dial-a-porn" would deny adults access to this sexually oriented telephone service.
Congress then passed a different law that required telephone companies to block "dial-a-porn" services unless they received written authorization from the subscriber. While adult access to pornography was constitutionally protected, children were to be protected from indecent material. The selling of hardcore pornography was limited at first to adult bookstores, mail-order companies, and pay television channels that parents could control. As home video took off, however, pornographic videos became much more easily accessible through local rental stores. Individuals had to be at least eighteen years old before they were allowed to enter pornographic stores, and video stores had to segregate their pornographic material from their other videos in an "adults-only" area. Other stores that sold pornography displayed pornographic magazines partially covered. Making pornography accessible to minors was illegal, as was creating pornography that featured children under the age of eighteen. Making or trafficking in child pornography was a felony. In 1984, U.S. president Ronald Reagan appointed a commission, headed by Attorney General Edwin Meese III, to study pornography. In July, 1986, the Report of the Attorney General's Commission on Pornography (also called the Meese Report) concluded that pornography was harmful to society. This conclusion was the opposite of the conclusion reached by a 1970 presidential commission that had found no proof that pornography caused crimes. The Meese Report was highly critical of pornography, and the report itself was criticized for being biased and inaccurate. Also during the 1980's, convenience stores across the country were ordered to remove from their shelves men's magazines such as Playboy, Hustler, and Penthouse. A federal judge later overturned this order, ruling that the removal of pornographic magazines from stores was a form of censorship. Antipornography activists also launched attacks against the National Endowment for the Arts (NEA) for funding museums that displayed the photos and other works of art that the activists believed to be pornographic. Impact During the 1980's, efforts to ban pornography failed. Antiporn activists believed pornography should be illegal, because it was immoral, was addictive, and dehumanized women. Supporters argued
that pornography was a form of free speech and should be protected under the First Amendment. Throughout the 1980's, the Supreme Court consistently ruled that indecent material—unlike obscene material—could not be withheld from adults, but laws could be passed to regulate pornography as a way to keep it out of the hands of children and away from neighborhoods, churches, and schools.
Further Reading
Cornell, Drucilla, ed. Feminism and Pornography. New York: Oxford University Press, 2000. This collection of essays explores the wide range of issues about pornography from a variety of viewpoints. Harrison, Maureen, and Steve Gilbert, eds. Obscenity and Pornography Decisions of the United States Supreme Court. Carlsbad, Calif.: Excellent Books, 2000. Overview of major obscenity and pornography decisions by the U.S. Supreme Court, explained in nonlegal language for the general reader. MacKinnon, Catharine. Only Words. Cambridge, Mass.: Harvard University Press, 1993. Written by one of the nation's foremost proponents of feminist legal theory, this book is a collection of essays that are strongly opposed to pornography and its protection under the First Amendment. Slade, Joe W. Pornography in America: A Reference Handbook. Santa Barbara, Calif.: ABC-CLIO, 2000. Comprehensive history of pornography in the United States that includes biographies of pornography's supporters and opponents. It also is available in an electronic format. Strossen, Nadine. Defending Pornography: Free Speech, Sex, and the Fight for Women's Rights. New York: New York University Press, 2000. The president of the American Civil Liberties Union and self-proclaimed defender of pornography argues that free speech has always been a strong weapon to fight sex discrimination and refutes the position of antipornography feminists that all pornography is inherently degrading to women. Eddith A. Dashiell See also
Domestic violence; Dworkin, Andrea; Feminism; Flynt, Larry; Koop, C. Everett; Meese, Edwin, III; Parental advisory stickers; PG-13 rating; Sexual harassment; Swaggart, Jimmy; Women’s rights.
■ Post office shootings The Event
Several mass shootings in post offices related to job anxieties
In the 1980’s, violence in post offices seemed to illustrate the new stressful relationship between employees and supervisors, as well as increased job demands for postal workers. After a mass shooting in 1986, violent eruptions in post offices were nationally reported and scrutinized. Congress assumed in 1971 that the newly organized U.S. Postal Service (USPS) would pay for itself by 1984. In 1973, the USPS lost $13 million. It lost $438 million in 1974 and $1.2 billion in 1975. Stamps cost 6 cents in 1971 but 13 cents by 1977, despite 55,000 fewer postal workers. In 1977 Congress voted a billion dollars to keep the USPS solvent. Worker productivity was low. Mail volume declined due to competition from United Parcel Service (UPS) and private mail carriers. There was talk of ending Saturday mail delivery. The Carter administration and Congress increased the work expectations of USPS employees, and USPS job pressures grew. Post Office Angst in the 1980’s In 1980, a New Orleans postal worker named Curtis Collins killed his supervisor with a .30 caliber carbine. Amid the media scrutiny of the incident, Hugh Bates, president of the National Association of Postmasters, said threats of violence were common in post offices across the nation. Bates said that two postmasters in Alabama had been killed in the last ten years. In August, 1986, Patrick Sherrill, a part-time letter carrier in Edmond, Oklahoma, walked into his post office and murdered thirteen persons, then killed himself. An ex-Marine and member of the National Guard, Sherrill had been repeatedly censured by superiors for bad job performance and may have been close to being fired. He was verbally reprimanded the day before the shootings. Sherrill brought three pistols to work and killed whoever he saw, chasing some victims down. Sherrill was a loner, described as “weird” and “angry” by fellow workers. He was fascinated by guns and peeped into neighbors’ windows at night. He was unhappy in Edmond and twice passed tests to be transferred, but these tests had not been acted upon. Two weeks after the Sherrill killings, six Dallasarea postal workers were suspended and ordered to undergo psychiatric evaluations after they made
threats. There were similar occurrences in Oklahoma and Arkansas. In May, 1987, an Oklahoma City mail carrier threatened several people, mentioned the Sherrill massacre, and was arrested by federal agents for possession of three mortar grenades. In December, 1988, mail handler Warren Murphy shot three fellow workers with bird shot in New Orleans, then held a woman hostage for thirteen hours. None of the victims sustained life-threatening wounds. Murphy had recently been promoted. In August, 1989, postal worker John Merlin Taylor shot his wife to death, then drove to work in Escondido, California, where he killed two coworkers and wounded another, then shot himself in the head. Taylor had worked for the post office for twenty-seven years and had won work awards. He had talked about the Sherrill shootings two days earlier. Two weeks later, another twenty-seven-year veteran postal worker from Escondido hanged himself in his garage; after amassing twenty-five hundred hours of sick leave, he had been reprimanded for taking a sick day just before he left on vacation. The USPS said that the rate of violence within the postal service was no greater than it was in other professions. From 1958 to 1989, there were 355 instances of assaults on supervisors and 183 instances of supervisors assaulting employees in post offices across the country.
This statue was erected in 1989 to commemorate the fourteen victims of Patrick Sherrill's August, 1986, post office shooting spree in Edmond, Oklahoma. (AP/Wide World Photos)
War of Words Vincent Sombrotto, president of the National Association of Letter Carriers, told reporters that Patrick Sherrill had no doubt been pushed over the brink by the management style of his superiors, which had been irresponsible and coercive. Beryl Jones, president of the Oklahoma City Postal Workers Union, agreed, saying there was intimidation and pressure and Sherrill just snapped. The USPS's communication administrator said Sherrill had not been close to being fired but was only given an elementary counseling session. The president of the National Association of Postal Supervisors said Sombrotto's comments blamed postal supervisors for the actions of a disturbed individual. Letter carriers around the country wrote to newspapers, agreeing with Sombrotto that employees were routinely harassed by supervisors. Arguments of this sort erupted with each post office shooting.
Impact In 1987, lawsuits totaling $166 million were filed against the Edmond post office, the Air Force, the Army, the City of Edmond, and the Edmond Police Department, by families of Patrick Sherrill’s victims. Most of these lawsuits were dismissed by 1989. In 1988, a General Accounting Office investigation found that the job histories of 63 percent of those hired by the USPS were not checked, a violation of government rules. The USPS replied that, under government rules, it was almost impossible not to hire veterans such as Patrick Sherrill for civil service jobs regardless of their work history. Spree killings of all sorts continued in the United States into the 1990’s.
Further Reading
Douglas, John, and Mark Olshaker. The Anatomy of Motive. New York: Pocket Books, 1999. A famous FBI profiler, Douglas gives insights into post office killers in his chapter "Guys Who Snap." Lasseter, Donald. Going Postal. New York: Pinnacle Books, 1997. Study of post office violence in the United States. Blames USPS practices for creating a stressful workplace. Pantziarka, Pan. Lone Wolf. New York: Virgin Books, 2002. A British mystery writer describes and explains real-life spree killings in the United Kingdom and the United States in a readable style. James Pauff See also
Business and the economy in the United States; Crime; Goetz, Bernhard; Income and wages in the United States; Inflation in the United States; Reaganomics; Recessions; San Ysidro McDonald’s massacre; Stockton massacre; Unions.
■ Power dressing Definition
Wearing formal professional clothing in order both to feel and to appear competent and powerful
Power dressing became particularly important for women during the 1980's as they entered corporate America in ever-greater numbers and attempted to break the glass ceiling. However, businesswomen were not content to follow limited formulas for power dressing, and they looked for ways to modify the business "uniform" while maintaining a professional appearance. Recognizing the needs of a new generation aspiring to enter the business professions, John Molloy prescribed a "uniform" of appropriate dress for men (1975) and women (1978). His rules were widely adopted by men in the 1980's; women also followed his advice, though with significant modifications. Appropriate business attire was established for men in the nineteenth century. Molloy in the 1970's adapted those fashions for new college graduates in the late twentieth century, prescribing conservative dress of good manufacture and fit. The business suit was the basic uniform, and it was to be worn in a limited range of colors (navy, black, gray, or brown) and modestly accessorized with a white shirt and a tie. Business dressing for women was a newer and more
problematic issue. Women re-entered higher education in the 1960's and 1970's and took their place beside men in business and the professions in the 1970's and 1980's. Earlier women's fashion had been designed to display the social position of their husbands, not themselves. In the late 1970's, Molloy adapted his conservative male "uniform" for the female business aspirant: She was to wear a skirted suit, soft blouse, and a feminine version of the tie, floppy and bowed. This too became a standard that, with modifications, continued in effect for decades. Women, however, were less content with the limitations of Molloy's prescription than were men. Some felt masculinized in such business attire, and many resisted the sense of wearing a uniform. Accustomed to more pronounced and rapid changes in fashions, many women found the rules boring and their duration interminable. Women sought out other models of fashions that combined power and femininity. One such model was the United Kingdom's Princess Diana. Throughout the decade, television series such as Dynasty, Designing Women, and Moonlighting also portrayed alternative modes of power dressing: The powerful women on these shows wore daywear that included skirted suits and jacketed dresses with slim, broad-shouldered silhouettes, a much wider range of textures and colors, and stronger accessories. They also wore opulent evening wear. Especially emblematic of power dressing for women were large shoulder pads and wide lapels. By the end of the decade, the movie Working Girl (1988) served as a fashion handbook for getting ahead. Business dressing was at base antifashion. It was "investment dressing": classic, well made, not responding to fads. However, fashion responded. For men, designers such as Armani made available both couture and ready-to-wear power suits, while the ready-to-wear lines of Jones New York and Liz Claiborne dressed America's businesswomen. Impact In the 1980's, appearance was an important part of performance. Iterated in the 1970's and reiterated in popular media throughout the 1980's and beyond, power dressing provided rules that helped both men and women dress with confidence in business environments.
Further Reading
Davis, Fred. Fashion, Culture, and Identity. Chicago: University of Chicago Press, 1992.
Johnson, Kim K. P., and Sharron J. Lennon, eds. Appearance and Power. Oxford, England: Berg, 1999. Welters, Linda, and Patricia A. Cunningham, eds. Twentieth-Century American Fashion. Oxford, England: Berg, 2005. Jean Owens Schaefer See also Advertising; Business and the economy in the United States; Designing Women; Dynasty; Fashions and clothing; Feminism; Hairstyles; Leg warmers; Moonlighting; Preppies; Television; Women in the workforce.
■ Preppies Definition
People who fit the stereotypical description of students and graduates of prestigious preparatory schools
The stereotypical culture of college-preparatory schools became an influence upon and focus of mainstream American culture during the 1980’s. Preppies dressed and acted conservatively, and the preppy look and manner came to stand as a conservative alternative to the more outlandish fashions of the decade. The term “preppy” received widespread recognition after Ali McGraw’s character applied it to Ryan O’Neal’s character in Love Story (1970). It quickly caught on as both an identifier of a certain class of privileged young people and, for those who disparaged them, a derogatory label for a snob. The term gained renewed currency in 1980, when The Official Preppy Handbook was published. Similar to books cataloging the features of mythical creatures such as gnomes, the book used illustrations and lists to explain the characteristics and behavior of the modern preppy. The most innocent usage of the term “preppy” meant a student or alumnus of a private collegepreparatory school, such as Phillips Exeter Academy, Andover, or Emma Willard School in the Northeast, or the Altamont School, Baylor School, or Foxcroft Academy in the South. Those schools’ alumni seemed able to recognize one another, even if they were graduates of different years or different schools. Used negatively, the term referred to those thought to display “superior” attitudes of ennui, sarcasm, exclusivity, and an excessive enthusiasm when around their peers.
Outstanding Characteristics
Preppies were associated with moneyed families, usually “old” money, and were the product of selective “good” breeding, with couples matched to each other with careful thought and planning. Their children were raised with instruction in good manners and good taste. When of school age, the children were sent to day schools or boarding schools, such as the Chapin School or the Dalton School in New York, Deerfield Academy or Groton in Massachusetts, or the Hotchkiss School in Connecticut. By the time they were ready for college, they had experienced extensive travel on at least one continent other than North America and had a fair to excellent fluency in at least one language other than English. A preppy was likely to attend the same college or university as had his or her grandparents, parents, uncles, and aunts. High on the list of preferred schools were Harvard, Yale, Princeton, Dartmouth, Bowdoin, and lesser-known preppy schools such as Colorado College and Trinity. The most often remarked feature of preppies was their style of dress. They tended to follow classic American fashion, wearing clothes made of natural fiber fabrics, with simple lines and fine workmanship. The Brooks Brothers suit was the uniform for business and formal attire for men, and it was worn with button-down oxford cloth shirts in pastel shades. More casual wear included cuffed khaki pants, Izod Lacoste polo shirts with their distinctive alligator icon, and topsiders (sailing shoes) worn without socks. Female preppies wore ribbons in their hair, headbands, cardigan sweaters, and pearls. Both sexes seemed partial to the color combination of pink and green. It was this style of dress that made preppies most significant in American culture, because it was easily imitated. As a result, over the course of the 1980’s, “preppy” ceased to refer solely to attendees of elite schools and came instead to refer to anyone who dressed in similar clothing or cultivated the haughty and overly cultured attitudes associated with the preppy stereotype. Preppies spent their leisure time or vacations in Nantucket or Martha’s Vineyard, for instance, a tradition of several generations. Because they were usually in good physical condition, they were often avid sportsmen and -women, indulging in skiing, tennis, golf, swimming, squash, horseback riding, and boating. Preppies hoping to impress a potential boss or future fiancé strove to participate in at least the trials
of the America's Cup races, the ultimate display of expert boatmanship. Impact During the 1980's, class became a significant topic of discussion both in the media and in popular culture. Some of the most popular teen films of the decade were written or directed by John Hughes, who often produced stories about the fraught relations between teens of different classes. As the Ronald Reagan administration successfully rejuvenated the U.S. economy, class disparities became more evident than ever, as poor and rural people failed to benefit from an economy that was making others wealthy. Moreover, as Reagan's conservatism both mirrored and increased that of the country at large, many subcultures and countercultures arose, and many of them featured outlandish styles of dress. The image of the preppy in American society became laden with meaning during the decade. For some, it was a signifier of the detachment and simple cluelessness of the wealthy in the face of poverty. This version of the stereotype received one of its most famous portrayals by Dan Aykroyd in Trading Places (1983). For others, preppy styles represented a palatable mode of dress in the face of punks, goths, and Madonna fans who wore their lingerie outside their clothes. Preppies' association with old money complicated their class significance still further, as they stood in opposition to the rise of the young, newly rich yuppies. Meanwhile, actual preppies had to negotiate all of these positive and negative associations.
Further Reading
Birnbach, Lisa, ed. The Official Preppy Handbook. New York: Workman, 1980. The definitive (and humorous) book describing all aspects of the preppy lifestyle, from birth to married adulthood. Photos, drawings, and lists are included, as well as a lexicon of preppy words and phrases. Fussell, Paul. Class: A Guide Through America’s Status System. New York: Dorset Press, 1992. Deals with the “visible and audible signs of social class” reflected by choice, not race, religion, or politics. Walker, Matt “Johnny,” and Marissa “Mitzy” Walsh. Tipsy in Madras: A Complete Guide to 80’s Preppy Drinking. New York: Berkley, 2004. Commentary on how to drink in preppy fashion, with recipes for “classic” preppy drinks, and on literature and customs to help the reader identify a preppy. Jane L. Ball
See also
Business and the economy in the United States; Fashions and clothing; Golf; Hairstyles; Hobbies and recreation; Hughes, John; Reaganomics; Tennis; Yuppies.
■ Prince
Identification African American recording artist
Born June 7, 1958; Minneapolis, Minnesota
During the 1980's, the multitalented Minneapolis-based musician Prince recorded several albums and singles that were both wildly successful and critically acclaimed. He experimented with different styles and genres and pushed the boundaries of acceptable subject matter in song lyrics. He also displayed a highly idiosyncratic and eccentric persona. Prince had already begun to make inroads as an R & B musician when he released his third album, Dirty Mind (1980), which he wrote, produced, and played almost entirely solo. He mixed rock, funk, pop, and New Wave and topped them off with his distinctive, falsetto vocals. The album also contained unusually sexually explicit lyrics. Dirty Mind was the first Prince album to capture the attention of prominent music critics. The album employed a minimalist approach to popular music—coming at a time when a great deal of mainstream rock and R & B relied on ornate production values—that would prove to be highly groundbreaking and influential. Prince's 1982 double album, 1999, gave him his first sizable hit singles, the anthemic title track and "Little Red Corvette." Both songs were especially popular on MTV and rock radio, helping break the barriers that often kept African American music segregated from the mainstream. Prince continued to hone his sound, experimenting with synthesizers and drum machines. "Baby, I'm a Star" Prince's next project raised his profile considerably higher. He starred in the film Purple Rain (1984) and also wrote and performed the film's sound track with his new backing band, the Revolution. The film was popular, but the accompanying album brought Prince his greatest critical and commercial acclaim to date. He had his first number-one hits with "When Doves Cry" and "Let's Go Crazy," and the album eventually spent twenty-four weeks at number one on the Billboard album chart and sold over ten million copies in the United States.
Prince also won an Academy Award for the sound track. Purple Rain officially marked Prince's arrival in the mainstream music scene as one of the top artists of the day, joining the ranks of Michael Jackson, Madonna, and Bruce Springsteen, but Prince was much more eccentric than most other popular musicians. Short, thin, and frequently dressed in colorful clothes, Prince was also soft-spoken and rarely gave interviews to the press. The Revolution was a distinctive backing band—a racially mixed group of men and women, also wearing bright, eye-catching fashions. Prince's notoriety increased further when Tipper Gore, the wife of U.S. senator Al Gore of Tennessee, noticed her daughter listening to one of the racier Purple Rain songs, "Darling Nikki." She was disturbed to discover that the song's lyrics referred to the title character masturbating. Gore's response was to form, with several other senators' wives, the Parents' Music Resource Center (PMRC). In 1985, the group held a number of highly publicized, controversial hearings in Washington, D.C., regarding its concerns about the explicit content in popular music. Prince's Purple Rain marked the peak of his popularity, but his 1985 release Around the World in a Day yielded two more hits, "Raspberry Beret" and "Pop Life." He returned to film in 1986, starring in and directing the film Under the Cherry Moon. The film was a commercial and critical flop, but its accompanying sound track album, Parade, was more successful, and it contained one of his biggest hits, "Kiss." By this time, Prince had written or produced hit songs for Chaka Khan, the Bangles, Sheena Easton, Vanity Six, Apollonia Six, and the Time. During the recording sessions for the follow-up to Parade, Prince broke up the Revolution. The resulting double album, Sign o' the Times (1987), was hailed by many critics as his masterpiece.
Prince in concert in Cincinnati, Ohio, in 1985. (AP/Wide World Photos)
Declining Fortunes Prince's next move confused many people. At the last minute, he decided not to release his completed follow-up to Sign o' the Times, entitled The Black Album. In its place, he released Lovesexy (1988). The album was less successful than his past two releases had been, and his decision to shelve an entire album confused many fans. (The Black Album was heavily bootlegged. Prince would eventually release it in 1994.) In 1989, Prince released his sound track to the summer blockbuster Batman, although only two of his contributions were heard in the film. The song "Batdance" was a number-one hit, but as the decade came to a close, Prince was no longer at the cutting edge of popular music. Another African American music form, hip-hop and rap, took Prince's place in the vanguard of popular music and upped the ante on controversial subject matter as well. Impact Prince was one of the most influential and groundbreaking popular musicians of the 1980's. His best songs and albums became classics. He influenced many artists, including his contemporaries, such as Janet Jackson, George Michael, and Terence Trent D'Arby, and subsequent music makers who grew up on his music, such as Outkast, Justin Timberlake, and D'Angelo.
Further Reading
Hahn, Alex. Possessed: The Rise and Fall of Prince. New York: Billboard Books, 2003. This exposé looks into how and why Prince went from his highly acclaimed 1980's success to his subsequent loss of popularity. Jones, Liz. Purple Reign: The Artist Formerly Known as Prince. Secaucus, N.J.: Carol, 1998. This biography features a rare interview with Prince. Also of note are reviews of every Prince album to that point from newspapers and music magazines. Nilsen, Per. Dancemusicsexromance: Prince, the First Decade. London: Firefly, 1999. This biography was written by the editor of the long-running Prince fan magazine Uptown. Michael Pelusi See also Academy Awards; African Americans; Film in the United States; Hip-hop and rap; MTV; Music; Music videos; New Wave music; Parental advisory stickers; Pop music; Synthesizers.
■ Prozac
Definition Psychiatric drug
Manufacturer Eli Lilly Corporation
Date Approved by the Food and Drug Administration on December 29, 1987
Prozac was the first of a new class of drugs to be used to treat depression in the United States. By affecting the rate at which the neurotransmitter serotonin is reabsorbed from synapses, or gaps between nerve cells, the drug could be used to reestablish a desirable chemical balance in the brain.
As a new class of drug known as a selective serotonin reuptake inhibitor (SSRI), Prozac (fluoxetine) was introduced as a healthier alternative to existing antidepressants on the market in the late 1980's. These other drugs, including monoamine oxidase inhibitors (MAO inhibitors) and tricyclics, were known to have more serious side effects and were thus considered more dangerous than was Prozac. Prozac was created for the Eli Lilly Corporation, one of the world's largest pharmaceutical companies, by a team led by Ray Fuller. Originally called Lilly 110140, Prozac was renamed by a marketing team once it was ready for sale. The brand name was a simple combination of the positive-sounding prefix "pro" and a neutral suffix.
Prozac was marketed as a green and yellow capsule (later available as a pill as well) with a normal introductory dosage of ten milligrams. The dosage was often increased over time, as it could take up to four weeks for the drug to provide its full effect and for the effectiveness of a given dosage to be evaluated for a given patient. Prozac was recognized as having minor physical side effects, but it also had psychological side effects that made it controversial. The drug was extremely popular upon its initial release, not only because of its effectiveness in treating depression, but also because of an extensive marketing campaign launched by Eli Lilly. Indeed, the advertising blitz accompanying the drug's launch was the most extensive pharmaceutical campaign of the decade. During the first week of production, over 400,000 prescriptions were filled for Prozac, which the media and some researchers referred to as a "wonder drug." The media, however, also began to consider the demand for such a drug and the consequences of its availability. As a result, it began to employ the term "Prozac Generation" to describe the combination of stress, depression, self-awareness, and open discussion of mental illness that characterized the late 1980's and early 1990's. Eli Lilly's advertising campaigns and studies regarding Prozac were later subject to criticism. An internal 1984 study by Eli Lilly indicated that at least 1 percent of Prozac users reported increased suicidal thoughts, and studies in 1985 concluded that patients who tested the drug experienced an increase in violent behavior. As these findings were later addressed and brought to the attention of the public and the civil courts, Eli Lilly denied any links between Prozac and suicide. Notwithstanding later revelations and controversy, Prozac's initial arrival in the late 1980's was considered a breakthrough in the treatment of depression, as well as of many other mental disorders. As a result, it became one of the most successful psychiatric drugs in history. Moreover, the advertising campaign associated with Prozac had the side effect of reducing the shame associated with mental illness. The drug and its media representations combined to change the way society addressed depression, and they altered individuals' assessments of personal feelings and emotional restrictions.
Impact As the first of the SSRIs, Prozac became a product and a term that bridged psychiatry and the
mainstream public. Prozac's approval led to the creation of many other antidepressant drugs used for the treatment of not only depression but also other mental disorders, and SSRIs came to be considered the number one treatment for depression.
Further Reading
Chambers, Tod, and Carl Elliott, eds. Prozac as a Way of Life. Chapel Hill, N.C.: University of North Carolina Press, 2004. Healy, David. Let Them Eat Prozac: The Unhealthy Relationship Between the Pharmaceutical Industry and Depression. New York: New York University Press, 2004. Wurtzel, Elizabeth. Prozac Nation: Young and Depressed in America. Boston: Houghton Mifflin, 1994. Jean Prokott See also Health care in the United States; Medicine; Psychology; Slang and slogans.
■ Psychology Definition
The science of human behavior and mental processes and the application of this science
During this decade, advancing brain science, sophisticated large-scale genetic research, and disappointments about earlier social programs dampened overconfidence in the transformative power of social environments and instigated new respect for the role of individual people, each with a unique biological, social, and genetic past. With new respect for the biology of behavior, clinical psychology moved closer to becoming a medical field. For the decades preceding the 1980's, prevailing models of psychology emphasized the control of plastic, easily molded individuals by a powerful environment. American behavioral psychologist B. F. Skinner had long maintained that a proper reinforcement procedure could train obedient children as well as obedient dogs. An earlier generation of psychologists implicated the bad mothering of children as a major cause of mental illness. Post-1965 psychologists participated in a movement to improve society by designing "Great Society" social programs expected to eliminate poverty and crime through better socialization and educational practices.
Personal Traits and Genetic History
Research during the 1980's supported the proposition that the personal characteristics (traits) of individuals exert a powerful influence. A major contribution of this decade was the development of an agreed-upon basis for describing personality. Before 1980, each personality theory had its own list of trait descriptions. During the 1980's, massive studies measured and compared traits and clustered similar traits into more comprehensive factors. Evidence from these studies suggested that personality could be approximated best by locating a person on five summary dimensions. These dimensions, referred to as "supertraits," were extraversion (assertive sociability), neuroticism, conscientiousness, agreeableness, and openness (curiosity and tolerance). Paul Costa and Robert McCrae showed that most people in a variety of cultures retained stable scores on these characteristics throughout their adult lives. Individuals who showed differences on these trait characteristics both responded differently in the same environment and, given the freedom to do so, selected different environments. Several lines of research converged in the 1980's to support the proposition that these supertraits had genetic roots. Even six-month-old infants showed reliable differences in social smiling and babbling (extraversion) and in fussiness (neuroticism). The similarity between genetically identical twins reared by different parents and between adopted children and their absent natural parents was shown to be substantial. Such similarities could be due only to heredity. Evolutionary psychologists argued that these supertraits became important over the centuries because they were adaptive, providing useful information to people in a variety of cultures as they selected allies and leaders. During the decade, it became apparent that major types of psychopathology were also linked to genetic predispositions and mediated by brain conditions. The brains of patients suffering the bizarre thinking of schizophrenia differed from normal brains in the concentration of chemical substances on fibers transmitting nerve impulses. Twin and adoption studies suggested, furthermore, that hereditary predispositions contributed heavily both to schizophrenia and to those bipolar patients experiencing cycles of depression and elation. There also appeared to be a genetic component predisposing some individuals to alcoholism and others to delinquency.
Changes in Psychological Services
Consistent with the renewed respect for the biology of behavior, clinical psychology moved closer to an identity as a medical discipline. The medically oriented treatment of emotional disturbances by antianxiety, antipsychotic, or antidepressive medication became widespread, almost standard. Psychotherapy was often deemed advisable as well. In the 1980's, most psychotherapy was paid for and controlled by third parties such as health maintenance organizations (HMOs). HMOs demanded clear treatment plans oriented to fast and measurable results. Many hospitalized patients received medication without follow-up therapy. An unfortunate spin-off of this medicalization was that a large number of institutionalized patients, once medicated, became sufficiently free of flamboyant symptoms to be released. Still passive, withdrawn, without friends or incomes, and often discontinuing medication, many former patients became resigned to living on the streets, thus swelling the ranks of the homeless. The cognitive psychotherapy of Dr. Aaron Beck became especially popular. Beck assumed that underlying such symptoms as phobias, depressions, and compulsions were hidden, irrational assumptions. Cognitive therapists, therefore, shared with Freudian psychoanalysts the conviction that patients suffer from illusions. Unlike psychoanalysts, however, cognitive therapists were forceful and direct in encouraging changes in irrational thought patterns. The therapist would first explore with the patient his or her irrational belief system; then the therapist would assign homework confronting the patient with life situations testing the validity or exposing the invalidity of these beliefs. In the 1980's, the most common presenting problems of patients shifted. Youth with learning disorders, especially those with difficulties in sustaining attention, were seen with greater frequency. Because of demographic changes, psychologists grew more concerned with the problems of single and divorced adults, blended households, custody disputes, two-wage-earner households, and the elderly. Depressions among adolescents and young adults became more common. Programs dealing with substance dependency grew apace. As use of fixed sentences for
criminal acts expanded, programs aimed at rehabilitating offenders contracted. After John Hinckley, Jr., who attempted to assassinate President Ronald Reagan, was found "not guilty by reason of insanity," the public demanded that the criteria for insanity pleas be tightened. "You do the crime, you do the time" reflected the mood of the era. Impact Trends within psychology during the decade reflected the conservative temper of the times. Targeted, cost-conscious psychotherapy may yield insights less deep than classic approaches, but it confers measurable improvement for more people. Research in the decade to come included a genome (gene-mapping) project, the results of which further supported the significance of hereditary factors. Of course, social factors and culture must still be considered, since the potential bestowed by heredity can be modified for good or ill by environmental input. Cases abound where a felicitous family situation has moved a potential schizophrenic or delinquent away from the trajectory of genetic doom. Nurture still counts, but the enduring lesson of the 1980's is that so does nature.
Further Reading
Gelman, David. "Cognitive Therapy for Troubled Couples." Newsweek, January 9, 1989. Describes this popular therapy applied to typical problems of the 1980's. Loehlin, John C., et al. "Human Behavior Genetics." Annual Review of Psychology 39 (1988): 101-133. Reviews the several types of research establishing the importance of heredity in identity. McCrae, Robert, and Paul Costa. Personality in Adulthood. New York: Guilford Press, 1990. Discusses the massive research culminating in five-factor theory and the meaning of the factors. Thomas E. DeWolfe See also Androgyny; Conservatism in U.S. politics; Crack epidemic; Crime; Demographics of the United States; Domestic violence; Feminism; Genetics research; Health care in the United States; Health maintenance organizations (HMOs); Homelessness; Medicine; Prozac; Reagan assassination attempt; Science and technology.
■ Public Enemy
Identification African American rap group
Date First album released in 1987
Public Enemy specialized in crafting politically infused rap lyrics, uttering controversial public statements on race relations and social conditions, and appealing to urban and suburban audiences with a clever musical message of African American nationalism and cultural militancy. By the end of the decade, Public Enemy had redefined rap and hip-hop music. In 1982, Chuck D (Carlton Ridenhour) began rapping with MC Flavor Flav (William Drayton) while both were students at Adelphi University. Public Enemy grew out of this collaboration and the desire of fledgling record label Def Jam to sign Chuck D to a contract. In 1986, the group was assembled and the contract was signed. Instead of imitating rap pioneers Run-D.M.C., Chuck D sought to disseminate political and social messages through music. Behind his staccato delivery of complex rhythms, Public Enemy featured the comical meanderings of oversized-watch-toting jester Flavor Flav; steady beats and sampling from DJ Terminator X (Norman Rogers); the militant choreographic stylings of the group's minister of information, Professor Griff (Richard Griffin); and backup dancers known as the Security of the First World, who sported fatigues, fake Uzis, and martial arts moves during performances. Despite the group's sometimes amusing outward appearance, Public Enemy was all business. Its debut album Yo! Bum Rush the Show (1987) on the Def Jam label featured complex lyrics and digestible beats, and it sounded like little else at the time. The album was largely ignored by mainstream listeners. A year later, Public Enemy's sophomore release, It Takes a Nation of Millions to Hold Us Back (1988), broke new ground. The album touched on drug use, crime, racism, religion, poverty, African American role models, and inner-city decay. The next year, Public Enemy contributed the song "Fight the Power" to the sound track of
Spike Lee’s controversial film Do the Right Thing (1989). At the end of the decade, Public Enemy’s music hinted at greater possibilities for hip-hop and rap music. Public Enemy’s commercial success brought tremendous press attention. While the group celebrated the politics of Malcolm X and the tactics of the Black Panther Party for Self-Defense, Chuck D’s endorsement of Nation of Islam leader Louis Farrakhan led to negative publicity. The public also reacted to lyrics in “Fight the Power” that accused Elvis Presley and John Wayne of being racists. The most severe media attack came in the summer of 1989, following a Washington Times interview with Professor Griff. In the interview, Griff blamed Jews for “the majority of the wickedness that goes on across the globe.” Following the incident, the group temporarily disbanded, but it returned (without Professor Griff) a year later with a new album. Impact During the 1980’s, Public Enemy blazed an important path for political and intellectual rap, demonstrating the power of the mainstream recording industry to disseminate subversive and countercultural messages. As their popularity increased, controversy threatened to split the group apart. Subsequent hip-hop, rap, heavy metal, and rock groups drew inspiration from Public Enemy.
Public Enemy. (Paul Natkin)
Further Reading
Bogdanov, Vladimir, ed. All Music Guide to Hip-Hop. San Francisco: Backbeat Books, 2003.
Cheney, Charise L. Brothers Gonna Work It Out: Sexual Politics in the Golden Age of Rap Nationalism. New York: New York University Press, 2005.
Chuck D. [Carlton Ridenhour]. Fight the Power: Rap, Race, and Reality. New York: Delta, 1998.
Aaron D. Purcell

See also
African Americans; Crime; Do the Right Thing; Hip-hop and rap; Music; Racial discrimination; Run-D.M.C.
Q

■ Quayle, Dan
Identification U.S. senator from Indiana from 1981 to 1988 and vice president from 1989 to 1993
Born February 4, 1947; Indianapolis, Indiana

Quayle was a controversial conservative vice president who was often ridiculed in the media for perceived gaffes and lack of intelligence.

Dan Quayle was born in Indianapolis, Indiana, into a family of wealthy newspaper publishers. He grew up in Arizona and in Huntington, Indiana, where he attended high school. He graduated from DePauw University in Indiana in 1969 and from Indiana University Law School in 1974. In 1972, he married Marilyn Tucker, and the couple had three children.

In 1969, Quayle joined the Indiana National Guard. Because openings in the National Guard were limited at the time as a result of the draft for the Vietnam War, his political opponents alleged that he used his family's influence to gain admission. The guard unit was never activated for service in the war.

Quayle won election to Indiana's fourth congressional seat in 1976 at the age of twenty-nine. In 1980, he ran for the U.S. Senate. He received the influential endorsements of his family's newspapers, including The Indianapolis Star, a conservative daily that supported the Republican Party. He also benefited from Ronald Reagan's campaign for president that year, which drove a resurgence in political conservatism throughout the nation. With these factors in his favor, Quayle was able to defeat three-term Democratic senator Birch Bayh.

In the Senate with a new Republican majority, Quayle took his job more seriously than he had as a representative. He served on the Employment and Productivity Subcommittee of the Committee on Labor and Human Resources and, despite his conservative attitudes, worked with liberal senator Ted Kennedy to pass the Job Training Partnership Act, a bill that the chair of the committee and President
Reagan opposed. He also helped bring about the sale of airborne warning and control system (AWACS) airplanes to Saudi Arabia.
Vice Presidency In 1988, after the two terms of the Reagan presidency, the Republicans nominated Reagan's vice president, George H. W. Bush, to be his successor, and Bush chose the young, relatively unknown Indiana senator as his running mate. Bush hoped Quayle would provide a conservative balance to his own more moderate views. He also thought the good-looking Quayle might strengthen the ticket's appeal to women and provide a geographic Midwest balance. Furthermore, he wanted a young running mate in order to attract the votes of baby boomers.

Vice President Dan Quayle, right, and President George H. W. Bush pose for their official portrait. (NARA)

The media began an assault on the young senator. The assault became more pronounced as the campaign progressed and his embarrassing comments and flubs multiplied. However, journalists at first simply examined his record and found it wanting. They quoted the professors at DePauw who described Quayle as a fraternity gadfly rather than a serious student. They harped on his military reserve status in contrast to his hawkish statements about the Vietnam War and furthered the rumors that his family's influence got him into the reserve in front of other applicants. Another rumor circulated about a Washington junket Quayle had taken at the expense of a lobbyist, including a golf outing and parties in the company of women with questionable reputations. Many conservatives also attacked Quayle for enlisting in a "safe" reserve unit during the Vietnam War instead of fighting at the front.

Many Republicans also believed the choice of Quayle was a mistake, as he continued to be nervous and flustered in public. The party confined his appearances to friendly conservative groups and kept him away from large media outlets. Perhaps the most memorable quotation of the campaign involving Senator Quayle was not by him but about him. During the vice presidential debates, Quayle defended his youth by stating he was as old as John F. Kennedy had been when the latter became president. His opponent, Lloyd Bentsen, retorted, "Senator, I served with Jack Kennedy. I knew Jack Kennedy. Jack Kennedy was a friend of mine. Senator, you're no Jack Kennedy." Despite his apparent liabilities as a running mate, Quayle was elected alongside Bush in 1988.

President Bush assigned Vice President Quayle to lead the administration's space efforts and placed him in charge of the Council on Competitiveness. In this capacity he made many international trips. However, Quayle's greatest political liability was his poor speaking ability, and his speeches produced a number of embarrassing gaffes, making him the butt of many jokes and fodder for the nation's comedians. Quayle once tried to correct an elementary school student's correct spelling of potato by spelling it "potatoe." Some of his most legendary
misstatements include, "I stand by all the misstatements that I've made" and "We have a firm commitment to NATO; we are a part of NATO. We have a firm commitment to Europe; we are a part of Europe." Trying to utter the United Negro College Fund's motto "A mind is a terrible thing to waste," he said, "What a waste it is to lose one's mind. Or not to have a mind is being very wasteful. How true that is."

Quayle also became the Republican Party's spokesman on issues of "family values," as he attempted to portray the party as the guardian of American conservative social ideas and to portray those ideas as traditional. He attacked the more "liberal" views of leading Democrats, for example toward abortion or gay rights. In this capacity, Quayle spoke out against moral laxity in the entertainment media and even attacked the television sitcom Murphy Brown, because its title character had a child out of wedlock with her ex-husband and planned to raise it as a single mother. Quayle derided the show for implying that single motherhood was a "lifestyle choice," whereas he believed it was a moral failure. The writers of the show responded by incorporating videotape of Quayle's remarks into the show, and the absurdity of a fictional character arguing with the vice president enhanced his negative image.

Impact Quayle became a symbol of the new Republican conservative movement of the 1980's. His supporters rallied around his call for "family values," fewer domestic government programs, enhanced American enterprise, and a furtherance of the Reagan Revolution. His critics harped on his apparent lack of intelligence and repeated his public speaking mistakes. By the time he left office, Quayle was enough of a figure of ridicule that his national political career was effectively at an end.

Further Reading
Broder, David S., and Bob Woodward. The Man Who Would Be President: Dan Quayle. New York: Simon & Schuster, 1992. Reprints articles from The Washington Post presenting both the positive and negative sides of Quayle.
Fenno, Richard F. The Making of a Senator: Dan Quayle. Washington, D.C.: CQ Press, 1989. Favorable account of how Quayle became a U.S. senator, written by a respected political scientist.
Quayle, Dan. Standing Firm: A Vice-Presidential Memoir. New York: HarperCollins, 1994. Quayle's autobiography.
_______. Worth Fighting For. Nashville: Word, 1999. Quayle's political philosophy and proposals, written when he was a candidate for the 2000 nomination for president.
Frederick B. Chary

See also
Bentsen, Lloyd; Bush, George H. W.; Conservatism in U.S. politics; Elections in the United States, 1980; Elections in the United States, 1988; Reagan Revolution.
■ Quebec English sign ban
Identification Legislation prohibiting the use of English on commercial signs in Quebec
Place Quebec

In the late 1970's, Quebec made French the official language of the province and limited the legal use of other languages. During the 1980's, in the midst of a national constitutional crisis focused on the francophone province, that province's ban on English-language commercial signs faced legal and constitutional challenges from its anglophone minority.

In 1977, the National Assembly of Quebec passed the historic Charter of the French Language, which, among other provisions, banned the use of languages other than French on commercial signs. The charter, also known as Bill 101, recognized French as the province's official language for use in government, judicial proceedings, education, commerce, and the workplace. Section 58, the so-called English sign ban, and section 69, a requirement that all commercial firms adopt French names only, were intended to promote the public use of French in Québécois society. The Charter also designated the Office Québécois de la Langue Française (Québécois Office of the French Language) as an appropriate agency to monitor compliance with the province's language laws and to assess fines for businesses that contravened those laws.

Reactions to the province's language laws were to a large extent divided along cultural lines. While Quebec's francophone population generally supported the new laws governing commerce and education, many anglophones viewed them as a violation of their freedom of expression. In February of 1984, the owners of several Montreal business firms successfully argued before the Superior Court of Quebec that sections 58 and 69 of the provincial language charter were contrary to provisions contained in the Canadian Charter of Rights and Freedoms, which had been passed as part of the new constitution in 1982. The charter contained specific provisions protecting the rights of English and French speakers within the provinces where they constituted linguistic minorities. Agreeing with the defendants, the court ruled that the Quebec government could not reasonably impose its unilingual policy on local businesses. In 1988, the Supreme Court of Canada took up the case, Valerie Ford v. Attorney General of Quebec, on appeal; it upheld the lower court's decision. In their ruling, the justices reasoned that the National Assembly of Quebec had no right to prohibit the use of English on commercial signs, but they felt that it could require a "marked predominance of the French language."

In response to the Supreme Court's decision, Quebec's National Assembly, which had never ratified the new constitution or its Charter of Rights and Freedoms, introduced Bill 178 and invoked the notwithstanding clause of the constitution to shelter it from judicial review. Bill 178, which contained amendments to the province's language laws, maintained the French-only policy on outdoor signs but allowed for the use of other languages including English inside commercial establishments, as long as French remained predominant.

Impact Quebec's language laws had a deeply polarizing effect on public opinion in Canada. Observers cite Quebec's use of the notwithstanding clause to circumvent the Supreme Court ruling as a major reason for the failure of the Meech Lake Accord. Beyond the controversy it sparked, the English sign ban played an important role in Quebec's history by reinforcing the public expression of French culture throughout the province and by boosting the self-esteem of the francophone population. It also stimulated awareness of language issues that previous generations of Canadians had ignored, and it contributed to a progressive refinement of the nation's federal language policies.

Further Reading
Coulombe, Pierre A. Language Rights in French Canada. New York: Peter Lang, 1995.
Larrivée, Pierre, ed. Linguistic Conflict and Language Laws: Understanding the Quebec Question. New York: Palgrave Macmillan, 2003.
Jan Pendergrass
See also
Bourassa, Robert; Business and the economy in Canada; Canada Act of 1982; Canadian Charter of Rights and Freedoms; Chrétien, Jean; Lévesque, René; Meech Lake Accord; Minorities in Canada; Trudeau, Pierre.
■ Quebec referendum of 1980
The Event Quebec's government asks voters if the province should seek independence from Canada
Date May 20, 1980
Place Quebec

The Quebec referendum of 1980 determined that a solid majority of Quebecers opposed independence from Canada. The vote did not resolve the sovereignist issue in Canadian politics, but it revealed a willingness on both sides of the political spectrum to work within the nation's constitutional framework.

During the 1970's, in the wake of what has been called la révolution tranquille (the quiet revolution), there was a significant rise in French Canadian nationalist sentiment. In Quebec, where over 80 percent of the population consisted of native French speakers, the Parti Québécois (PQ, or Québécois Party) assumed a prominent role in the province's struggle against economic, political, and cultural inequities that decades of neglect had engendered. However, the question of whether or not Quebec should become sovereign, as the PQ leadership contended, remained a divisive political issue. Not wishing to alienate moderate voters, the PQ government of Quebec shelved its plan to seek complete independence from Canada and proposed to negotiate with federal authorities the terms of a more nuanced form of sovereignty-association. The intent was to give Quebec full jurisdiction over its taxes, laws, and foreign policy, while retaining close economic and cultural ties with the Canadian federation. The purpose of the 1980 referendum was to see if voters supported that approach.

During the months prior to the referendum, leaders of the sovereignist camp sought to reassure the population that a positive vote would not be used to justify a unilateral declaration of independence and that the results of future negotiations would be presented to voters in the form of another referendum. Canada's minister of justice, Jean Chrétien,
and Claude Ryan, the leader of the provincial Liberal Party, coordinated the efforts of the federalist camp in its opposition to the sovereignist proposal. Prime Minister Pierre Trudeau also weighed in with a promise to work for constitutional reform at the federal level in the event that the referendum failed.

On May 20, 1980, Quebecers went to the polls in record numbers and voted by a majority of 59.6 percent (2,187,991 out of 3,673,842 valid ballots) not to pursue negotiations on sovereignty-association with the federal government. While this outcome left many of the PQ's supporters profoundly disappointed, Premier René Lévesque remained confident that the sovereignist issue was not a closed chapter in Canadian politics. In a concession speech following the referendum, he announced before a crowd of fervent supporters, "À la prochaine fois" (until the next time).
Claude Ryan, leader of the Quebec Liberal Party, votes “no” in the Quebec referendum of 1980. (AP/Wide World Photos)
Impact Despite the defeat of Lévesque's proposal, the issue of Québécois nationalism remained a problematic one for the federal government of Canada. Political opposition to the Constitution Act, 1982, and to the Meech Lake Accord of 1987 demonstrated how difficult it would be to reconcile Quebecer demands with national policies. In 1995, a similar referendum failed by a narrow margin.

Further Reading
Dickinson, John, and Brian Young. A Short History of Quebec. 3d ed. Montreal: McGill-Queen’s University Press, 2003.
Jedwab, Jack, et al. À la prochaine? Une rétrospective des référendums québécois de 1980 et 1995. Montreal: Éditions Saint-Martin, 2000.
Robinson, Gertrude Joch. Constructing the Quebec Referendum. Toronto: University of Toronto Press, 1998.
Jan Pendergrass

See also Canada Act of 1982; Chrétien, Jean; Lévesque, René; Meech Lake Accord; Minorities in Canada; Quebec English sign ban; Trudeau, Pierre.
R

■ Racial discrimination
Definition Unequal treatment of persons because of their race or ethnicity
Liberals and civil rights leaders perceived the 1980's as a period of regression in the government's commitment to promoting racial equality. Public opinion polls, nevertheless, showed a small increase in the liberalization of attitudes by respondents of all races.

Despite myriad acts of discrimination, public opinion surveys during the 1980's showed that racial attitudes of white Americans continued a long-term trend toward increased support for racial integration. Whereas only 50 percent of whites in 1950 had responded that white and black children should attend the same schools, 88 percent favored attendance at the same schools in 1980, and this percentage increased to 94 percent in 1989. When asked whether whites should have a right to keep African Americans out of their neighborhoods, 56 percent agreed in 1968, compared with 34 percent in 1980 and only 23 percent in 1989. In 1970, 50 percent of whites favored laws prohibiting intermarriage, compared with 32 percent in 1980 and only 23 percent in 1989. Southern attitudes remained significantly more conservative than those in the North.

By the 1980's, it was not unusual for African Americans to be elected to local office and to the House of Representatives. Although most victories occurred in predominantly black districts, Harold Washington could not have been elected as the first African American mayor of Chicago without large numbers of white votes, and he served from 1983 until his death in 1987. In 1989, David Dinkins became the first African American to be elected mayor of New York City. That same year, moreover, Douglas Wilder became the first African American to be elected governor (Virginia). The elections of these three men demonstrated that large numbers of whites were viewing issues as more important than the candidate's race. Although civil rights activist
Jesse Jackson was unsuccessful in his bids for the presidency in 1984 and 1988, he made an impressive showing. No doubt, many whites voted against Jackson because of his race, but opposition to his left-wing views was probably more important to the outcome.

Racial Incidents and Controversies A number of racial bias crimes occurred during the decade. In Mobile, Alabama, in 1981, two members of the Ku Klux Klan lynched nineteen-year-old Michael Donald, who was chosen at random in order to protest the acquittal of an African American defendant. The murder provided the opportunity for Morris Dees and the Southern Poverty Law Center to file a civil suit against the United Klans of America on behalf of Donald's mother. The resulting wrongful-death verdict of $7 million financially destroyed the organization, which had been the nation's largest Klan group.

In 1986, a twenty-three-year-old African American, Michael Griffith, was driving with two friends in the white neighborhood of Howard Beach in Queens, New York. After Griffith's car broke down, he and one friend were badly beaten by young white teenagers with racist attitudes. As Griffith ran across the street in an attempt to escape, he was struck and killed by a passing automobile. Three white defendants were convicted only of second-degree manslaughter. For many African Americans, the name of Howard Beach became a symbol of white racism and unequal justice.

Asian immigrants also reported instances of discriminatory treatment. In Galveston, Texas, for instance, several Vietnamese-owned shrimp boats were burned between 1979 and 1981. After a fistfight between white and Vietnamese fishermen ended with the fatal shooting of a white participant, a group of angry white fishermen organized an anti-Vietnamese rally in February, 1981. Armed, hooded members of the Ku Klux Klan joined the protest and engaged in numerous acts of harassment and intimidation. In
federal district court, Dees obtained an injunction ordering an end to such behavior, and the remainder of the shrimp season proceeded peacefully.

Civil rights leaders experienced a number of political setbacks during the decade. The administration of President Ronald Reagan cut back on governmental efforts to enforce laws prohibiting discrimination in jobs and housing. The administration also opposed affirmative action programs and attempted to reinstate tax breaks for private schools that practiced racial segregation. In the presidential election of 1988, the campaign of candidate George H. W. Bush appeared to appeal to racial prejudices in creating the infamous William Horton commercials, which emphasized the criminal acts of an African American prisoner on furlough. In 1989, civil rights leaders were even more shocked when former Klansman David Duke won a seat in Louisiana's House of Representatives.

One of the most controversial issues of the decade was court-ordered busing as a tool for desegregation. Although some federal district judges continued to order busing plans, the majority of plans during the 1980's had been ordered during the previous decade. Presidents Reagan and Bush were both committed opponents of busing plans, and by the middle of the decade their conservative appointments to the Supreme Court were beginning to have an impact. In 1985, a district judge of Kansas City, Missouri, ordered a particularly expensive magnet school plan, and he also ordered increased taxes to pay for the project. The Supreme Court would eventually overturn the plan in the two cases of Missouri v. Jenkins (1990 and 1995).

Impact From the perspective of civil rights, the decade of the 1980's was a time of both progressive change and conservative backlash. African Americans prevailed in several high-profile elections, and there was abundant evidence that the level of white racism continued its long-term trend of slowly decreasing. The Republican administrations, however, were unenthusiastic about the enforcement of civil rights laws, and the federal courts were definitely becoming more conservative, particularly regarding the issues of school desegregation and affirmative action.

Further Reading
Franklin, John Hope, and Alfred Moss, Jr. From Slavery to Freedom: A History of African Americans. New York: McGraw-Hill, 2000. A standard survey that provides a perspective on the struggle to achieve equal rights for African Americans.
Schaefer, Richard. Racial and Ethnic Groups. 10th ed. Upper Saddle River, N.J.: Pearson, 1995. A useful sociology textbook that surveys the historical experiences of minorities, with an emphasis on issues of prejudice, discrimination, and civil rights.
Schuman, Howard, et al. Racial Attitudes in America: Trends and Interpretations. Cambridge, Mass.: Harvard University Press, 1997. A scholarly account of public opinion on racial issues during the second half of the twentieth century.
Small, Stephen. Racialized Barriers: The Black Experience in the United States and England in the 1980's. New York: Taylor & Francis, 1994. A comparative study arguing that institutional patterns of inequality prevailed in both countries.
Walker, Samuel, et al. Color of Justice: Race, Ethnicity, and Crime in America. New York: Wadsworth, 2006. A comprehensive examination of the various issues related to how African Americans and other minorities have been treated in the criminal justice system.
Thomas Tandy Lewis

See also
Affirmative action; African Americans; Asian Americans; Brawley, Tawana; Glass ceiling; Howard Beach incident; Latinos; Native Americans.
■ Radon
Definition Radioactive gas and health hazard
Radon, an odorless, colorless gas produced when radium (a decay product of uranium) disintegrates, was found in high concentrations in basements and crawl spaces of homes during the 1980's. Breathing radon daughters (solid particles from radon) increases the risk of lung cancer, especially in smokers.

The prevalence of radon in homes was discovered by accident in 1984 when a nuclear power plant engineer, Stanley Watras, set off a radiation detector at work in Limerick, Pennsylvania. An investigation of his home revealed a radon level 650 times greater than normal. Although it had been known that radon could enter homes, previously no one had realized the enormity of the threat. U.S. Environmental Protection Agency (EPA) official Richard Guimond
quickly recognized the risk to public health and began a campaign to warn citizens. States were encouraged to discover the extent of the danger in their areas. In 1986, Virginia surveyed eight hundred homes, about 12 percent of which had dangerously high levels of radon.

The Radon Gas and Indoor Air Quality Research Act of 1986 mandated the EPA to study radon and report the findings to Congress. The act proposed the establishment of a research program that merged the efforts of the EPA with those of other public and private groups. It advised the EPA to inform the public about the dangers of radon. After much investigation of the health hazard, on September 12, 1988, the U.S. Public Health Service and the EPA held a joint press conference to publicize the problem, noting that millions of homes were found with elevated levels and that radon causes thousands of deaths each year. Assistant Surgeon General Vernon Houk, also head of the Centers for Disease Control, urged home owners to test for and fix any radon problems. After that warning, there was a huge surge in requests for radon testing.

The Indoor Radon Abatement Act of 1988 established radon monitoring at schools, federal building assessments, and several radon training centers; these efforts were managed by the EPA. The regional centers educated people about health risks, how to measure for radon, and how to fix the problem. The centers trained the public, business firms, Realtors, architects, inspectors, ventilation companies, and government officials about indoor radon hazards in the air and water in residences, workplaces, and schools. The long-term goal of the act was to make indoor air as free of radon as the air outside. The EPA also started a large Indoor Radon Program during the late 1980's to offer grants to the states.

A billboard on the outskirts of Pruntytown, West Virginia, warns of the dangers of radon in the area. (Jim West)

Finding and Fixing Contamination Although radon was not officially discovered until 1900 by German chemist Friedrich Ernst Dorn, its effects were known as far back as the Middle Ages, when miners were known to live very short lives. The radioactive element is a particular hazard for uranium miners. Radon lurks in home areas that are not well ventilated. It can come from uranium in the surrounding soil, water, or even building materials such as brick or cinder block. Although the concrete in a basement slows radon from entering the house, the gas and particles can enter through cracks, drainage systems, and joints. If the ventilation is poor, the element cannot easily dissipate.

Radon gas itself is not the main health culprit; rather, it is the radon daughters, solid decay particles that attach to dust and emit alpha radiation, that present the health risk. The lungs absorb the most intense radiation; although alpha particles penetrate less than a millimeter of tissue, that is enough to cause lung cancer, particularly in smokers. The National Cancer Institute estimates that between fifteen thousand and twenty-two thousand radon-related lung cancer deaths occur each year in the United States.

Whether or not a home is contaminated is determined by taking air samples and then electronically measuring alpha emissions. Because radon is odorless and colorless, residents will not be aware of it unless such sampling is done. Decontamination methods involve sealing entry points (using caulk, epoxy paint, and polyethylene sheeting) and increasing ventilation.
Impact Many buildings are safer today from radon because of the public awareness campaign in the 1980's. All homes should be fixed if the radon level is 4 picocuries per liter or higher. Radon levels of less than 4 picocuries per liter still pose a risk and may be reduced.

Further Reading
Cole, Leonard A. Element of Risk: The Politics of Radon. New York: Oxford University Press, 1993. Debates the EPA's policy and reveals the interplay among science, society, and the federal government.
Committee on the Biological Effects of Ionizing Radiation, U.S. National Research Council. Health Effects of Exposure to Radon. Washington, D.C.: National Academies Press, 1999. Covers the research on radon and lung cancer.
Edelstein, M., and W. Makofske. Radon's Deadly Daughters: Science, Environmental Policy, and Politics of Risk. Lanham, Md.: Rowman & Littlefield, 1998. Examines how social and scientific factors led to misunderstandings about radon.
Icon Health Publications. Radon: A Medical Dictionary, Bibliography, and Annotated Research Guide to Internet References. San Diego, Calif.: Author, 2004. A guide to relevant research and terms.
National Research Council. Risk Assessment of Radon in Drinking Water. Washington, D.C.: National Academies Press, 1999. Details the inhalation and ingestion risks.
Ritchie, Ingrid, and Stephen John Martin. Healthy Home Kit. Chicago: Real Estate Education, 1995. How to clean up home hazards such as radon, lead, and asbestos.
Jan Hall

See also
Air pollution; Cancer research; Smoking and tobacco; Water pollution.
■ Raging Bull
Identification American film
Director Martin Scorsese (1942-    )
Date Released November 14, 1980
A poll of film critics in the United States and abroad conducted by Premiere and American Film magazines in 1989 named this biopic of boxer Jake La Motta as the best motion picture of the 1980's.

Released midway through a crowd-pleasing series about a fictional boxing hero—Rocky (1976), Rocky II (1979), Rocky III (1982), and Rocky IV (1985)—Raging Bull offered a more complicated, less resolved, and far darker look at men who live in the ring. In contrast to the likable Rocky, the protagonist of Raging Bull is a self-destructive, obsessive batterer. Based on the ghostwritten autobiography of middleweight Jake La Motta, director Martin Scorsese's biopic completed a trilogy, with Mean Streets (1973) and Taxi Driver (1976), of Scorsese-directed films set in New York in which actor Robert De Niro delivered raw, visceral performances portraying men who live by and through violence.

De Niro brought the La Motta project to Scorsese's attention; the actor's commitment to the part became legendary, as he spent a year training with the former boxer and did all the carefully choreographed fight scenes himself. He gained more than fifty-five pounds to replicate La Motta's middle-aged, bloated body for the scenes of Jake as a nightclub performer that frame the central narrative. Also delivering outstanding performances were Joe Pesci as Jake's brother Joey and Cathy Moriarty as Jake's second wife.

Robert De Niro as Jake La Motta in Raging Bull. (AP/Wide World Photos)

In a film of great visual beauty, cinematographer Michael Chapman moved between gritty realism and stylized expressionism. Filmed in a widescreen format, almost entirely in black and white, sometimes at varying camera speeds, utilizing an unorthodox sound track, and focused on questions of guilt and redemption, the film resembles a European art film more than the typical Hollywood movie of the 1980's. In a decade characterized by feminist concerns, Raging Bull presents a milieu—male-dominated 1940's and 1950's Italian American working-class neighborhoods ruled by the mob—known for reactionary sexual politics. Even within this context, Jake's wife beating and homophobia are presented as excessive and indicative of his self-hatred. Scorsese ends the film with a biblical quotation that indicates that La Motta has won the crucial fight for his own soul, an interpretation contested by many critics. In a decade of increasingly impersonal filmmaking, one of the greatest shocks in this shocking film is the final title, which dedicates Raging Bull to Scorsese's recently deceased mentor and college film professor, Haig Manoogian, "With love and resolution, Marty."
Thelma Schoonmaker’s brilliant editing and De Niro’s unforgettable performance were rewarded with Academy Awards. Scorsese and others who were involved in the production won many other important awards. Impact The creative direction of Scorsese and the bravura performance of De Niro in Raging Bull solidified their individual reputations and marked their ongoing collaborations as some of the most memorable in contemporary American cinema. The modest commercial success and considerable critical success of Raging Bull enabled Scorsese to secure financing for even riskier future projects. Further Reading
Hayes, Kevin J., ed. Martin Scorsese's "Raging Bull." New York: Cambridge University Press, 2005.
Kellman, Steven G., ed. Perspectives on "Raging Bull." New York: G. K. Hall, 1994.
La Motta, Jake, Joseph Carter, and Peter Savage. Raging Bull. 1970. Reprint. New York: Bantam Books, 1980.
Carolyn Anderson

See also Academy Awards; Boxing; Film in the United States; Organized crime; Scorsese, Martin.
■ Raiders of the Lost Ark
Identification American adventure film
Director Steven Spielberg (1946-    )
Date Released on June 12, 1981
This blockbuster adventure film became a cultural icon and a template for the genre. The film generated interest in serial-style films while representing the American ideal's triumph over evil.

Raiders of the Lost Ark was one of the most influential films of the 1980's and the most successful film of 1981. Its tremendous popularity resulted in gross revenues of $380 million worldwide, and the film's iconic images would become embedded in American popular culture for generations. Its influence led to many imitation films.

Creation and Cast Raiders of the Lost Ark was the brainchild of George Lucas. With Star Wars (1977) and The Empire Strikes Back (1980) under his belt, Lucas decided to turn to another genre for his next project. He envisioned bringing back the style of the 1930's adventure serials in feature-film form. Lucas created a character named Indiana Smith who would combine humor, action, and romance in a world of adventure.

In Steven Spielberg, Lucas found a director for his vision. Spielberg was famous at the time for his direction of Jaws (1975) and Close Encounters of the Third Kind (1977). He suggested that the surname of the title character be changed to Jones. This character was an archaeologist in search of one of the most sought-after relics in history—the Ark of the Covenant containing the stone tablets inscribed with the Ten Commandments. Members of the Nazi Party who seek the Ark for its occult power would be Jones's adversaries.

The characters and sets were storyboarded in a comic book style by Jim Steranko. This became the model for design and the blueprint from which Spielberg worked. Filming for scenes set
in Egypt took place primarily in Tunisia. Also, the Nazi submarine base was filmed in La Rochelle, France. The majority of the rest of the filming took place in the United States on set. The variety of exotic locations helped make the film more appealing to audiences. Despite his use of many film locations, Spielberg finished filming early, in less than three months.

Raiders of the Lost Ark featured a strong cast. Originally, Lucas did not want Harrison Ford to play the lead role of Jones, as Ford had already found success playing Han Solo in the successful Star Wars franchise, but Spielberg eventually convinced Lucas to hire him for the role. Ford was accompanied by Karen Allen as Jones's love interest and main sidekick, Marion Ravenwood. Jones had two other sidekicks, the scholarly Marcus Brody (Denholm Elliott) and the well-connected Egyptian Sallah (John Rhys-Davies), and was opposed by his archrival archaeologist Rene Belloq (Paul Freeman) and the Nazi colonel Dietrich (Wolf Kahler).

Reaction and Legacy One of the most powerful elements of Raiders of the Lost Ark was provided by the musical score written by John Williams. His masterful work helped breathe life into the film and its characters, from love themes to one of the most recognizable theme songs of the 1980's, "The Raiders March." Arguably, the music may have done more for the movie's popularity than the script.

The themes of Raiders of the Lost Ark hit home for many audiences. Lucas's move to set his story in a different era did not hide the political overtones that were haunting the early 1980's. The substitution of Nazis for Soviets was a thin disguise at best, as one could easily equate Jones and company's attitude toward the Nazis with the attitude that many Americans had toward the Soviet Union at the time. The godless, power-grabbing Nazis echoed everything that Americans feared about the Soviet bloc. Another theme of the film was the validation of the Judeo-Christian religious tradition, specifically of Moses and the Old Testament. The moviegoer is witness to the godless enemy's destruction through the might of God as righteous Americans withstand God's wrath and are spared. This struck a chord with many audience members who were striving to find strength at home and in their beliefs.

Impact Raiders of the Lost Ark left Americans with many iconic images, including the giant boulder
scene from the beginning of the movie, the Ark being shelved and locked away secretly by the American government, and the clever American shooting his way out of a sword duel. The film also introduced misconceptions about archaeology. Its popularity resulted in the prequel Indiana Jones and the Temple of Doom (1984), the sequels Indiana Jones and the Last Crusade (1989) and Indiana Jones and the Kingdom of the Crystal Skull (2008), and the television series The Young Indiana Jones Chronicles (1992-1993).

Further Reading
Gordon, A. "Raiders of the Lost Ark: Totem and Taboo." Extrapolation 32 (Fall, 1991): 256-267. A look at the symbolism within Raiders of the Lost Ark. Looks at the cultural impact of the film and its realism.
King, Cynthia M. "Effects of Humorous Heroes and Villains in Violent Action Films." Journal of Communication 50 (Winter, 2000): 5-24. Examines the comic action film and explains the interest in movies such as Raiders of the Lost Ark. Also looks at how the violence in such movies is received by audiences.
Taylor, Derek. The Making of "Raiders of the Lost Ark." New York: Ballantine Books, 1981. Offers individual stories from the cast and crew, as well as an overview of the making of the film. Contains photographs on location and information on the sets.
Daniel R. Vogel

See also
Academy Awards; Action films; Archaeology; Empire Strikes Back, The; Epic films; Film in the United States; Ford, Harrison; Spielberg, Steven.
■ Rambo
Identification Fictional character
In the 1980’s, America began to come to terms with the social turmoil left by the Vietnam War. Attempts were made to rehabilitate the reputations of both the Vietnam veteran and the United States military in general, and Rambo became a fictional spokesperson for these movements. The character of John Rambo was created by David Morrell for the 1972 novel First Blood but was introduced to most Americans in a 1982 action film based on the book. With Sylvester Stallone as Rambo, the film First Blood became a modest hit and spawned
two bigger sequels, Rambo: First Blood, Part 2 (1985) and Rambo III (1988). Stallone reprised his role in the sequels; Morrell wrote the novelizations. There were also Rambo video games and an animated series (1986).

The United States lost its first war of the modern period in Vietnam, a conflict accompanied by civil and political unrest at home and fought at a time of increasing liberalism in America. By the 1980's, both cultural and political liberalism had waned and a new conservatism was developing, leading to President Ronald Reagan's election in 1980. Reagan, president until 1989, is credited with rebuilding America's military strength and confidence after Vietnam, and it is no accident that he was called a "Rambo" by his detractors at home and abroad. The former actor even invoked the character in speeches.

Rambo died in the novel First Blood, but he survived in the film. In an emotional speech before surrendering to the authorities, Rambo vocalized feelings that many Americans had about the war and its aftermath—that American soldiers had been treated more as villains than heroes when they returned from the war and that the military could have won if it had not been shackled by political forces. The sequels reinforced these themes.

Vietnam War veterans were finally honored publicly in the 1980's by the Vietnam Veterans Memorial. President Reagan addressed America's loss of military confidence by dramatically increasing defense spending and by showing a willingness to exercise military power, as in the 1983 Grenada invasion. However, Reagan's critics accused him of jingoism, especially after he began referencing the Rambo films in speeches directed at the Soviet Union and its Cold War allies. While First Blood sympathetically portrayed Rambo as a psychologically scarred Vietnam War veteran, the sequels were much more violent and linked violence to patriotic themes. While the general public enjoyed the mayhem, critics and many Vietnam veterans felt that the violence was excessive and that the films oversimplified the war.

Impact The Rambo films were both an influence on and a reflection of the rehabilitation of the American military and, particularly in the case of the two sequels, the rise in patriotic and aggressive feelings within the United States during the 1980's.
Further Reading
Eberwein, Robert. The War Film. New Brunswick, N.J.: Rutgers University Press, 2004.
Jeffords, Susan. Hard Bodies: Hollywood Masculinity in the Reagan Era. New Brunswick, N.J.: Rutgers University Press, 1994.
Jordan, Chris. Movies and the Reagan Presidency. Westport, Conn.: Praeger, 2003.
Charles Gramlich

See also
Action films; Cold War; Film in the United States; Grenada invasion; Liberalism in U.S. politics; Reagan, Ronald; Sequels; Vietnam Veterans Memorial.
■ Rape
Definition Forced or coerced penetrative sex, often accompanied by other undesired sexual acts
American legal sanctions and social prohibitions against this crime increased in strength within the United States throughout the 1980's.

Rape is not only a crime but also a challenging cultural issue for Americans to discuss because it encompasses so many other socially charged topics, including cultural and religious ideals concerning individual rights in marriage or other relationships, social norms regarding sexuality, and power dynamics between men and women. In addition to the topic of rape often being taboo for all of these reasons, most rape victims feel tremendous shame and guilt, which contributes to social stigma and secrecy on this issue.

In cases of rape, adult men are usually the aggressors, and the victims are predominantly women or girls, although boys and some men are also targets. Women often know their rapists. Ironically, one of the rape myths is that women or children are most at risk from strangers, which means that women may be more fearful in public locations and less apprehensive in private, with acquaintances or family members, not recognizing that in these situations they are at greater risk of sexual assault. (Child rape in domestic contexts is usually classified as incest, which is not covered here.) Typically, rape occurs in seclusion, without witnesses, which creates challenges for victims who desire to protest their treatment, either socially or in court, since there may be little physical evidence of their lack of consent to sexual contact or the ensuing ordeal.

Rape can have additional meanings depending on its context. For example, genocidal rape occurring in a war zone differs from date rape on a college campus. During the 1980's, there was increased understanding among rape crisis workers and researchers of the variation in responses among victims, depending on cultural, religious, and other factors. In fact, some rape victims did not define the events that happened to them as rape, since the notion of individual human rights is a culturally specific one. However, individual human rights have become more accepted worldwide since extensive codification by the United Nations.

For all of these reasons, during the 1980's the issue of rape became increasingly a feminist concern. Women were well organized and managed to make significant headway in changing anachronistic state laws, such as the marital exemption. In contrast, many men and male organizations publicly promoted the idea that rape within marriage was an acceptable activity for American men, despite increasingly strong legal challenges to these attitudes.
During the 1980’s, there was a major change in legal prohibitions against rape. Prior to 1976, the American state statutes had allowed a man to claim exemption from rape laws if the victim was married to him. However, in 1977, the tide began to turn, and Oregon became the first state to abolish this exemption. By mid-1980, state legislatures in Nebraska and New Jersey were following suit, and by the end of the decade, only eight states still permitted the marital rape exemption. These legal changes were accompanied by concomitant shifts in public policing and the courts. Police paid greater attention to the rights of victims and worked to eliminate some of the shame-provoking practices of earlier officer conduct. The courts also strove to provide a safer environment that was more open to rape victims’ testimony and provided harsher penalties to convicted rapists. Many of these changes were fueled by the actions of workers at rape crisis centers, which were available to women in most American cities. Now in their second decade, centers were often staffed by social workers and psychologists, rather than by feminist advocates with little professional training, as had
been predominantly the case in the 1970's, when the movement first developed.

Impact The 1980's was a time of great social change in attitudes toward rape. As the decade continued, victims received better care, and rapists were dealt stronger penalties. By the end of the 1980's, rape within marriage was viewed as inappropriate, both culturally and legally, which was a major step forward in human rights for women in the United States.

Further Reading
Ahrens, Courtney E. "Being Silenced: The Impact of Negative Social Reactions on the Disclosure of Rape." American Journal of Community Psychology 38, nos. 3/4 (December, 2006): 263-274. Consideration of the role of responders to rape victims.
Finkelhor, David, and Kersti Yllo. License to Rape: Sexual Abuse of Wives. New York: Holt, Rinehart and Winston, 1985. A highly influential publication by two renowned academics covering the issue of marital rape, focusing on the United States.
Gornick, Janet, et al. "Structure and Activities of Rape Crisis Centers in the Early 1980's." Crime and Delinquency 31, no. 2 (April, 1985): 247-268. Comprehensive coverage of the variety of rape crisis centers in the United States based on a representative sample of fifty.
Kimmel, Michael S. The Gendered Society. 3d ed. New York: Oxford University Press, 2008. Insightful consideration of gender's role in American life, with an excellent chapter about gendered violence, including rape.
MacKinnon, Catharine A. "Genocide's Sexuality." In Are Women Human? And Other International Dialogues. Cambridge, Mass.: Belknap Press of Harvard University Press, 2006. Analysis of wartime rapes when genocide is the goal, by a renowned feminist lawyer.
Russell, Diana E. H. Rape in Marriage. 1982. 2d ed. Indianapolis: Indiana University Press, 1990. The first publication written in English about marital rape, focusing on domestic sexual assault in San Francisco. Based on a random sample.
Sebold, Alice. Lucky. Boston: Back Bay Books, 1999. Autobiographical account of the novelist's traumatic rape at the age of eighteen, when she was a college student.
Yllo, Kersti, and Michele Bograd, eds. Feminist Perspectives on Wife Abuse. Newbury Park, Calif.: Sage, 1988. Academic and community activists consider domestic violence, including rape.
Susan J. Wurtzburg

See also
Abortion; Brawley, Tawana; Central Park jogger case; Color Purple, The; Crime; Domestic violence; École Polytechnique massacre; Fatal Attraction; Feminism; McMartin Preschool trials; sex, lies, and videotape; Women’s rights.
■ Rather, Dan
Identification Anchor of the CBS Evening News, 1981-2005
Born October 31, 1931; Wharton, Texas

Rather was the first television anchor to emerge as a celebrity in his own right as measured by a multimillion-dollar contract.

As a boy, Dan Rather listened to the World War II radio broadcasts of Columbia Broadcasting System (CBS) news reporter Edward R. Murrow, who vividly described bombings of London and who, in 1954, effectively denounced anti-Communist zealot Senator Joseph McCarthy. Perhaps influenced by Murrow, Rather resembled him in many ways. Both were dedicated to the primacy and integrity of news. Both sometimes courted danger, as when Rather entered Afghanistan in 1980, shortly after the Soviet invasion. Neither unquestioningly accepted authority; in 1988, Rather confronted Vice President George H. W. Bush, then running for U.S. president, over the Iran-Contra affair. Both attracted criticism; in 1985, Senator Jesse Helms urged conservatives to buy CBS to control Rather.

New CBS Evening News anchor Dan Rather, left, shakes hands with outgoing anchor Walter Cronkite in March of 1981. (CBS/Landov)

Rather first drew national attention with his coverage of the 1961 Galveston, Texas, hurricane and the 1963 assassination of President John F. Kennedy. To replace retiring anchor Walter Cronkite, CBS offered Rather a two-million-dollar, ten-year contract and the power of managing editor. Rather made his first broadcast as anchor on March 9, 1981. While rarely altogether relaxed on camera, he communicated honesty and decency. Not content simply to announce news, he focused on the effects of policies and events on the people concerned. He sometimes showed emotion, as when in 1983 he announced the murder of 241 U.S. servicemen, most of them young Marines, by terrorists in Lebanon. After a weak first year, he ranked at the top in Nielsen rankings for most of the first half of the decade. Perhaps his most important story of the decade was his reporting of the suppressed 1989 democratic revolution in China's Tiananmen Square, recorded in his book The Camera Never Blinks Twice (1994).

Rather's ratings fell late in the decade as news was transformed with the advent of corporate ownership. The 1986 sale of CBS to Lawrence Tisch introduced the largest corporate cutback of news employees in that network's history. Rather was among those who futilely protested; he published, with news producer Richard Cohen, "From Murrow to Mediocrity?" in The New York Times (March 10, 1987). His reaction to these changes probably was responsible for his most famous lapse. Broadcasting from Miami in 1987 to cover the visit of Pope John Paul II, Rather found his broadcast cut short so that the ending of a semifinal U.S. Open tennis match could be aired. He walked away, leaving six minutes of dead air. Despite criticism, Rather survived, finally resigning from CBS in 2005.

Impact Rather inherited the traditions of the word-oriented, on-the-spot, and, if necessary, confrontational radio journalists who virtually invented modern broadcast journalism during World War II. He was at his best when he could live out this inheritance, but network news audiences diminished. He was less successful as news and entertainment were increasingly blended in reaction to competition from cable television.

Further Reading
Alan, Jeff, with James M. Lane. Anchoring America: The Changing Face of Network News. Chicago: Bonus Books, 2003.
Fensch, Thomas, ed. Television News Anchors. Woodlands, Tex.: New Century Books, 2001.
Goldberg, Robert, and Gerald Jay Goldberg. Anchors: Brokaw, Jennings, Rather, and the Evening News. Secaucus, N.J.: Carol Publishing Group, 1990.
Betty Richardson

See also Brokaw, Tom; Cable television; China and the United States; Elections in the United States, 1988; Iran-Contra affair; Jennings, Peter; Journalism; Network anchors; Television.
■ Reagan, Nancy
Identification First Lady of the United States, 1981-1989
Born July 6, 1921; New York, New York

During the decade of the 1980's, Nancy Reagan was a constant presence and considerable force in the life of Ronald Reagan, president of the United States from 1981 to 1989. The closeness of their relationship has led many to speculate that Nancy influenced many of the president's crucial decisions and helped to shape the actions by which his presidency would ultimately be judged.

Before Ronald Reagan's presidency, Nancy Reagan, as the governor's wife in California, was involved in organizations that focused on helping physically and emotionally handicapped children. This experience provided her with the background for her later initiatives concerned with alcohol and drug abuse among young people. In 1985, she hosted the second international drug conference for First Ladies from around the world at which the slogan Just Say No was coined. Nancy Reagan became honorary chair of the Just Say No Foundation, a group aimed at fighting drug
president’s staff. She certainly had strong objections to White House chief of staff Donald Regan. It is generally conceded that his resigning under pressure in February, 1987, was the result of his squabbles with the First Lady, who constantly interfered in Regan’s attempts to organize the president’s official schedule. As early in the Reagan presidency as July, 1982, Secretary of State Alexander Haig was forced out of his position. His resignation is also thought to have been instigated by the First Lady. Certainly scholars of the Reagan presidency have recognized the influence, albeit indirect, the First Lady had on her husband, especially in matters dealing with people whose actions affected his public image and her own. The Reagans’ first year in the White House was difficult. Their inaugural celebrations cost $16 million, making the inauguration the most expensive in history and evoking harsh comments from many critics. Nancy then attracted considerable negative press by spending $800,000 to renovate the family’s living quarters in the executive mansion and by spending $200,000 to buy china for use at official First Lady Nancy Reagan in 1981. (Library of Congress) functions. Private funds were used to finance her expenditures, but with unemployment high and a recession crippling the problems afflicting the youth of many nations. She economy, such extravagance rankled many people. was also active in the National Federation of Parents The First Lady regularly consulted an astrologer, for Drug-Free Youth (later the National Family PartWilliam Henkel, for guidance, particularly seeking nership) and the National Child Watch Campaign. to determine imminent dangers that might face her She served as honorary president of the Girl Scouts husband. Overruling some of Donald Regan’s schedof America.Good Housekeeping magazine named Reauling, she helped shape the president’s agenda, makgan one of the ten most admired American women. ing sure that he stayed close to the White House on In three successive years—1984, 1985, and 1986— days that Henkel identified as dangerous. Henkel’s she ranked first in national polls as the most admired “threat days” were spelled out and given to the Sewoman in the United States. During this period, she cret Service, Regan, and the president’s secretary so was deeply involved in encouraging the arts, serving that, in concert, they could work out the president’s on the President’s Commission on the Arts and Huschedule in ways that eliminated commitments at manities and as a board member of the Wolf Trap times Henkel considered dangerous. Regan rankled Foundation for the Performing Arts. As a former acat this interference and sometimes refused to take tress, her interest in the performing arts was longthe First Lady’s telephone calls. standing. Behind-the-Scenes Political Role There has been considerable speculation about how active a role the First Lady played in the dismissal of some of the
Impact Nancy Reagan unquestionably bolstered the president’s self-confidence, ever urging him to follow his natural instincts in his decision making.
She denies that she directly influenced his decisions, but the two certainly discussed many crucial issues. The First Lady opposed the Equal Rights Amendment, which the president initially supported. He eventually came around to her point of view. Although the First Lady wanted her husband’s lasting legacy to be one of working arduously for peace in the world, she crossed swords with Raisa Gorbachev, wife of Soviet leader Mikhail Gorbachev, causing a minor breach in U.S.-Soviet relations. Nevertheless, the First Lady worked tirelessly to promote her husband’s most important initiatives and was ever the unfailingly loyal wife, who lived through the trauma of the attempted assassination of the president in 1981 and later through her husband’s bout with colon cancer in 1985, with prostate surgery in 1987, and with her own breast cancer in 1987. She was a strong-willed, intelligent woman who unflaggingly supported but did not control her husband. Further Reading
Beasley, Maurine H. First Ladies and the Press: The Unfinished Partnership of the Media Age. Evanston, Ill.: Northwestern University Press, 2005. An overall consideration of how First Ladies have dealt with the press.
Boller, Paul F., Jr. Presidential Wives: An Anecdotal History. 2d ed. New York: Oxford University Press, 1998. Provides some intimate, anecdotal glimpses into the life of Nancy Reagan as First Lady.
Kelley, Kitty. Nancy Reagan: The Unauthorized Biography. New York: Simon & Schuster, 1991. A controversial, essentially negative portrayal of Nancy Reagan.
Reagan, Nancy, with William Novak. My Turn: The Memoirs of Nancy Reagan. New York: Random House, 1989. Nancy Reagan’s attempt to cast herself in a positive light during her life as First Lady.
Wallace, Chris. First Lady: A Portrait of Nancy Reagan. New York: St. Martin’s Press, 1986. Based on the NBC White Paper; strong overall coverage with excellent illustrations.
Watson, Robert P., and Anthony J. Eksterowicz, eds. The Presidential Companion: Readings on the First Ladies. 2d ed. Columbia: University of South Carolina Press, 2006. Some excellent in-depth insights into Nancy Reagan’s role as First Lady.
R. Baird Shuman
See also Elections in the United States, 1980; Elections in the United States, 1984; Haig, Alexander; Iran-Contra affair; Just Say No campaign; National Minimum Drinking Age Act of 1984; Reagan, Ronald; Reagan assassination attempt; Reagan Revolution; Reaganomics; Regan, Donald.
■ Reagan, Ronald

Identification U.S. president, 1981-1989
Born February 6, 1911; Tampico, Illinois
Died June 5, 2004; Bel Air, California
Reagan’s two-term presidency dominated American politics, domestic federal policies, and U.S. foreign affairs during the 1980’s. Reagan conservatism reshaped the American political landscape, and his administration’s successes ensured the 1988 presidential victory of his vice president, George H. W. Bush. Reagan’s “Revolution” in foreign policy helped bring about the collapse of Eastern European communism at the end of the decade and the disintegration of the Soviet Union shortly thereafter.

Ronald Reagan was the second son of Nelle and John Reagan of Tampico, Illinois. He graduated as an economics major from Eureka College in 1932 and began work as a radio announcer in Davenport and then Des Moines, Iowa. In 1937, Reagan moved to California and entered a film career that lasted through the 1940’s and resulted in his election as president of the Screen Actors Guild in 1947. He married his second wife, actress Nancy Davis, with whom he had two children, in 1952. From 1954 to 1962, Reagan performed on television’s General Electric Theater and served as a spokesperson for General Electric. Though a Democrat, he supported the presidential campaigns of Republicans Dwight D. Eisenhower (1952, 1956) and Richard M. Nixon (1960) and changed parties in 1962, claiming famously that the Democratic Party had left him. His speech in support of Barry Goldwater late in the 1964 presidential campaign brought him to the political limelight, and he successfully ran for governor of California two years later. Though he failed in a halfhearted attempt to secure the Republican presidential nomination in 1968, he easily won reelection in California in 1970. Reagan mounted a stronger bid against sitting president Gerald R. Ford for the Republican nomination in 1976, losing out in a tightly run contest.
Candidate Reagan
By the 1980 election cycle, the American voters had lost patience with the Democratic administration’s inability to deal effectively with the many foreign and domestic problems that had landed America in what President Jimmy Carter termed a “malaise.” Americans still stung from the defeat in Vietnam and, more recently, the rise of extremist Iran, whose young militants held scores of American hostages, and fear of the Soviet nuclear threat was palpable. Domestically, America’s economy suffered from high foreign oil prices and a new phenomenon of “stagflation,” which produced economic stagnation, high unemployment, and double-digit price inflation and interest rates.

With only moderate opposition within the Republican Party, Reagan assembled a very capable staff and ran his campaign on one major theme and two major planks: The theme was optimism in the face of malaise—a theme later epitomized by the 1984 “Morning in America” television commercial—and the planks were a revitalized military and domestic economic recovery. From at least the early 1960’s, Reagan had been evolving a personal conservatism that relied on strong evangelical Christian faith, and he advocated personal and economic freedom, the reduction of the size and power of the federal government, and the aggressive confrontation of world communism with the goal of ending communism and eliminating the very existence of nuclear weapons. This flew in the face of liberal reliance on “big government” for the protection of certain rights and provision of the means of existence in the form of welfare and other forms of wealth redistribution. Reagan’s ideas also opposed the decades-long nuclear weapons policy of “mutually assured destruction,” or MAD, and the “containment” of international communism.

The U.S. electorate in 1980 was not only fed up with Carter-styled malaise but also energized by new political forces that included evangelical Christians organized as the Moral Majority, ex-liberals and radicals-turned-Republican intellectuals known as neoconservatives, and conservative “Reagan Democrats.” Though dismissed by many opponents as “an actor” of limited intellect and few ideas, Reagan proved a stellar campaigner, gaining the nickname the Great Communicator. After Reagan defeated ex-Director of Central Intelligence George H. W. Bush for the Republican nomination on July 16, 1980, in Detroit, he chose Bush as his running mate and
trounced President Carter on November 4, receiving almost 52 percent of the popular vote and a 489-49 electoral college landslide. Republicans also gained control of the Senate. What followed was quickly dubbed the Reagan Revolution.

Reagan’s First Term On the day Reagan was sworn in, January 20, 1981, Iranian leaders released the American hostages, raising questions about Reagan’s role in prolonging their captivity but signaling a more effective foreign policy. The new administration began to take shape as Reagan not only filled the usual posts and offices but also created a new layer of cabinet councils and task forces made up of administration figures and specialists drawn from academia and the private sector. He was less “hands on” as an executive than other presidents and was often criticized for this. Having clearly articulated his policy goals, however, he generally left the arguing and details to trusted experts. For example, only two days after his inauguration, he established the Task Force on Regulatory Relief to explore targets for his campaign to reduce the scope and structure of federal regulation.

Two weeks later, Reagan began employing one of his most effective tools, the televised address to the nation. He announced the outline of his program for economic recovery, laying the groundwork for his address to Congress on February 18. He followed this with a second congressional visit on April 28 and a second national address on July 27. Two days later, the Democrat-controlled House of Representatives passed the keystone Economic Recovery Tax Act (ERTA), which cut the rate of growth of welfare spending, lowered income tax rates, and increased spending on the U.S. military. Reagan signed the act on August 13. August also saw Reagan’s response to striking air traffic controllers: Declaring their stoppage illegal, he fired more than eleven thousand workers, signaling that he would be no friend to labor unions.

His first year was marred, however, by an assassination attempt by deranged gunman John Hinckley, Jr., on March 30. Following surgery, the seventy-year-old president was released from the hospital twelve days later. In July, 1985, he underwent a second operation, this time successful surgery for colon cancer.

Shading the ideological complexion of the Supreme Court was a major goal of the Reagan administration.
In late summer of 1981, the administration was successful in having moderate conservative Sandra Day O’Connor appointed to the Court, making her the first female associate justice. In 1986, Reagan also managed to install conservative Antonin Scalia as an associate justice and William H. Rehnquist as chief justice, replacing Warren E. Burger. In 1987, however, Senate opposition to conservative nominee Robert H. Bork killed his nomination, resulting in the appointment of the much more moderate Anthony Kennedy.

On October 2, 1981, with higher levels of funding in place, Reagan announced his program for developing a new generation of strategic weapons, his first step toward challenging the Soviet threat. On November 18, he outlined his ideas for nuclear arms reduction to the National Press Club, and on May 31, 1982, he announced U.S.-Soviet Strategic Arms Reduction Talks, which would ultimately result in the first Strategic Arms Reduction Treaty (START). Rather than seeking to limit the growth of nuclear arsenals, as previous negotiators had tried to do, Reagan laid the foundation for eliminating the threat of nuclear weapons altogether. In June, Reagan traveled to Europe for a meeting of the G7 (the group of seven economic powers) and became the first U.S. president to address the full British parliament. On June 17, he presented his “Agenda for Peace” speech to the U.N. Special Session on Disarmament.

Reagan, however, wanted to deal from a position of military strength, and he continued America’s buildup and challenge to the Soviets, especially after the death of Soviet leader Leonid Brezhnev and the rise of Yuri Andropov in November. On March 8, 1983, in a speech to the National Association of Evangelicals, Reagan famously labeled the Soviet Union the “Evil Empire,” a calculated if antagonistic move that appalled many liberals but confirmed Reagan’s conviction about the nature of the United States’ greatest enemy. He followed this on March 23 with a national address on the missile threats to U.S. security and his new antiballistic missile Strategic Defense Initiative (SDI). Quickly and derisively nicknamed “Star Wars,” this earth- and space-based system would supposedly shield the United States from foreign nuclear warheads. Cautious negotiation and continued buildup seemed warranted, as the Soviets downed a Korean airliner for violating Soviet airspace on September 1, 1983, and Konstantin Chernenko replaced Andropov the following February.

President Ronald Reagan. (Library of Congress)
The Middle East also preoccupied Reagan during his first term. He announced the Fresh Start initiative to aid the Israeli peace process on August 20, 1982, and at the request of the Lebanese government, which was in the midst of a civil war, he sent U.S. Marines to Beirut five days later. On April 18, 1983, pro-Iranian terrorists bombed the U.S. embassy there, killing 63, and on October 23, 1983, 241 U.S. military personnel died in the car bombing of their barracks. Reagan responded by ordering air strikes on Syrian installations in Lebanon in December and then pulling the troops out. On October 25, 1983, Reagan sent U.S. troops into the tiny Caribbean island of Grenada in the wake of a leftist coup.

Reagan announced in January, 1984, that he would run for reelection, and much of the year was taken up with popular initiatives. He visited Beijing in April, where he signed an accord on nuclear arms development with the Chinese. He signed bills that effectively established a federal minimum age for alcohol consumption and reduced the federal deficit. From August to November, Reagan was preoccupied with campaigning against Democrat Walter Mondale, whom he debated twice and defeated handily (60-40 percent in the popular vote; 525-13 in the electoral college).

Reagan’s Second Term
In March, 1985, Soviet leader Chernenko died and Mikhail Gorbachev assumed power. Reagan warmed slowly, continuing to warn about missile threats and advocate negotiations. On October 24, 1985, Reagan addressed the United Nations on its fortieth anniversary about his hopes for a “fresh start” with the Soviet Union and a week later announced an initiative for further nuclear arms reductions. The “fresh start” theme carried through to Reagan’s watershed summit with Gorbachev in Geneva, Switzerland, in mid-November. On January 1, 1986, Reagan spoke to the Soviet people, and Gorbachev to the Americans, in the first of four annual radio speeches. After addressing the United Nations on progress in arms reductions in September, Reagan met with Gorbachev in Reykjavik, Iceland, in mid-October. There Gorbachev agreed in principle with Reagan’s proposal to reduce the stockpiles of nuclear weapons, even to eliminate them.

Visiting Europe again in June, 1987, for the G7 summit, Reagan stopped in Berlin. On June 12, he stood before the Brandenburg Gate and challenged the Soviet Union to release its communist grip on
Eastern Europe, famously saying, “Mr. Gorbachev, tear down this wall!” The two leaders arranged another summit and in December met in Washington, D.C., where they signed the Intermediate-Range Nuclear Forces (INF) Treaty, which arranged for the eventual elimination of an entire class of nuclear weapons. Gorbachev even acquiesced to on-site verification and softened his opposition to SDI, which was blocking progress on long-range (strategic) missile talks. The Senate approved the INF Treaty on May 27, 1988, and on June 1, Reagan and Gorbachev ratified it at the Moscow summit. The two leaders also began discussions on reducing conventional weapons. While in Moscow, Reagan gave an unprecedented speech defending political freedom at Moscow University and urged religious freedom while visiting Danilov Monastery. Reagan met one last time as president with Gorbachev on December 7, 1988, in New York City.

Preoccupied with and politically damaged by the Iran-Contra affair, and with control of the Senate lost, the Reagan administration made little headway in addressing the president’s domestic concerns. His confrontational attitude toward the federal budgeting process made Reagan many enemies in Congress, but a strong economy kept the issues of taxes and spending out of the limelight, especially after passage of the Tax Reform Act of 1986, which lowered taxes for Americans in the highest tax bracket. Reagan supported George H. W. Bush for the Republican nomination in 1988 and retired to California to write his memoirs after he handed power over to Bush in January, 1989.

Impact Reagan’s two terms in office created a sea change in U.S. foreign policy, reinvigorated the U.S. military, and proved a dominant factor in ending the Cold War. His administration introduced conflict and confrontation in the federal budgeting process, energized the political right in America, and placed the conservative agenda of smaller government, a leaner welfare system, and fiscal responsibility at the center of the national debate. ERTA, however, was his only unalloyed fiscal victory. His bungling of the Iran-Contra episode undermined his reputation, and his attempts to reduce the size of government and government spending foundered. Nevertheless, Reagan remained well liked for his humor, appealing personality, and ability to engage world leaders from British prime minister Margaret Thatcher to Soviet leader
Gorbachev. Justifiably or not, he will be remembered for presiding over the end of the Cold War era and for defending traditional, conservative values reflecting the American heartland. Further Reading
Brinkley, Douglas, ed. The Reagan Diaries. New York: HarperCollins, 2007. Noted presidential historian’s edition of Reagan’s diaries.
Cannon, Lou. President Reagan: The Role of a Lifetime. New York: PublicAffairs, 2000. Noted political journalist’s view of Reagan and his presidency.
Diggins, John P. Ronald Reagan: Fate, Freedom, and the Making of History. New York: W. W. Norton, 2007. Positive analysis of Reagan’s accomplishments in the public arena from a conservative perspective.
Kengor, Paul. The Crusader: Ronald Reagan and the Fall of Communism. New York: Regan Books, 2006. Very positive discussion of Reagan’s role in the events that led to the end of communism in Eastern Europe and the Soviet Union.
Lettow, Paul. Ronald Reagan and His Quest to Abolish Nuclear Weapons. New York: Random House, 2006. An Oxford scholar examines Reagan’s foreign policy as an expression of his desire to eliminate all nuclear weapons.
Reagan, Ronald. An American Life. New York: Simon & Schuster, 1999. Reagan’s autobiography.
Reagan, Ronald, et al. Reagan, in His Own Hand: The Writings of Ronald Reagan That Reveal His Revolutionary Vision for America. New York: Free Press, 2001. A compilation of Reagan’s unpublished writings that display his style and keen grasp of the issues and matters of the day.
Reeves, Richard. President Reagan: The Triumph of Imagination. New York: Simon & Schuster, 2005. Using a broad range of sources, this journalist vividly re-creates the Reagan White House with its successes and failures.
Troy, Gil. Morning in America: How Ronald Reagan Invented the 1980’s. Princeton, N.J.: Princeton University Press, 2007. Balances the accomplishments and contradictions in the influence of Reaganite conservatism on American society and culture.
Tygiel, Jules. Ronald Reagan and the Triumph of American Conservatism. New York: Longman, 2006. Despite the title, a fairly balanced view of Reagan’s accomplishments as president, in a series for student readers.
Joseph P. Byrne
See also Berlin Wall; Congress, U.S.; Economic Recovery Tax Act of 1981; Elections in the United States, 1980; Elections in the United States, 1984; Foreign policy of the United States; Grenada invasion; Haig, Alexander; Inflation in the United States; Intermediate-Range Nuclear Forces (INF) Treaty; Iran-Contra affair; Iranian hostage crisis; Meese, Edwin, III; Middle East and North America; Military spending; Moral Majority; Panama invasion; Reagan, Nancy; Reagan assassination attempt; Reagan Democrats; Reagan Doctrine; Reagan Revolution; Reaganomics; Reagan’s “Evil Empire” speech; Recessions; Soviet Union and North America; Stealth fighter; Strategic Defense Initiative (SDI); Tax Reform Act of 1986; USS Stark incident; Welfare.
■ Reagan assassination attempt

The Event U.S. president Ronald Reagan is shot by deranged gunman John Hinckley, Jr., in a failed assassination attempt
Date March 30, 1981
Place Washington Hilton Hotel, Washington, D.C.

Reagan’s courage and grace following the assassination attempt won the hearts of the American people and increased his popularity. Sympathy for Reagan after his injury may have prompted Congress to pass some of the president’s programs into law.

On a cold March day in 1981, newly elected president Ronald Reagan arrived at the Washington Hilton Hotel, a five-minute drive from the White House, to deliver a speech to a group of AFL-CIO union members. Thirty minutes later, Reagan exited the hotel and proceeded to a waiting limousine, where the scene soon erupted into chaos. As Reagan moved toward the presidential car, an armed assassin by the name of John Hinckley, Jr., appeared on the sidewalk. Reagan raised his left arm to wave to onlookers as the deranged gunman began firing his .22 caliber handgun. Hinckley fired six shots at the president; one bullet ricocheted off the car and hit Reagan in the chest. Secret Service men immediately pushed the president into the limousine and raced to the George Washington University Hospital. Three others, including a member of the Reagan administration and two law enforcement officials,
were also shot. Secret Service agent Timothy McCarthy jumped in front of the president, taking a bullet to the abdomen. He was not scheduled to work on March 30 but became part of that day’s protection detail when he lost a coin toss with another agent. White House press secretary James S. Brady had also accompanied Reagan to the Washington Hilton Hotel. He was standing a few feet away from the president when Hinckley began firing. One of the bullets struck Brady in the head, resulting in permanent brain damage. A third bullet hit metropolitan police officer Thomas Delehanty, who stood guard outside the hotel. The three victims lay on the sidewalk as security agents subdued Hinckley and held him against the hotel wall. Stunned members of the president’s entourage and bystanders tended to the wounded and waited for help to arrive.

Secret Service agents and police surround and overpower would-be assassin John Hinckley, Jr., as White House press secretary James Brady lies unconscious on the pavement on March 30, 1981. (AP/Wide World Photos)

John Hinckley, Jr. Hinckley was a twenty-five-year-old loner from Evergreen, Colorado. Seeking fame and the attention of actor Jodie Foster, he decided to kill the president of the United States. After watching Martin Scorsese’s film Taxi Driver (1976), Hinckley began to identify with the main character, played by Robert De Niro, who plots the assassination of a U.S. senator while wooing a prostitute played by Foster. Hinckley decided to repeat De Niro’s performance in real life, except that his victim would be the president. In 1979, Hinckley had stalked President Jimmy Carter but never made an attempt on his life. Two years later, he nearly killed Reagan. In 1982, Hinckley was tried for attempted murder. The court found him not guilty by reason of insanity. Hinckley was committed to St. Elizabeths Hospital in Washington, D.C.

Victims’ Recovery Reagan’s demeanor, both during and after the shooting, enhanced his public image. With a bullet lodged near his heart, he walked into the emergency room and even joked with doctors and nurses. Before going under anesthesia, Reagan good-naturedly asked his surgeons if they were Republicans. One doctor replied, “Today, Mr. President, we’re all Republicans.” Reagan underwent three hours of surgery and several weeks of recovery. His unflagging spirit cemented his image as a courageous president.

Hinckley’s other three victims also survived the shooting. Brady was the most seriously injured. Although he remained the White House press secretary throughout the Reagan administration, Brady never fully recovered from his wound. Suffering from permanent partial paralysis, he became an advocate for gun control. Brady and his wife, Sarah, formed the Brady Campaign to Prevent Gun Violence and began lobbying for stronger gun control laws.

Impact The assassination attempt inspired some states to pass laws limiting the use of the insanity plea in trials. In the 1980’s, twelve states adopted a “guilty, but mentally ill” plea. Four of these states changed their laws because of the Hinckley verdict.
New gun control laws were inspired by Brady. In 1993, President Bill Clinton signed the Brady Handgun Violence Prevention Act, also known as the Brady Bill, which placed restrictions on the sale of handguns.

Further Reading
Johnson, Haynes. Sleepwalking Through History: America in the Reagan Years. New York: W. W. Norton, 1991. This critical look at Reagan provides a few pages on the assassination attempt, focusing on how the event enhanced the president’s image.
Low, Peter W., John Calvin Jeffries, and Richard J. Bonnie. The Trial of John W. Hinckley, Jr.: A Case Study in the Insanity Defense. Anaheim, Calif.: Foundation Press, 1986. A brief account of the Hinckley trial and its aftermath. Provides a detailed look at the use of the insanity plea.
Schaller, Michael. Reckoning with Reagan: America and Its President in the 1980’s. New York: Oxford University Press, 1992. Provides a readable introduction to the Reagan presidency and includes a brief description of the assassination attempt.
Rhonda L. Smith

See also Elections in the United States, 1980; Elections in the United States, 1984; Reagan, Ronald; Reagan Revolution; Reaganomics.
■ Reagan Democrats

Definition Voters who traditionally voted Democratic but crossed party lines to vote for Republican presidential candidate Ronald Reagan in 1980 and 1984
In both of his successful presidential campaigns, Ronald Reagan attracted voters from constituencies traditionally identified as Democratic voting blocs, and this crossover support contributed to his victories. The perception that the Democratic Party had become too liberal on a variety of issues led some Democrats to abandon their party and vote for Reagan.

Republican candidate Ronald Reagan won a significant victory over the incumbent Democratic president Jimmy Carter in 1980 and a landslide victory over Carter’s former vice president Walter Mondale in 1984. In both of these elections, people from voting blocs that were generally considered to be solidly Democratic broke with that party and voted for Reagan.
Beginning with Franklin D. Roosevelt’s presidency (1933-1945), the Democratic Party had built an urban-ethnic voting bloc that, combined with its traditional strength among white Southern voters, gave that party significant power in national elections. White ethnic voters from large cities, such as Irish Americans and Italian Americans, began to move into the Democratic Party in the 1930’s. Roosevelt also attracted significant numbers of northern black voters, who had traditionally voted Republican because of the Republican Party’s connection to President Abraham Lincoln and emancipation.

In the 1960’s, many voters believed that the Democratic Party had moved too far to the left on many issues. President Richard M. Nixon had some success in attracting these disaffected Democrats in 1968 and even more in 1972. Reagan built upon this trend. Political observer Richard Brookhiser, citing research conducted by the Hamilton, Frederick, and Schneider consulting firm, identified four specific groups of voters that made up these “Reagan Democrats”: men and women in their fifties and sixties, urban Roman Catholics, young males, and Southern whites. These voters were attracted to Reagan, or, alternatively, put off by the Democratic candidates, because of positions on social issues such as crime, abortion, and education, as well as economic policies. Reagan also succeeded in attracting a significant number of voters from the ranks of organized labor, although the leadership of the unions remained firmly Democratic and usually endorsed the Democratic candidates.

Impact Political scientists believe that in most elections, about 40 percent of the electorate is firmly identified with each of the major parties. This means there is about a 20 percent “swing vote” that could go to either candidate and will often represent the margin of victory. Reagan Democrats are estimated to have made up about 10 percent of the electorate, or roughly half of this swing vote, in the 1980 and 1984 elections. Long after Reagan left office, political observers and strategists continued to debate the prospects for later Republican candidates to appeal to this same constituency of disaffected Democrats.
Barrett, L. “Reagan Democrats’ Divided Loyalties.” Time, October 31, 1988, 14.
Brookhiser, Richard. “The Democrat’s Dilemma.” National Review, December 17, 1990, 40-42.
Mark S. Joy
See also Elections in the United States, 1980; Elections in the United States, 1984; Reagan, Ronald; Reagan Revolution; Reaganomics.
■ Reagan Doctrine

Definition U.S. foreign policy to defeat communist expansion by supporting anticommunist forces in Africa, Asia, and Latin America
This policy of President Ronald Reagan was a significant shift from the strategy of containment of communism that had been the dominant foreign policy of most American administrations since World War II. Some credit the pressure from this policy with the fall of the Soviet Union and the end of the Cold War. Although the term “Reagan Doctrine” was not coined until 1985, the policy of attempting to push back communism was always part of the Ronald Reagan administration’s foreign policy. President Reagan inherited President Jimmy Carter’s policy of actively supporting forces opposing the Soviets and the Soviet-installed government in Afghanistan. Many of Reagan’s conservative allies sought ways of expanding this policy to other locations. Initially, most of the support was handled by covert Central Intelligence Agency (CIA) operations, with the second phase being the expansion into Angola and Nicaragua. However, as Congress became aware of this use of the CIA, it began to seek oversight of the operations. After this time, Reagan’s support of anticommunists became part of his public policy. Politically, the most controversial operation was his support of the Contras against the Sandinista National Liberation Front in Nicaragua. Congress’s rejection of this policy led to the Iran-Contra affair. However, the president was able to survive this political setback and continued to expand the list of countries in which the United States challenged communist-supported governments. Reagan often used the term “freedom fighters” to describe the forces opposing these governments. In many ways, the Reagan Doctrine reflected the
lesson learned by previous administrations: that it was difficult and expensive to defeat insurgent movements seeking to overthrow allied governments in the developing world, while it was relatively inexpensive to support insurgencies. Just as the Soviet Union had supported such movements in earlier decades against American allies, Reagan supported any force he thought capable of defeating the Soviet allies. In 1986, he specifically mentioned the countries of Afghanistan, Angola, Cambodia, Ethiopia, and Nicaragua as targets of this move against communism. Impact While the Reagan Doctrine had some domestic political repercussions, it was relatively successful as a foreign policy. The criticism focused on the facts that the forces supported by the United States did not follow internationally recognized standards of conduct and that most of the conflicts were fairly local in scope without a direct impact on American interests. However, President Reagan viewed the world through the context of the Cold War and acted according to the dictate of opposing all manifestations of communism. During his presidency, or shortly thereafter, all five countries specifically targeted in his 1986 speech showed changes that signaled the defeat of the Soviet Union and its allies. Only Angola did not have a total change of government as a result of this doctrine. In more recent years, the one major criticism of this policy pointed to the administration’s support of the mujahideen in Afghanistan, which led to the Taliban governing that country, with negative repercussions for the United States in the next two decades. Further Reading
Lagon, Mark P. The Reagan Doctrine: Sources of American Conduct in the Cold War’s Last Chapter. Westport, Conn.: Praeger, 1994.
Scott, James M. Deciding to Intervene: The Reagan Doctrine and American Foreign Policy. Durham, N.C.: Duke University Press, 1996.
Donald A. Watt

See also Africa and the United States; Cold War; Foreign policy of the United States; Grenada invasion; Iran-Contra affair; Latin America; Nuclear winter scenario; Reagan, Ronald; Reagan’s “Evil Empire” speech.
■ Reagan Revolution

The Event A new set of federal priorities and initiatives inspired by conservative and neoconservative principles redirects U.S. foreign and domestic policies
The Reagan Revolution was a major realignment of U.S. federal governmental policy along lines that were generally politically, economically, and socially conservative. Underlying policy successes during Reagan’s two terms, this conservative trend continued to influence U.S. policy and politics into the twenty-first century. By the 1980 election cycle, the American electorate had grown weary of pessimism engendered by defeat in Vietnam, the failures of the Great Society programs of the 1960’s, the federal scandals of the 1970’s, fear of the Soviet Union, and disastrous foreign and economic policies of the Jimmy Carter administration. Ronald Reagan ran on two optimistic principles—economic recovery and revitalization of U.S. military strength—and handily defeated Carter. Reagan’s young administration quickly began reshaping U.S. policies. Economic growth meant lowering tax rates and reducing government spending and regulation, both conservative goals that would create free market prosperity and lead to “smaller government,” a program often called Reaganomics. Simultaneously, however, Reagan required an expensive military buildup that increased troop strength, developed new weapons systems (the stealth bomber and Strategic Defense Initiative), and generally increased the aggressive stance of the United States toward the Soviet Union and its satellites. Ultimately, Reagan sought the elimination of all nuclear weapons, but, distrusting Soviet good faith, he sought this goal by starting an arms race in which, he gambled, the weaker economy of the Soviet Union could not compete. This anticommunist stance also led to a successful incursion into Grenada and domestically scandalous support for the anti-Sandinista Contras in Nicaragua. Domestically, the Reagan White House exercised what it called New Federalism. It sought to limit abortion rights, civil rights programs such as affirmative action, and the funding of certain social programs while protecting private property rights, espousing victims’ rights over those of criminals, and strengthening the prerogatives of the executive branch that had been attenuated in the wake of Watergate.
Impact The successes of the Reagan Revolution galvanized conservative politics in the United States, slowed the advance of leftist statism and socialist-inspired social programming, seriously weakened the Soviet Union, and established the United States’ position as the unrivaled world superpower for the remainder of the twentieth century.

Subsequent Events Arguably the most significant subsequent event was the disintegration of communist regimes in the Soviet Union and Eastern Europe beginning in 1989. Domestically, the solid Republican victories in the House and Senate in 1994, which held until 2006, also stemmed from the Reagan Revolution.

Further Reading
Reagan, Ronald. An American Life. New York: Simon & Schuster, 1999.
Schwab, Larry M. The Illusion of a Conservative Reagan Revolution. Somerset, N.J.: Transaction, 1991.
Thornton, Richard C. The Reagan Revolution. 2 vols. Victoria, B.C.: Trafford, 2006.
Joseph P. Byrne

See also Berlin Wall; Economic Recovery Tax Act of 1981; Elections in the United States, 1980; Foreign policy of the United States; Grenada invasion; Inflation in the United States; Intermediate-Range Nuclear Forces (INF) Treaty; Iran-Contra affair; Military spending; Reagan, Ronald; Reagan Doctrine; Reaganomics; Reagan’s “Evil Empire” speech; Soviet Union and North America; Stealth fighter; Strategic Defense Initiative (SDI); Tax Reform Act of 1986.
■ Reaganomics

Definition An economic policy that emphasizes the downsizing of government and of costly government-supported social programs whose curtailment permits reductions in taxation
The supply-side economic policies of the Reagan administration resulted initially in a severe recession and in a drastic reduction in social services that adversely affected the poor, while they simultaneously ran up huge federal deficits. However, before Reagan left office, the American economy had improved significantly. When he was inaugurated on January 20, 1981, Ronald Reagan became president of a nation beset by
overwhelming economic problems that had been growing for almost two decades. These problems resulted in towering inflation, interest rates that approached 20 percent, and widespread unemployment. During his run for the presidency against incumbent Jimmy Carter, Reagan, who considered runaway inflation the nation’s most salient economic problem, proposed a revolutionary plan for dealing with the economy.

This plan, based largely on the economic theories of Arthur Laffer, a University of Chicago professor of economics, emphasized lowering taxes to stimulate the economy. Laffer had gained his tenured professorship at Chicago by claiming to hold a Ph.D. in economics, which he did not hold. Laffer took seriously the pronouncement of the eighteenth century French philosopher Baron de Montesquieu that the government that overtaxes will eventually erode the sources of its revenue. Reagan, who majored in economics at Eureka College, was under considerable pressure to promote new ideas for dealing with the nation’s economic problems. Laffer’s theories intrigued him.

During his run for the presidential nomination, Reagan articulated a revolutionary economic policy based largely on Laffer’s model. He reasoned that money from tax reductions, once released, would encourage people to save (thereby creating new capital) and to use that capital to invest, which, in turn, would lead to corporate growth and to an improved and much strengthened economy, as would permitting corporations to accelerate their depreciation credits.

Prior to Reagan’s nomination, George H. W. Bush, who also sought the party’s nomination for the presidency, labeled Reagan’s views “voodoo economics.” Bush subsequently became Reagan’s vice president, but he was not comfortable with Reagan’s brand of “supply-side” economics.

David Stockman, who became Reagan’s budget director, also had strong reservations about Reaganomics. He knew what the numbers indicated as he worked through the complexities of a huge budget that in many aspects was too complex to be fully comprehensible. Stockman expressed his reservations forthrightly in an article in the December, 1981, issue of The Atlantic Monthly in which he contended, before it was ever sent to the Capitol for congressional approval, that Reagan’s economic plan would not work.

President Ronald Reagan explains his plan to cut taxes during a televised address in July, 1981. (Ronald Reagan Presidential Library)

Three Salient Elements of Reaganomics The fundamental aspects of Reaganomics had considerable public appeal, because they appeared, superficially at least, to offer hope of improving the economy painlessly. Contradictions inherent within the three elements of Reagan’s proposed reforms were overlooked or ignored by many in the administration who examined them. Reagan outlined his plan to the American people in a presidential address on February 18, 1981, less than a month into his presidency.

To begin with, Reagan called for a significant reduction in the rates at which individuals and corporations were to be taxed. He reasoned that such reductions would increase the average person’s disposable income and would put more money into circulation. Corporations, released from the impediment of high taxation, would plow more money into expansion, thereby creating jobs and helping to alleviate unemployment. Reagan called for a 30 percent tax cut across the board, 10 percent per year over three years.
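The 30 percent figure is the simple sum of the three annual cuts. If each 10 percent reduction is instead applied to the rates left by the previous year’s cut, the cumulative reduction compounds to slightly less; the following calculation is offered only as an arithmetic illustration of that distinction, not as a claim about how the plan’s drafters computed the figure:

$$1 - (1 - 0.10)^3 = 1 - 0.729 = 0.271 \approx 27\ \text{percent}$$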
Related to this tax cut was the second element of his proposal, that of substantially reducing federal oversight of corporations and permitting accelerated depreciation. Reagan reasoned that implementing this deregulation plan would free corporations from many complicated bureaucratic rules and regulations that struck him as burdensome and unnecessarily restrictive.

The third and probably most controversial element of his proposal was to cut drastically the size of the federal bureaucracy, much of which he considered wasteful and inefficient. Whereas the economic policies of his predecessors from Franklin D. Roosevelt to Jimmy Carter had championed the establishment of government welfare programs and other stimuli to a sluggish economy, Reagan held an opposite view. He argued for the elimination or severe reduction of many of the social programs designed to help the poor, including welfare for dependent children, institutionalization and custodial care for the mentally ill, and financial aid for the unemployed. He was convinced that such cuts would reduce unemployment and that, although corporations and the more fortunate members of society would not be damaged by these cuts, the poor would eventually benefit as well from what Reagan’s critics cynically termed the “trickle-down” effect.

Along with all of these mandates, Reagan called for significantly increased spending on national defense. The contradiction inherent in his proposals, which many advisers pointed out to Reagan, was the seeming impossibility of increasing substantially the money spent on national defense in the face of the tax reductions the president proposed. Reagan countered such objections by pointing out that his policies would make people more productive, would reduce materially the unemployment rate, and would result in bringing increased revenue to the federal government despite the tax reductions.

Reaction to Reagan’s proposals was immediate. Members of racial minorities raged against what they considered benefits to the middle and upper classes at the expense of the less fortunate. The notion of a “trickle-down” effect did little to assuage their concerns.

The Economic Recovery Tax Act of 1981 On March 30, 1981, two months into his term, Reagan was shot and almost killed by John Hinckley, Jr. This near-fatal event and the president’s upbeat attitude
following it helped to endear the president to many Americans who until then had not been supportive of him. A major increase in his approval rating made it possible for him to put many of his policies into effect.

In August, 1981, Congress passed the Economic Recovery Tax Act of 1981. This legislation reduced individual and corporate income taxes drastically, reducing by $33 billion the amount of tax revenue coming into the government for fiscal 1982. It represented the largest tax cut in American history. Even as this legislation was being passed, the United States slumped into a recession. On one hand, the recession cooled the raging inflation; on the other, it increased the ranks of the unemployed. Many small businesses were forced into bankruptcy. Large reductions in the amount of tax money coming into federal coffers led inevitably to an alarming federal deficit.

Looming Federal Budget Deficits
By 1982, the growth of the deficit was so great that Congress was left with no choice but to raise taxes. Against a federal budget deficit of $110.7 billion in 1982, however, the tax increase Congress imposed—the largest in the history of the country—was insufficient: It amounted to $91 billion, leaving a shortfall of nearly $20 billion.

As early as the fall of 1981, a poll of Americans revealed that more than 60 percent of those questioned considered themselves in worse condition economically than they had been during the preceding administration. By 1982, unemployment in the United States had reached a staggering eleven million people, the highest number since 1941.

The economy began to improve rapidly in 1983, but the federal budget deficit reached another new high of $195 billion in that year. This was followed by yet another record-setting deficit in 1984. The deficits run up during the first three years of the Reagan administration exceeded half a trillion dollars. Distraught presidential economic advisers recommended that the administration cut back on defense spending as a means of controlling the runaway federal deficits.
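As a check on the figures above, using only the numbers given in this article, the shortfall follows directly, and the “half a trillion dollars” total implies a 1984 deficit of at least roughly $194 billion (the symbol $D_{1984}$ is introduced here merely for convenience):

$$\$110.7\ \text{billion} - \$91\ \text{billion} = \$19.7\ \text{billion} \approx \$20\ \text{billion}$$

$$D_{1984} > 500 - 110.7 - 195 = 194.3\ \text{(billions of dollars)}$$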
The Cost of Defense Spending In the face of reduced tax revenues, the Reagan administration continued to pour huge sums into defense spending. Reagan, a staunch anticommunist, noted that the Soviets were spending 50 percent more than the United States on defense. Because he was convinced
that this imbalance had to be eliminated, Reagan turned a deaf ear to those who recommended that he attempt to balance the budget by making major cuts in this colossal drain on the federal coffers. Even though the economy revived substantially in 1984 and inflation waned, the deficit continued to grow, and the federal bureaucracy that Reagan was on record as wanting to shrink became larger.

Audiences that attended his public appearances and listened to his speeches spurred the president on in his efforts to increase defense spending. His charisma and personal charm allayed many of the fears people might have harbored about his economic policies. Whenever Reagan called for stronger armed forces in his speeches, he elicited cheers and applause from his audiences. Whereas his economic advisers tried to convince him of the stark realities associated with uncontrolled overspending, an adoring and largely uninformed public gave him the impetus to continue the policies he was pursuing against the better judgment of those who were dealing at first hand with numbers that simply did not add up.

Impact The immediate impact of Reaganomics was devastating to the poorer members of American society. Whereas affluent citizens and corporations benefited from the newly imposed economic policies, many of the poor suffered greatly. Reagan’s most strident critics accused him of being indifferent to the problems of the poor. The cuts he made in welfare programs designed to help poor people and those incapable of helping themselves led to a great increase in poverty, particularly in the nation’s large cities. Thousands of people who needed custodial care were forced onto the streets, resulting in a staggering increase in homelessness that continued well into the twenty-first century. Bread lines and soup kitchens resembling those seen during the Great Depression of the 1930’s began to spring up throughout the country. Thousands of people slept in public parks, beneath highway bridges, or wherever they could find shelter. In extreme weather, many people died from exposure, even though efforts were made through such charitable agencies as the Salvation Army and the Red Cross to provide them with shelter from extreme heat and cold.

Because a fundamental tenet of Reaganomics was to reduce governmental regulation of big business, many environmental regulations imposed by earlier administrations were relaxed through the connivance of Reagan and his secretary of the interior, James G. Watt. The administration approved measures that promoted the corporate use of federal lands for drilling, mining, and harvesting timber.

A quarter century after Ronald Reagan left the presidency, many results of his economic policies—Reaganomics—were still evident.

Further Reading
Campagna, Anthony C. The Economy in the Reagan Years: The Economic Consequences of the Reagan Administration. Westport, Conn.: Greenwood Press, 1994. Chapters 3, 8, and 9 deal respectively with the beginnings, implementation, and consequences of Reagan’s economic policies.
Johnson, Dary. The Reagan Years. San Diego, Calif.: Lucent Books, 2004. In this overview directed to juveniles, Johnson devotes chapter 2 to Reaganomics and covers the subject well.
Nester, William R. A Short History of American Industrial Policy. New York: St. Martin’s Press, 1998. Chapter 6, “Reaganomics Versus Clintonomics, 1981-2000,” is particularly relevant. A compelling contrastive consideration.
Smith, Roy C. Comeback: The Restoration of American Banking Power in the New World Economy. Boston: Harvard Business School Press, 1993. Chapter 1, “Reaganomics: Vision or Voodoo?,” gives objective overviews of the economic policies of the Reagan administration.
Stockman, David A. The Triumph of Politics: The Inside Story of the Reagan Revolution. New York: Avon, 1987. Reagan’s former budget director explains the pitfalls of Reaganomics and exposes the inherent weaknesses of the economic policy.
Strobel, Frederick R. Upward Dreams, Downward Mobility: The Economic Decline of the American Middle Class. Savage, Md.: Rowman & Littlefield, 1992. An interesting, although neither detached nor objective, view of the topic is found in chapter 8, “Reaganomics: A Wolf in Sheep’s Clothing.”
R. Baird Shuman

See also Business and the economy in the United States; Conservatism in U.S. politics; Economic Recovery Tax Act of 1981; Homelessness; Reagan, Ronald; Reagan Doctrine; Recessions; Unemployment in the United States.
■ Reagan’s “Evil Empire” speech

The Event President Ronald Reagan identifies the Soviet Union as an “evil empire”
Author Ronald Reagan (1911-2004), reading text written by speechwriter Anthony R. Dolan (b. 1948)
Date March 8, 1983
Place Orlando, Florida

Reagan’s speech was intended to answer his critics by indicating that the Soviet Union was so unacceptable a regime that extraordinary measures should be taken to defeat it.

President Ronald Reagan proposed increases in the military budget to deploy Cruise and Pershing II intermediate-range missiles in Europe in order to match the Soviet Union’s SS-20 missiles, thus raising suspicions that he might launch nuclear war against the Soviet Union and frightening many international observers into calling for a freeze on the deployment,
development, and manufacture of nuclear weapons. Speaking at a convention of the National Association of Evangelicals in the Sheraton Twin Towers Hotel, Orlando, Florida, Reagan sought to justify his position by emphasizing the nature of the enemy as an “evil empire” that must be stopped at all costs. The media and the public largely supported this assertion. As a result, the nuclear freeze campaign lost momentum: A congressional committee had just approved a resolution advocating a nuclear freeze, but the proposal was subsequently dropped. The policy of merely deterring the Soviet Union was questioned as an unproductive continuation of the Cold War stalemate between the capitalist West and the communist East. Dissidents in the Soviet Union and in Soviet-controlled Eastern Europe were emboldened to organize resistance against communist regimes in their countries.

In response, the Soviet Union wanted to match
The “Evil Empire” Speech The following excerpts are taken from President Ronald Reagan’s speech to the National Association of Evangelicals, delivered on March 8, 1983: During my first press conference as president, in answer to a direct question, I point[ed] out that, as good Marxist-Leninists, the Soviet leaders have openly and publicly declared that the only morality they recognize is that which will further their cause, which is world revolution. I think I should point out I was only quoting Lenin, their guiding spirit, who said in 1920 that they repudiate all morality that proceeds from supernatural ideas—that’s their name for religion—or ideas that are outside class conceptions. Morality is entirely subordinate to the interests of class war. And everything is moral that is necessary for the annihilation of the old, exploiting social order and for uniting the proletariat. Well, I think the refusal of many influential people to accept this elementary fact of Soviet doctrine illustrates a historical reluctance to see totalitarian powers for what they are. . . . This doesn’t mean we should isolate ourselves and refuse to seek an understanding with them. I intend to do everything I can to persuade them of our peaceful intent, to remind them that it was the West that refused to use its nuclear monopoly in the forties and fifties for territorial gain and which now proposes a 50-percent cut in strategic ballistic missiles and the elimination of an entire class of land-based, intermediate-range nuclear missiles. At the same time, however, they must be made to understand we will never compromise our principles and standards. We will never give away our freedom. We will never abandon our belief in God. And we will never stop searching for a genuine peace. But we can assure none of these things America stands for through the so-called nuclear freeze solutions proposed by some. . . . So, in your discussions of the nuclear freeze proposals, I urge you to beware the temptation of pride— the temptation of blithely declaring yourselves above it all and label both sides equally at fault, to ignore the facts of history and the aggressive impulses of an evil empire, to simply call the arms race a giant misunderstanding and thereby remove yourself from the struggle between right and wrong and good and evil.
America’s military expansion but lacked the funds to do so.

President Ronald Reagan delivers his “Evil Empire” speech to the National Association of Evangelicals on March 8, 1983. (Ronald Reagan Presidential Library)

Mikhail Gorbachev, after he was elected as General Secretary of the Central Committee of the Communist Party of the Soviet Union in 1985, tried to persuade Reagan to stop the American deployment of an antiballistic missile system, known as the Strategic Defense Initiative (SDI), as well as the proposed increase in intermediate-range missiles in Western Europe, but Reagan at first refused. The two leaders reached such rapport in their conversations at the Reykjavik Summit during 1986, however, that they agreed in principle that all ballistic missiles should ultimately be abolished. They launched negotiations to reduce the number of nuclear weapons on both sides, resulting in the Intermediate-Range Nuclear Forces (INF) Treaty of 1987. Their discussions also led to negotiations for agreements known as the Strategic Arms Reduction Treaties (START I and II). SDI was placed on the back burner. Finalization of the agreements was left to Reagan’s successors. In 1988, Reagan was asked whether he still regarded the Soviet Union as an “evil empire.” He responded, “No,” believing that negotiations with Gorbachev had brought about a new era in East-West cooperation.
Impact Some observers credit the “evil empire” speech with starting a chain reaction of events that led to the dismantling of the Berlin Wall separating communist East Germany from capitalist West Germany in 1989 and ultimately to the end of the Cold War by the end of the 1980’s. Further Reading
Gaddis, John Lewis. The Cold War: A New History. New York: Penguin, 2005. Credits Reagan’s speech as a turning point in the Cold War, abandoning the policy of deterrence.
Johns, Michael. “Seventy Years of Evil: Soviet Crimes from Lenin to Gorbachev.” Heritage Foundation Policy Review, Fall, 1987. Cites 208 examples of “evil” actions by the Soviet Union from 1917 onward, thereby defending the use of the term “evil empire” against Reagan’s critics.
Michael Haas

See also Berlin Wall; Cold War; Congress, U.S.; Foreign policy of the United States; Intermediate-Range Nuclear Forces (INF) Treaty; Military spending; Reagan, Ronald; Reykjavik Summit; Strategic Defense Initiative (SDI).
■ Recessions

Definition Sustained declines in economic activity lasting six months or longer
The recessions of the early 1980’s resulted in the highest unemployment rates in forty years, causing millions of workers and their families to experience substantial declines in their standard of living.

There were two recessions during the 1980’s. The first occurred from January to July, 1980. The second began in July, 1981, and lasted through November, 1982.

Following a period of little economic growth in 1979, overall economic activity declined in early 1980. Because of the high rate of increase in consumer prices (inflation), the Federal Reserve imposed severe restraints on the availability of credit. Partly as a result, interest rates rose rapidly to peaks of 14-20 percent in March and April, 1980. Consumer spending fell drastically, and gross national product declined at a 10 percent annual rate in the second quarter of 1980. When the Federal Reserve became aware of the negative economic impact of very high interest rates, it removed the credit constraints beginning in May, 1980. The decline in economic activity came to an end in July, 1980, and for the remainder of the year retail sales (including automobile expenditures) and home construction gradually increased. Because the recession lasted for only six months, its overall impact was moderate. For example, industrial production declined 8.5 percent and unemployment rose by two percentage points. Of the seven post-World War II recessions up to that time, the 1980 recession had the smallest effect on the economy as a whole.

The recovery that began in August, 1980, was one of the shortest on record, lasting only eleven months. Because it was a relatively weak recovery, unemployment barely declined from the recession peak. Reaching a level of 7.8 percent in July, 1980, it had fallen only to 7.4 percent by July, 1981. The rate of inflation remained above 10 percent in 1980 and early 1981. Partly because of high inflation, monetary policy remained restrictive. The combination of high inflation and restrictive monetary policy pushed the prime interest rate to 20 percent by spring, 1981, compared to 11 percent in mid-1980. The high interest rates were responsible for a sharp decline in purchases of new homes and automobiles. Consumer spending on such items as furniture and household equipment (durable goods) as well as home construction is responsive to interest rates. These expenditure declines were major causes of the sixteen-month recession that began in July, 1981, and resulted in an increase in unemployment to the highest levels since 1941.

The 1981-1982 Recession
The second recession of the 1980's was among the most severe of the post-World War II period. By November, 1982, twelve million people were unemployed. This level was 50 percent more than in the third quarter of 1981 and nearly double the number of unemployed at the beginning of the 1980 recession. Industrial production fell 12.5 percent during the 1981-1982 recession. Home construction in 1982 was 50 percent less than between 1977 and 1979, the most recent period of general prosperity.
By the fall of 1982, the prime rate of interest had fallen to just over 13 percent. The interest rate decline was the result of easier monetary policy, somewhat lower inflation, and a decline in demand for business and consumer loans. Moreover, in August, 1981, the Economic Recovery Tax Act became law. One major feature was a 25 percent reduction in individual income taxes, consisting of a 5 percent cut in October, 1981, a 10 percent reduction in July, 1982, and a further 10 percent cut in July, 1983. The tax cut, plus increased government purchases of goods and services and the rise in transfer payments, led to substantial increases in disposable income. This factor, combined with lower interest rates, led the economy out of recession.
From 1983 through 1985, the economy experienced substantial growth. Employment increased by more than nine million, and business investment saw the largest increase of any comparable span in the post-World War II era. Interest rates declined five percentage points from their 1981 peaks, and home mortgage rates were down by seven percentage points. Inflation was only about one-third of the level reached in the early 1980's. There were no periods of economic decline from 1985 to 1989. The next recession, a relatively mild one, did not occur until 1990-1991.
Impact Frequently, a decline in economic activity has a political impact. For example, when President Jimmy Carter ran for reelection in 1980, the effects of the 1980 recession were still being felt. This contributed to Ronald Reagan's victory over Carter in the 1980 presidential election. By contrast, when President Reagan ran for reelection against Walter Mondale in 1984, the economy was expanding vigorously and the 1981-1982 recession was a fading memory. This contributed to Reagan's reelection.
The major economic impact of the 1980 and 1981-1982 recessions was the rise in unemployment, particularly in the manufacturing and construction industries. Manufacturing unemployment was 5.6 percent in 1979 and 12.3 percent in 1982, an increase of 6.7 percentage points. In construction, unemployment rose from 10.3 percent in 1979 to 20.0 percent in 1982. Because most of the workers in these hard-hit industries were men, the female unemployment rate rose more slowly than the male rate from 1979 to 1982. For women, the rate rose from 6.8 percent in 1979 to 9.4 percent in 1982; among men, it rose from 5.1 percent in 1979 to 9.9 percent in 1982. Among all workers, African Americans suffered the greatest increase in unemployment. In 1979, the unemployment rate for African Americans was 12.3 percent; it rose to 18.9 percent in 1982 and 19.5 percent in 1983. For white workers, the unemployment rate was 5.1 percent in 1979, 8.6 percent in 1982, and 8.4 percent in 1983.
Further Reading
Glasner, David, ed. Business Cycles and Depressions: An Encyclopedia. New York: Garland, 1997. An exhaustive study of the history of business cycles and the economists who wrote about them.
Kurian, George, ed. Datapedia of the United States, 1790-2005. 2d ed. Lanham, Md.: Bernan Press, 2005. Extensive statistical information on the impact of business cycles. Contains important data not readily available elsewhere.
Samuelson, Paul, and William Nordhaus. Economics. 17th ed. New York: McGraw-Hill, 2001. Excellent discussion of the causes of business cycles. Written at a basic level.
Alan L. Sorkin
See also Business and the economy in the United States; Economic Recovery Tax Act of 1981; Inflation in the United States; Reaganomics; Unemployment in the United States; Unions.

■ Regan, Donald
Identification Secretary of the Treasury, 1981-1985, and White House chief of staff, 1985-1987
Born December 21, 1918; Cambridge, Massachusetts
Died June 10, 2003; Williamsburg, Virginia
As President Ronald Reagan's first secretary of the Treasury, Regan was a major architect of the supply-side economic policies that became known as Reaganomics.
Donald Regan graduated from Harvard University in 1940. He served in the U.S. Marine Corps in World War II, attaining the rank of lieutenant colonel. After the war, he joined the Merrill Lynch investment firm, where he eventually rose to become president of the firm in 1968 and chairman and chief executive officer of Merrill Lynch, Pierce, Fenner and Smith in 1971. From 1973 to 1975, he also served as vice chair of the New York Stock Exchange.
Secretary of the Treasury Donald Regan listens as President Ronald Reagan whispers in his ear in the White House Rose Garden in June, 1981. (AP/Wide World Photos)
At the Department of the Treasury, Regan helped to craft the Economic Recovery Tax Act of 1981 and what became the Tax Reform Act of 1986. These bills
provided significant tax cuts, which, according to the supply-side economic theories favored by the Reagan administration, were believed to benefit the economy by stimulating spending and investment by businesses and wealthy individuals. Overall, Regan pursued policies of business deregulation and the promotion of competition that exhibited his deep faith in the free enterprise system.
In 1984, Regan decided to leave the Treasury Department because of his concerns about leaks to the press from within the White House and an atmosphere of mistrust among the president's major advisers. The president urged him to stay on, and in an unusual maneuver, Regan and White House chief of staff James Baker switched jobs in early 1985—Baker became secretary of the Treasury, and Regan became the new White House chief of staff.
At the White House, personality conflicts soon emerged between Regan and First Lady Nancy Reagan. More seriously, Regan was soon dogged by the Iran-Contra affair. In late 1986, news reports revealed that members of Reagan's administration had sold arms to Iran and funneled the proceeds to the anticommunist Contras in Nicaragua. Regan was accused of trying to obstruct the investigation into the Reagan administration's involvement in these arms sales, and he resigned as chief of staff in February, 1987. After leaving the White House, he published his memoirs, For the Record. His allegations that both Ronald and Nancy Reagan consulted a personal astrologer before making major decisions (denied by the Reagan family) further estranged him from the First Family.
Impact Although he left the chief of staff position under a cloud, Regan had a major impact on the economic policies of the Reagan administration because of his leadership in the Department of the Treasury during Reagan's first term as president.
Further Reading
Noonan, Peggy. What I Saw at the Revolution: A Political Life in the Reagan Era. New York: Ivy Books, 1990. Regan, Donald T. For the Record: From Wall Street to Washington. New York: St. Martin’s Press, 1989. Mark S. Joy See also
Business and the economy in the United States; Iran-Contra affair; Reagan, Nancy; Reagan, Ronald; Reagan Revolution; Reaganomics; Tower Commission.
■ Rehnquist, William H. Identification
U.S. Supreme Court justice, 1972-1986, and chief justice of the United States, 1986-2005
Born October 1, 1924; Milwaukee, Wisconsin
Died September 3, 2005; Arlington, Virginia
Rehnquist was a conservative Supreme Court justice who favored federalism, states' rights, business, and religion.
President Richard M. Nixon appointed William H. Rehnquist to the U.S. Supreme Court on December 10, 1971. As an associate justice, Rehnquist quickly established himself as the most conservative member of the Warren E. Burger Court. Unwilling to concede on many issues, he was frequently the lone dissenter. Rehnquist could be counted on to vote in favor of states, business, religious freedom, capital punishment, and antiabortion policies, and against the expansion of the Fourteenth Amendment guarantee of equal protection for all citizens.
President Ronald Reagan nominated Rehnquist to fill the chief justice position when Burger retired in 1986. The Senate confirmed his nomination, and he was sworn in on September 26, 1986. Antonin Scalia, who succeeded to Rehnquist's seat as associate justice, has been viewed as even more conservative than Rehnquist. With his ascension to chief justice, Rehnquist was able to shift the ideological focus of the Court. Between 1986 and the end of the decade, Rehnquist slowly drove a wedge between the public's expectation that the Court would continue to deliver liberal-minded decisions and the reality that it was becoming increasingly conservative. It should be noted that the true extent of the conservative revolution in the Court was not fully witnessed until Clarence Thomas was appointed in 1991.
In spite of his conservative views, Rehnquist was not afraid of joining and occasionally writing considerably liberal decisions. For instance, in Meritor Savings Bank v. Vinson (1986), Rehnquist wrote the majority opinion, which expanded the Civil Rights Act of 1964 to cover hostile-environment sexual harassment and include protections against the psychological aspects of harassment in the workplace.
Rehnquist was also an advocate for patriotism. He wrote the primary dissenting opinion in the highly controversial flag burning case Texas v. Johnson (1989), in which he argued that the flag was
much more than a symbol. He drew inspiration from the records of the Continental Congress, American flag history, the national anthem, and the "Concord Hymn" when articulating his disgust for flag burning. His arguments resonated with the American public in a way that few Court opinions or decisions have managed to achieve.
Chief Justice William H. Rehnquist. (Supreme Court Historical Society)
Impact Rehnquist was a strong influence on the U.S. Supreme Court. He advanced a conservative approach to interpreting the U.S. Constitution and urged the Court to follow. After becoming chief justice, he sought to make the Court more collegial in an effort to decrease the number of split decisions, thus increasing the legitimacy of the Court's decisions.
Further Reading
Galub, Arthur L., and George J. Lankevich. The Rehnquist Court, 1986-1994. Danbury, Conn.: Grolier Educational Corporation, 1995.
Hudson, David L. The Rehnquist Court: Understanding Its Impact and Legacy. Westport, Conn.: Praeger, 2006.
Tushnet, Mark. A Court Divided: The Rehnquist Court and the Future of Constitutional Law. New York: W. W. Norton, 2005.
James W. Stoutenborough
See also Abortion; Conservatism in U.S. politics; Flag burning; Liberalism in U.S. politics; Meritor Savings Bank v. Vinson; O'Connor, Sandra Day; Reagan, Ronald; Religion and spirituality in the United States; Sexual harassment; Supreme Court decisions.

■ Religion and spirituality in Canada Definition
Spiritual belief and practice by Canadians expressed in both formal and informal ways
Through the decade, Canada witnessed a decline in the relative proportion of its "big three" Christian groups, the Catholic, Anglican, and United Churches; growth in the proportion of non-Christian groups and new religious movements (NRMs) such as paganism; and a growing secularization consistent with trends in other industrialized countries.
Canada is a predominantly Christian country with guaranteed religious freedom and a pluralistic nature consistent with its political philosophy. In 1981, Canada was 90 percent Christian; another 7.4 percent were atheist or agnostic or had no religion, and the remaining 2.6 percent were Muslims, Jews, Hindus, Buddhists, Sikhs, or other. Immigration played a major role in increasing the religious diversity of Canada through the decade, with the newcomers holding mostly non-Western, non-Christian belief systems. Catholics, the largest single faith group, exhibited modest growth, largely due to immigration, while non-Christian groups like Muslims, Buddhists, Hindus, and Sikhs showed much greater percentage increases. In 1975, Catholics worshiped in twenty-one languages, members of the United Church in fourteen, Anglicans in twelve, Baptists in thirteen, Lutherans in nine, and Presbyterians in five. Clearly the church played a part in supporting multilingualism and the preservation of ethnic identities.
Canada's religious pulse has been less fundamentalist and moralistic than that of its U.S. neighbor, and sectarianism has not played a prominent role in Canada's religious history. The mainstream Catholic, Anglican, and United Churches are hierarchical, ritualistic, and dedicated to continuity through tradition. Most Canadians were not regular churchgoers in the 1980's, however, and about 12 percent of the population during the decade was classified as nonreligious, the largest share on the west coast. Quebec experienced the most obvious secularization after the Quiet Revolution of the 1960's, yet religious identity remained strong in the province with the triad of French ethnicity, French language, and Catholic religion. Indigenous peoples continued to practice animistic spiritual traditions, Christianity, or a blending of the two.
Impact Unlike U.S. groups, Canadian faith groups after World War II drifted to the left in support of the country's social welfare policies, including guaranteed universal health care. The United Church, created from a merger in 1925 of Methodists, Congregationalists, Brethren, and most Presbyterians, represents an unusually successful ecumenical and cooperative effort among Christians, and this group has been particularly effective in the promotion of civil rights for groups such as indigenous peoples and prison populations. The Canadian Council of Churches, a Christian ecumenical umbrella organization, offers leadership and support for its member denominations.
Further Reading
Hewitt, W. E., ed. The Sociology of Religion: A Canadian Focus. Toronto: Butterworths, 1993. Essays on diverse topics such as new religious movements, nonbelief, religion and multiculturalism, and the influence of religion on national identity. Menendez, Albert J. Church and State in Canada. Amherst, N.Y.: Prometheus Books, 1996. Addresses the impact of religion on politics, law, education, and national identity, drawing comparisons to the United States. “Religion in Canada.” Journal of Canadian Studies 22 (Winter, 1987-1988). A summary of religious patterns supported by official data. Ann M. Legreid See also Immigration to Canada; Minorities in Canada; Religion and spirituality in the United States; Televangelism.
■ Religion and spirituality in the United States Definition
Organized and nonorganized expressions of spiritual belief and practice among Americans
During the 1980's, debates on several questions that had been a part of American religious history for decades became increasingly heated in religious denominations, in politics, and even in the realm of popular culture. The decade was marked by calls for religious pluralism and tolerance for underrepresented groups—especially in ordained ministry—but also by a gradual and powerful shift to the right in terms of social and cultural activism and the promotion of a Judeo-Christian worldview.
These debates were featured not only in internal dialogues among religious and spiritual leaders but also in political debates (including all three presidential elections of the 1980's) and popular culture. The early 1980's saw the growth and development of opportunities for women, minorities, and others who had been traditionally excluded from certain roles within U.S. religious denominations. The Civil Rights and women's movements of the 1960's and 1970's opened doors not only to political and corporate leadership positions but also to leadership roles in religious communities. During the 1970's, mainstream Episcopalian and Lutheran churches had voted to allow the ordination of women. Reform Judaism had ordained its first female rabbi, Sally Priesand, in 1972. During the 1980's, other denominations followed this trend. In 1984, the Reorganized Church of Jesus Christ of Latter-day Saints (now the Community of Christ) began to ordain women. Conservative Judaism began ordaining female rabbis in 1985.
In 1983, the U.S. Conference of Catholic Bishops issued a pastoral letter titled The Hispanic Presence: Challenge and Commitment, which addressed the Catholic Church's relationship with believers of Spanish and Latin American descent. The letter reaffirmed the need for Church ministries that would address this rapidly growing segment of the Catholic population and encouraged young men in this demographic to consider serving in the priesthood.
Pluralism and Ecumenism The changes begun in previous decades and the opening of ordained leadership positions described above were reflections of larger trends of broader participation
within U.S. religious denominations as well as greater participation among the groups. Since the establishment of the National Council of Churches (NCC) in 1950, many denominations had continued to work together to find points of agreement and common areas of interest. One example of this kind of collaboration was the publication in 1989 of the New Revised Standard Version (NRSV) of the Bible. Under the supervision of the NCC, a group of scholars from Protestant denominations joined with Roman Catholic, Eastern Orthodox, and Jewish scholars to produce a version of the Scriptures that included more gender-inclusive language and updated other archaisms. The NRSV was widely endorsed by larger Protestant denominations and is an accepted translation in the Catholic and Orthodox churches.
Though the NCC is a broad alliance of churches representing members of various theological inclinations, it has been criticized by liberal Christians for its exclusion of the Metropolitan Community Church (MCC), an organization that formed during the late 1960's but grew in both membership and national recognition during the 1980's. Throughout the decade, the MCC, with its explicit mission of ministering to gay and lesbian Christians, became a voice for AIDS awareness and in support of church-ordained marriage ceremonies for gays and lesbians. Between 1983 and 1992, the MCC sought membership in the ecumenical body. In 1992, it was denied not only membership but also the opportunity to apply for "observer" status.
The Rise of the Religious Right
The decision by the NCC to deny the MCC’s membership resulted in part from tensions at the other end of the political and theological spectrum. The NCC’s decision to appeal to more conservative members represented the council’s own acknowledgment that conservative Christians were playing a larger role not only in American religious life but also in politics at the local, state, and national levels. The 1980 election of Ronald Reagan (and the defeat of Jimmy Carter, a self-professed born-again Christian) was a political victory delivered in large part by Christian activist groups that had been mobilizing throughout the 1970’s in response to the Supreme Court’s 1973 Roe v. Wade decision and to the proposed adoption of the Equal Rights Amendment (ERA). Singer Anita Bryant’s crusade to repeal a 1977 civil rights ordinance in Florida that included
protections for gays and lesbians is often cited by historians as the beginning not only of the modern struggle for gay and lesbian rights but also of what is sometimes referred to as the New Christian Right or the New Religious Right. Opponents of the ordinance used grassroots organizing strategies quite successfully to persuade voters to vote according to their religious principles.
The 1980's thus saw a dramatic increase in participation in conservative religious organizations. Opponents of the ERA—including Concerned Women for America, founded by Beverly LaHaye, and Eagle Forum, founded by Phyllis Schlafly—helped ensure that the 1982 deadline came and went without the number of state ratifications required to add the amendment to the U.S. Constitution. Other groups that gained status and popularity during the decade were the Moral Majority, founded by Jerry Falwell and others in 1979, and the Christian Coalition, established in 1989 by the Reverend Pat Robertson.
Reagan's election marked an important victory for the Religious Right. The president rewarded his base by appointing activists to national leadership positions. His tenure in office was marked by social and economic policies that derived from his engagement with what a future presidential candidate called the American "culture wars." Reagan often invoked rhetoric associated with Judeo-Christian principles and frequently referred to America as God's "city upon a hill," a reference to John Winthrop's 1630 sermon "A Model of Christian Charity." Reagan proclaimed 1983 the "Year of the Bible," arguing: "Of the many influences that have shaped the United States of America into a distinctive Nation and people, none may be said to be more fundamental and enduring than the Bible." The proclamation ended: "I encourage all citizens, each in his or her own way, to reexamine and rediscover its priceless and timeless message."
Christian conservatives continued to work on social and political issues. Though there has been fragmentation and some division among conservative leadership at the national level, activists remain united in at least one area—their crusade to end the legalization of abortion in the United States. The Reagan years were a period of great political mobilization for conservative Christians. Many activists joined in Robertson's failed 1988 run in the Republican presidential primary. Robertson, an ordained Southern Baptist minister, was not the
only minister in the election. One of the candidates in the Democratic primary was the Reverend Jesse Jackson, a Baptist minister and civil rights activist.
Religion and Popular Culture While the 1980's saw the development of tensions within various religious communities and the rise of religious conservatism in politics, it was also a time of memorable religious expression in the realm of popular culture. Janette Oke, a Christian romance writer, followed her 1979 best-selling novel Love Comes Softly with seven other popular titles in the Love Comes Softly series during the 1980's. Rabbi Harold Kushner published his best-selling book about faith during periods of grief and loss, When Bad Things Happen to Good People, in 1981.
In 1988, Martin Scorsese directed a film adaptation of Nikos Kazantzakis's novel O Teleftaíos Peirasmós (The Last Temptation of Christ, 1951). The film included footage of Jesus' crucifixion and, more controversially, hypothesized about the inner nature of the temptations he experienced during his final hours. Conservative Christians organized protests against the film's nonbiblical portrayal of a conflicted Jesus who imagines, and is tempted by, the idea of a fully human existence.
Another source of debate that involved religion and the media was the release in 1989 of Madonna's Like a Prayer album. The album featured her popular hit "Express Yourself" along with the title song, which compared the ecstasy of religious expression with the ecstatic feelings associated with love and sex. The "Like a Prayer" video included an image of the singer receiving a sort of stigmata (a rendering of the wounds Christ received during the crucifixion) as well as embracing a black man who appears to be a religious figure. The album, the song, and especially the video mobilized Christians in a boycott against the singer and the products she endorsed.
Impact The social and cultural questions that emerged in mainstream American culture during the 1960's and 1970's continued to affect conservative and liberal religious denominations alike during the 1980's. The question of women's ordination opened opportunities in progressive and moderate communities while strengthening the resolve against these kinds of changes in more conservative communities. Lingering related questions about the role
of minorities in the denominations continued to be debated in broader coalitions such as the National Council of Churches. The political winners during the decade were definitely the Christian conservatives, members of the New Religious Right, whose campaigns for national attention culminated in the election of Ronald Reagan in 1980. Their campaigns against the Equal Rights Amendment and the legalization of abortion translated into large, politically powerful organizations that dominated the Republican political agenda both during the 1980’s and beyond. Further Reading
Ammerman, Nancy Tatom. Bible Believers: Fundamentalists in the Modern World. New Brunswick, N.J.: Rutgers University Press, 1987. Ammerman's ethnographic study has inspired many subsequent investigations of religiosity among evangelical Christians.
Gaustad, Edwin, and Leigh Schmidt. The Religious History of America: The Heart of the American Story from Colonial Times to Today. San Francisco: HarperCollins, 2004. An overview of American religious history.
Jorstad, Erling. The New Christian Right, 1981-1988: Prospects for the Post-Reagan Decade. Studies in American Religion 25. Lewiston, N.Y.: Edwin Mellen Press, 1987. Jorstad's study investigates the social and cultural influences behind the rise of the Religious Right.
Noll, Mark A. A History of Christianity in the United States and Canada. Grand Rapids, Mich.: Wm. B. Eerdmans, 1992. An older, but still widely studied, history of Christian events, movements, and leaders.
Jennifer Heller
See also
Abortion; Bakker, Jim and Tammy Faye; Conservatism in U.S. politics; Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988; Evangelical Lutheran Church in America; Falwell, Jerry; Feminism; Grant, Amy; Heritage USA; Hustler Magazine v. Falwell; Last Temptation of Christ, The; Moral Majority; Nation of Yahweh; Religion and spirituality in Canada; Robertson, Pat; Swaggart, Jimmy; Televangelism.
■ R.E.M. Identification American rock band Date Formed in 1980
Despite beginning as a "college radio" group, R.E.M. became one of the seminal bands of the 1980's.
When R.E.M. formed in Athens, Georgia, in 1980, there was little indication of the superstardom the group would achieve. Composed of vocalist Michael Stipe, guitarist Peter Buck, bassist Mike Mills, and drummer Bill Berry, the band was vastly different from the artists that dominated the commercial charts. R.E.M.'s sound was characterized by Buck's jangle-pop guitar, the driving rhythm section of Mills and Berry, and Stipe's cryptic, garbled lyrics.
R.E.M.'s first single, "Radio Free Europe," aired extensively on college radio stations and garnered substantial critical acclaim. After signing with the independent label I.R.S. Records, R.E.M. released the extended play (EP) album Chronic Town in 1982. This was quickly followed by the band's full-length debut Murmur, which Rolling Stone magazine named the Best Album of 1983. In 1984, the band released Reckoning, also considered a critical success. The hypnotic single "So. Central Rain" received considerable airplay on college radio stations, increasing R.E.M.'s cult following. In addition to building their careers, the band members' early success underscored the viability of college radio and helped spotlight scores of alternative artists producing quality music—many of whom sounded very similar to R.E.M.
The band's next albums, 1985's Fables of the Reconstruction and 1986's Lifes Rich Pageant, maintained the core sound that had come to define R.E.M.'s musical style and also expanded the band's fan base.
R.E.M. in 1984. From left: Michael Stipe, Mike Mills, Bill Berry, Peter Buck. (Paul Natkin)
Although each successive album earned the band
more renown, 1987’s Document was deemed R.E.M.’s first mainstream success. Document reached the top ten in large part because of the breakthrough singles “The One I Love” and “It’s the End of the World as We Know It (And I Feel Fine).” Heavy airplay on top 40 radio, coupled with frequent rotation of the videos on MTV, moved R.E.M. closer to superstardom. The band’s most successful album up to that time was also the last it would record with I.R.S. Records. Green, released in 1988, was the first album of R.E.M.’s record deal with Warner Bros. No longer darlings of college radio, R.E.M. moved to the corporate label, signaling major changes for the band. Every single was played on commercial radio and had an accompanying video on MTV. Also, although the band had toured extensively since its formation, the Green tour marked the first time that R.E.M. played in larger stadiums instead of concert halls. R.E.M. returned to the studio in 1989 to begin work on what would be one of their most successful commercial albums, 1990’s Out of Time. Impact Although R.E.M. began the 1980’s in relative obscurity, by the end of the decade the band had laid the groundwork for its continued success and was poised to become one of the greatest rock bands of the later twentieth century. Further Reading
Fletcher, Tony. Remarks Remade: The Story of R.E.M. 2d ed. New York: Omnibus Press, 2002. Rolling Stone. R.E.M., the “Rolling Stone” Files: The Ultimate Compendium of Interviews, Articles, Facts, and Opinions from the Files of “Rolling Stone.” New York: Hyperion, 1995. Matthew Schmitz See also
MTV; Music; Music videos; Pop music.
■ Retton, Mary Lou Identification American gymnast Born January 24, 1968; Fairmont, West Virginia
Retton captured American’s attention by becoming the first American woman to win an Olympic gold medal in gymnastics. In 1981, Mary Lou Retton, age thirteen, competed regionally and nationally in gymnastics while training in Fairmont, West Virginia, a coal mining town.
Mary Lou Retton competes on the balance beam at the 1984 Summer Olympics. (Hulton Archive/Getty Images)
Recognizing Retton's potential, Bela Karolyi, coach of the 1976 Olympic gold medalist Nadia Comaneci, offered to train Retton for free at his gym in Houston, Texas. Only fourteen years old, Retton moved to Texas without her family on January 2, 1983. By the end of the year, she had achieved the number one senior-class ranking in the nation. Although a broken left wrist prevented her from competing in the World Championships that year, she won the American Cup, the U.S. Gymnastics Federation American Classics, and the Chunichi Cup in Japan—the first American woman to do so.
Six weeks before the 1984 Olympics in Los Angeles, doctors determined that Retton would have to undergo arthroscopic surgery to repair damaged cartilage in her knee. After only one day of rest, Retton returned to the gym and worked hard at rehabilitation. She defied her doctor's expectations and recovered in time for the Olympics. In the individual competition, Retton scored a perfect 10 on her floor exercise. She then sealed her all-around gold medal with a perfect 10 on her final apparatus, the vault. Retton performed her second vault, although it was not necessary, scoring another 10. She became the first American female athlete to win a gold medal in gymnastics. She also took home two silver medals, in team and vault, and two bronze medals, for uneven bars and floor exercise. Her five medals were the most won by any athlete at the 1984 Olympics.
Retton, dubbed "America's Sweetheart" by the media, capitalized on her fame with national endorsement offers and appearances. Retton's endorsements for companies such as McDonald's and Vidal Sassoon hair products earned her an estimated $1 million in the year following the Olympics. She also became the first woman to appear on a Wheaties cereal box. She briefly continued to compete in gymnastics, winning her third American Cup in 1985. In 1986, however, she retired from gymnastics to attend the University of Texas. Throughout the 1980's, Retton remained in the public eye working as a sportscaster, including acting as commentator for the National Broadcasting Company (NBC) at the 1988 Olympics in Seoul, South Korea. She also appeared in the film Scrooged (1988). Her numerous accolades during the 1980's included 1984 Sports Illustrated Sportswoman of the Year and 1984 Associated Press Female Amateur Athlete of the Year. In 1985, she became the youngest person ever inducted into the U.S. Olympic Committee's Olympic Hall of Fame.
“Gymnastics: Pathways to the Olympics—The Golden Girls.” Sports Illustrated 88, no. 23 (June 6, 1988): 61-68.
Hoobing, Robert. The 1984 Olympics: Sarajevo and Los Angeles. Washington, D.C.: U.S. Postal Service, 1984. Malana S. Salyer See also
Advertising; Olympic boycotts; Olympic Games of 1984; Sports.
■ Reykjavik Summit The Event
Meeting between U.S. president Ronald Reagan and Soviet general secretary Mikhail Gorbachev over arms control and other issues
Date October 11-12, 1986
Place Höfði House, Reykjavik, Iceland
U.S. and Soviet leaders met at this groundbreaking summit that many participants believed was a key turning point in the Cold War. While in the end the two sides were unable to agree upon final terms for the elimination of nuclear weapons, the negotiations at Reykjavik eventually led to crucial agreements on intermediate-range and strategic nuclear force reductions.
The summit meeting in Reykjavik, Iceland, between U.S. president Ronald Reagan and Soviet general secretary Mikhail Gorbachev on October 11-12, 1986, followed from their first meeting in Geneva, Switzerland, in 1985. Though no arms control agreements were initialed at Geneva, the two leaders of the world's most powerful states did declare that a nuclear war could not be won by either side and that such a war should never be fought.
After President Reagan made an impassioned speech for a nuclear arms reduction accord before the United Nations in September of 1986, General Secretary Gorbachev extended an offer for the two leaders to meet in October, 1986, in either Iceland or the United Kingdom. In the end, Reykjavik was established as the meeting place for what was billed as an informal tête-à-tête. The talks began early on the morning of Saturday, October 11; as agreed, the negotiations were to be wide-ranging, covering four major thematic areas: arms control, regional issues, bilateral issues (such as Jewish and dissident emigration from the Soviet Union), and human rights. However, in the final analysis, most commentators and journalists would report that the primary importance of Reykjavik was
the sweeping deliberations over nuclear arms control issues. Indeed, Gorbachev came to Reykjavik having realized the need to end both the superpower arms race and the ideological conflict with the Western Bloc, as the rapidly declining Soviet economy was in dire need of reform. Gorbachev soon showed his hand, arguing in the opening sessions that the two countries should agree to a 50 percent reduction in strategic nuclear arms, a total elimination of all intermediate-range missiles deployed in Europe, compulsory nonwithdrawal from the Anti-Ballistic Missile (ABM) Treaty for a period of ten years, and a complete ban on the testing of space-based antiballistic defensive weapons, except in laboratories. On October 12, Gorbachev sweetened the deal by proposing to limit all intermediate-range missiles in the Soviet and American arsenals to one hundred. In the final, dramatic hours of the summit, Gorbachev remarked to Reagan that he wanted to rid their countries’ nuclear arsenals of all strategic forces, not merely ballistic missiles. To that, Reagan responded that he would agree to the elimination of all nuclear arms, be they strategic, intermediaterange, or tactical nuclear weaponry. It appeared that a major, far-reaching compact on nuclear disarmament was within sight. Negotiations Break Down
In the final hour, however, the talks collapsed when Gorbachev insisted that Reagan’s Strategic Defense Initiative (SDI)—a space-based, antiballistic defensive weapons system unveiled in March, 1983, in order to make nuclear weapons “impotent and obsolete”—be limited to research and testing in a laboratory setting. The Soviet leadership believed that the SDI program was being developed in order to give the United States a firststrike capability and to take the arms race into outer space, not to provide a protective shield against nuclear attack as the Reagan administration claimed. Reagan would not agree to limiting research, development, and testing of the system within the framework of the ABM Treaty, arguing that the SDI program was the best insurance policy against the Soviet Union reneging on arms reduction commitments. Thus, the failure to find common ground on defensive antiballistic systems caused the summit to end without any agreement on nuclear arms control.
Impact Although the Reykjavik Summit ended without the signing of an arms control treaty, the
meeting was of fundamental importance, as the sweeping negotiations advanced the arms control agenda significantly. Important breakthroughs made at Reykjavik enabled the two leaders to sign an Intermediate-Range Nuclear Forces Treaty the following year at their third summit meeting, in Washington, D.C. This accord was groundbreaking: For the first time ever, an entire class of nuclear weapons was eliminated from U.S. and Soviet arsenals. Likewise, the Reykjavik discussions on strategic nuclear forces eventually culminated in the first Strategic Arms Reduction Treaty (1991), the first agreement signed by the two superpowers that reduced their strategic nuclear arsenals. Another rarely discussed outcome of the Reykjavik Summit—one secured by the United States—was the commitment made by the Soviets to have an ongoing discussion on human rights issues. Perhaps the most significant result of Reykjavik was that the meeting led to a greater level of trust between the two superpowers; indeed, Gorbachev later claimed that Reykjavik was the key turning point in the Cold War, as it was the first time the leaders of the two states met over an extended period of time and talked about all outstanding issues of concern.
Further Reading
Beschloss, Michael R., and Strobe Talbott. At the Highest Levels: The Inside Story of the Cold War. Boston: Little, Brown, 1993. A historian and a journalist team up to examine the causes, consequences, and denouement of the Cold War. Goodby, James E. At the Borderline of Armageddon: How American Presidents Managed the Atom Bomb. Lanham, Md.: Rowman & Littlefield, 2006. As a participant in the arms control negotiations between the United States and the Soviet Union in the early 1980’s, Goodby examines the negotiating positions, strategies, and achievements of past U.S. presidents. Gorbachev, Mikhail S. Memoirs. New York: Doubleday, 1996. In this wide-ranging autobiography, Gorbachev discusses his rise to power, meetings with global leaders, and the fall of the Soviet Union and communism. Shultz, George P. Turmoil and Triumph: My Years as Secretary of State. New York: Charles Scribner’s Sons, 1993. As secretary of state under Reagan, Shultz played a significant role in the five summit meetings held between Reagan and Gorbachev.
In this book, Shultz recounts his role in these negotiations and the opportunities that were opened by them. Thomas E. Rotnem See also Foreign policy of the United States; Intermediate-Range Nuclear Forces (INF) Treaty; Reagan, Ronald; Reagan’s “Evil Empire” speech; Shultz, George P.; Soviet Union and North America; Strategic Defense Initiative (SDI); Weinberger, Caspar.
■ Rice, Jerry Identification American football player Born October 13, 1962; Starkville, Mississippi
Rice rose from obscurity to become one of the greatest wide receivers in football history. Jerry Rice established himself as a college sensation at Mississippi Valley State University, where he was
named an Associated Press (AP) All-American and finished ninth in the Heisman Trophy voting in 1984. Drafted by the San Francisco 49ers, he quickly ascended to the premier receiver spot and remained a dominant force at that position for sixteen seasons in the National Football League (NFL). Each year in the league, Rice set new records for achievement. In 1987, he was named Player of the Year. The 1988 season proved to be one of his best, as he caught 64 passes for 1,306 yards and 9 touchdowns. That season, Rice helped propel the team to a narrow Super Bowl victory over the Cincinnati Bengals, 20-16. His 11 pass receptions for 215 yards and a touchdown set Super Bowl records and earned him the Most Valuable Player (MVP) honor. The next year, Rice helped the 49ers advance once again to the Super Bowl, and his team beat the Denver Broncos handily, 55-10.
San Francisco 49er Jerry Rice makes a reception while being tackled by the Cincinnati Bengals' Lewis Billups, left, and Ray Horton, during the 1989 Super Bowl in Miami, Florida. (AP/Wide World Photos)
Impact Jerry Rice set more records than any other receiver in the history of the game. To date, he owns
the records for most receptions (1,549), most receiving yards (22,895), most touchdown receptions (198), most yards from scrimmage (23,540), and most rushing/receiving touchdowns (208). Experts believe many of these records will never be broken. Subsequent Events In 1994, Rice and the 49ers were Super Bowl champions again. In 2001, he left the 49ers for the Oakland Raiders, and in 2002 he returned to the Super Bowl; although the team lost, Rice became the first player to catch touchdown passes in four Super Bowls. He played with the Raiders until 2004, went to the Seattle Seahawks for a year, had a brief stay with the Denver Broncos, and retired in 2005. In 2006, he signed a one-day contract to end his career as a 49er. Further Reading
Dickey, Glenn. Sports Great Jerry Rice. New York: Enslow, 1993. Rice, Jerry, and Brian Curtis. Go Long! My Journey Beyond the Game and the Fame. New York: Ballantine Books, 2007. Rice, Jerry, and Michael Silver. Rice. New York: St. Martin’s Press, 1996. Stewart, Mark. Jerry Rice. New York: Children’s Press, 1996. David W. Madden See also
African Americans; Football; Sports.
■ Richie, Lionel Identification American singer and songwriter Born June 20, 1949; Tuskegee, Alabama
Richie was the most influential songwriter and singer of the romantic ballad in the 1980’s. Lionel Richie achieved international success during the 1970’s as a lead vocalist, saxophone player, and songwriter for the band the Commodores. The 1978 single “Three Times a Lady,” which Richie composed, reached platinum status by selling more than one million copies, highlighted the artist’s songwriting talents, and put him on the path to a very successful solo career during the 1980’s. In 1980, Richie wrote and produced the song “Lady” for Kenny Rogers and the following year wrote, produced, and recorded the duet “Endless Love” with Diana Ross for the film of the same name,
starring Brooke Shields.
Lionel Richie. (Paul Natkin)
In 1982, under the Motown label, Richie produced his first solo album, Lionel Richie. From that album, the ballad "Truly" topped the pop charts and marked the beginning of a successful solo career. Two other hit songs from the album were the ballads "You Are" and "My Love." Richie's second album, Can't Slow Down (1983), sold more than eight million copies and included the dance hit "All Night Long" as well as the hit songs "Hello," "Stuck on You," and "Penny Lover."
In 1985, Richie and Michael Jackson cowrote the single "We Are the World" to help raise money for famine relief in Africa. Produced by Quincy Jones, the song was recorded on January 28, 1985, at the A&M Recording Studios in Hollywood, California. Forty-five musicians participated in the project—known as United Support of Artists for Africa (USA for Africa)—including Kenny Rogers, Billy Joel, Bob Dylan, and Bruce Springsteen. The song debuted on March 7, and on April 5 more than five thousand radio stations played the song simultaneously. In total, the single sold more than 7.5 million copies, raised
more than $50 million for famine relief, and won three Grammy Awards, including Song of the Year, Record of the Year, and Best Pop Performance by a Duo or a Group. That same year, Richie composed and recorded the ballad “Say You, Say Me” for the movie White Nights, starring Gregory Hines and Mikhail Baryshnikov. The song won Richie an Academy Award for Best Original Song. His third album, Dancing on the Ceiling (1986), proved just as popular as his others and included such well-known songs as “Dancing on the Ceiling” and “Ballerina Girl.” In 1987, the singer wrote and recorded the song “Deep River Woman” with the country band Alabama. Impact During the 1980’s, Lionel Richie’s name became synonymous with the romantic ballad. His songwriting and singing talents led to the recording of numerous award-winning songs. From 1977 to 1985, the popular singer had a song reach number one on the pop charts each year, making him the only performer in music history to do so consecutively for nine years. In total, he won three Grammy Awards, six American Music Awards, one People’s Choice Award, and one Academy Award. Richie enjoyed phenomenal success as a performer in the 1980’s and his romantic lyrics defined romance for a generation of people. Further Reading
Nathan, David. Lionel Richie: An Illustrated Biography. New York: McGraw-Hill, 1985. Richie, Lionel. Lionel Richie Anthology. Milwaukee, Wis.: Hal Leonard, 2004. Bernadette Zbicki Heiney See also Academy Awards; African Americans; Jackson, Michael; Music; Pop music; Shields, Brooke; Springsteen, Bruce; USA for Africa.
■ Richler, Mordecai Identification Canadian author Born January 27, 1931; Montreal, Quebec Died July 3, 2001; Montreal, Quebec
Richler wrote two acclaimed novels and served as a critic of Canadian culture and politics, particularly within Quebec. During the 1980’s, Mordecai Richler published two well-received novels, Joshua Then and Now (1980)
and Solomon Gursky Was Here (1989), as well as the popular children's book Jacob Two-Two and the Dinosaur (1987) and the book for the failed musical adaptation of his earlier novel The Apprenticeship of Duddy Kravitz (1959). Joshua Then and Now was a modest success, but it was Solomon Gursky Was Here, released nine years later, that had the greater literary impact. The picaresque novel about an alcoholic Rhodes scholar who spends his life chasing down leads related to Solomon Gursky, the deceased middle brother of a trio of bootleggers, is considered to be one of Richler's best.
While Richler was popular with Americans—Morton Ritts of Maclean's noted that he was "the one to whom editors of The New York Times and The Atlantic turn when they want a Canadian perspective on this country"—some Canadians took offense at that perspective. After a September 29, 1985, New York Times sports piece on Wayne Gretzky in which Richler described Edmonton in an unflattering light, columnists and city dwellers cried foul, with Canadian publisher Mel Hurtig complaining that "Richler now makes his living knocking Canada."
Perhaps this sentiment was felt most in Richler's native Quebec, where he frequently commented on the Parti Québécois and strained English-French relations in the province. In the essay "Language (and Other) Problems," published in Home Sweet Home (1984), Richler critiqued the province's Charter of the French Language (Bill 101), noting that the government's zealous attempts to eradicate the English language from any signage—going so far as to confiscate fifteen thousand Dunkin' Donuts bags—made Quebec a laughingstock internationally and threatened to destroy Montreal's unique cultural balance. In a summary of the 1980's he wrote for Maclean's, Richler decried the underfunded hospitals and universities in Quebec and voiced his displeasure over the government's concerns about bilingual signage: "Soon otherwise grown men will be out measuring the size of English letters on indoor commercial signs, crying perfidy if they measure a tad over half-size." Richler's voice in the English-French debate in Canada increased in the following decade.
Impact Despite his literary achievements in the decade, many remember Richler in the 1980's for his role as the critical voice of Canada in American and Canadian publications such as The New York Times, GQ, Newsweek, Esquire, and Maclean's. Many of
the essays he wrote during this time can be found in compilations such as Home Sweet Home, Dispatches from the Sporting Life (2002), and Broadsides (1990). Further Reading
Richler, Mordecai. Dispatches from the Sporting Life. Guilford, Conn.: Lyons Press, 2002.
_______. "The Eighties: Decade in a Hurry." Maclean's, December 30, 1989.
_______. Home Sweet Home: My Canadian Album. New York: Alfred A. Knopf, 1984.
Julie Elliott
See also
Children’s literature; Literature in Canada; Meech Lake Accord; Quebec English sign ban; Quebec referendum of 1980.
■ Ride, Sally Identification
Scientist and pioneer in the U.S. space shuttle program
Born May 26, 1951; Encino, California
Ride was the first American—and the world's third—female astronaut to fly in outer space. Only two other women, Soviets Valentina Tereshkova and Svetlana Savitskaya, preceded her.
In 1978, Sally Ride became one of six women in the National Aeronautics and Space Administration's (NASA) eighth astronaut class. There was some initial animosity within the program when women were chosen, but Ride found NASA's attitude toward her and her fellow female trainees straightforward.
Ride was involved in the first three flights of the space shuttle Columbia. Her first assignment was as mission specialist. Her tasks included working as capsule communicator (capcom), the only person on Earth allowed to talk directly to the space shuttle crew during a mission. Ride was the first woman to hold this position. She also participated in the development of the remote manipulator system (RMS), the robotic arm used on the space shuttle.
Ride's rigorous training for space included running four miles a day, playing tennis and volleyball, and weightlessness training. She also received training in all shuttle systems, including mechanical, electrical, and propulsion. NASA believed that it was important that each astronaut learn the tasks of the other crew members, in case of an emergency.
Sally Ride. (National Aeronautics and Space Administration)
On June 18, 1983, Ride became the first American woman in space while aboard the space shuttle Challenger on mission STS-7. One of her jobs during the six-day trip was to deploy two satellites into space using the RMS. She also worked on forty experiments, including studying the effect of weightlessness on carpenter ants. Challenger returned to Earth on June 24, 1983. Ride received a lot of media attention before and after the historic mission. She also received a number of honors and awards and even made a guest appearance on the popular children’s television series Sesame Street. As Ride was preparing for her third flight, the shuttle Challenger exploded on January 28, 1986, killing all seven crew members on board. She was appointed to the Rogers Commission to investigate the explosion. Following the tragedy and her work on the commission, Ride decided that she was not ready
to fly again and retired in 1987, having logged more than 343 hours in space. She took a teaching position at Stanford University, then went on to become a physics professor at the University of California, San Diego. Impact By recruiting six female candidates, NASA demonstrated its commitment to adding women to the space program. Women proved that they could work as efficiently and effectively as their male counterparts. Though Ride did not like being the center of the media’s attention, she understood the importance of being the first American woman in space and how that would affect the future of the space program. Further Reading
Fox, Mary Virginia. Women Astronauts Aboard the Shuttle. New York: Simon & Schuster, 1984. Holden, Henry M. Pioneering Astronaut Sally Ride. Berkeley Heights, N.J.: Enslow, 2004. Orr, Tamara. Sally Ride, the First American Woman in Space. New York: Rosen, 2004. Maryanne Barsotti See also Challenger disaster; Feminism; Science and technology; Space exploration; Space shuttle program.
■ Rivera, Geraldo Identification American broadcast journalist Born July 4, 1943; New York, New York
A controversial but accomplished broadcast journalist during the 1980’s, Rivera became a popular talk show host. Born Gerald Michael Riviera, Geraldo Rivera was raised by his Puerto Rican father and Jewish mother in West Babylon, New York. He earned a law degree from Brooklyn Law School in 1969 and a journalism degree from Columbia University. During the 1970’s, he worked as an investigative reporter for WABC-TV in New York City and hosted the late-night television show Good Night, America. From 1978 to 1985, he worked as a special correspondent for the American Broadcasting Company (ABC) news magazine 20/ 20. His subjective, often opinionated style of reporting made him popular with television audiences. Rivera also won numerous awards for his reporting and became a role model for the Latino community.
In 1986, Rivera hosted the documentary The Mystery of Al Capone’s Vault, a live television broadcast in which he opened what was purported to be Al Capone’s vault and found nothing. Although Rivera was embarrassed by the outcome, his popularity soared. Under contract with Tribune Entertainment, Rivera filmed seven additional documentaries during the 1980’s, including American Vice: The Doping of America (1986), Innocence Lost: The Erosion of American Childhood (1987), Sons of Scarface: The New Mafia (1987), and Murder: Live from Death Row (1988). In 1987, Rivera began hosting and producing the syndicated daytime talk show Geraldo. The show, which remained on the air until 1998, often featured controversial guests and tabloid theatrics. One of his most famous episodes involved neo-Nazi skinheads and black and Jewish activists and ended in an on-air brawl, with Rivera receiving a broken nose. The talk show continued to provide Rivera with a forum to share his personal opinions with the American public. The show also joined a growing roster of other talk shows, such as The Jerry Springer Show and Sally Jessy Raphael, which became known in the television industry as “trash TV.” The controversial and extreme nature of Rivera’s talk show, however, earned him the reputation as the “King of Tabloid TV.” Impact During the 1980’s, Rivera became a popular investigative reporter and broadcast journalist known mostly for his unorthodox style of subjective reporting. While he won numerous awards for his work and became a role model for Latinos, many of his professional peers were critical of his opinionated reporting style. As his popularity grew with the American public, Rivera catered to this fame by moving away from legitimate reporting and toward entertainment reporting through a series of investigative documentaries and, eventually, to his successful talk show. In the late 1990’s, Rivera returned to reporting as a war correspondent for FOX News. Further Reading
Langer, John. Tabloid Television: Popular Journalism and the "Other News." New York: Routledge, 1997.
Rivera, Geraldo. Exposing Myself. New York: Bantam Books, 1991.
Bernadette Zbicki Heiney

See also Cable television; Journalism; Talk shows; Television.
■ Roberts v. United States Jaycees
Identification Supreme Court decision
Date Decided on July 3, 1984
The Supreme Court determined that the application of the Minnesota Human Rights Act, which prohibited gender discrimination in many public contexts, to the Junior Chamber of Commerce did not violate the group's right to freedom of association.

During the 1970's, the Minnesota Junior Chamber of Commerce (Jaycees) began to admit women as members, in violation of the national Jaycees' bylaws. When the national organization threatened to revoke the charter of the state organization, the Minnesota Jaycees complained to state authorities that the national organization's actions were in violation of the Minnesota Human Rights Act, which prohibited gender discrimination in places of public accommodation. When state authorities, including the Minnesota Supreme Court, agreed with this conclusion, the U.S. Jaycees appealed to the U.S. Supreme Court, arguing that the state's actions infringed on its freedom of association.

Associate Justice William J. Brennan delivered the opinion of the Court in Roberts v. United States Jaycees. (Library of Congress)

The Court, in an opinion written by Justice William J. Brennan, distinguished between two forms of freedom of association—one centering on intimate activities such as marriage and procreation, and the other finding its roots in freedom of expression. The Court concluded that the freedom of association claimed by the Jaycees was expressive in nature. The Court had previously recognized that freedom of expression included a freedom to associate with those with whom one shared common expressive purposes. This freedom, however, was not absolute. According to the Court's opinion, it could be infringed if the government had a sufficiently compelling purpose for doing so, unrelated to the suppression of speech, that could not be accomplished in a manner less restrictive of freedom of association. The Court concluded that the state's interest in prohibiting gender discrimination was sufficiently compelling to justify overriding the Jaycees' freedom of association and that the state had no less restrictive means of accomplishing this end. Justices William H. Rehnquist and Sandra Day O'Connor concurred in this result. Chief Justice Warren E. Burger and Justice Harry A. Blackmun did not participate in the case. In a separate concurring opinion, Justice O'Connor argued that the Jaycees were more a commercial form of association than an expressive one and that their rights to freedom of association could be more readily regulated than if their identity were less commercial in nature.

Impact In the last quarter of the twentieth century, the U.S. Supreme Court had several occasions to resolve conflicts between efforts to prevent certain forms of discrimination against individuals and constitutionally protected rights such as freedom of religion, speech, and association. The Court's decisions in this area generally shielded groups whose identities were closely associated with the exercise of these freedoms from the effect of antidiscrimination laws. It permitted, however, the enforcement of antidiscrimination laws against groups that could be characterized—as Justice O'Connor did in this case—as more commercial in nature.
Further Reading
Rosenblum, Nancy L. Membership and Morals: The Personal Uses of Pluralism in America. Rev. ed. Princeton, N.J.: Princeton University Press, 2000.
Warren, Mark E. Democracy and Association. Princeton, N.J.: Princeton University Press, 2001.
Timothy L. Hall

See also Feminism; Glass ceiling; Supreme Court decisions; Women's rights.
■ Robertson, Pat
Identification American television personality and conservative political candidate and activist
Born March 22, 1930; Lexington, Virginia

Using his prominence as the founder of the first Christian television network and the host of a conservative Christian talk show, Robertson led many conservative Christians to become involved in the political process during the 1980's.

Pat Robertson came to prominence as a television personality in the United States in the 1970's, the decade after he founded the Christian Broadcasting Network (CBN) and began hosting its most influential television program, The 700 Club, a Christian talk show. In the 1980's, Robertson, along with many other conservative Christians, turned his attention to politics. In his book America's Dates with Destiny (1986), he argued that America had drifted from the Christian and moral values that had animated its founding. Two years later, Robertson made an unsuccessful attempt to capture the Republican nomination for president of the United States. He lost to George H. W. Bush, who won the Republican primary and went on to win the presidential election of 1988. Undeterred by the loss, Robertson thereafter turned his attention to grassroots political action by founding the Christian Coalition in 1989. The purpose of this organization, he explained, was "to mobilize Christians—one precinct at a time, one community at a time—until once again we are the head and not the tail, and at the top rather than the bottom of our political system."

Pat Robertson celebrates placing second in the Iowa caucuses, ahead of Vice President George H. W. Bush, on February 9, 1988. (AP/Wide World Photos)

The Christian Coalition remained active in American politics during the following decade. Robertson himself continued to use his television prominence to comment on American political and cultural affairs; however, by the beginning of the twenty-first century the influence of conservative Christians such as Robertson in the American political process appeared to have waned somewhat from its height in the 1980's.

Impact During the first three quarters of the twentieth century, conservative Christians tended to avoid interaction with the American political process. In the last quarter of the century, however, they reemerged on the national political stage. Along with televangelist Jerry Falwell, Pat Robertson was a
key architect of conservative Christian activism. This activism was at its height during the 1980's and continued into the twenty-first century. Robertson never claimed the political prize to which he aspired, the presidency of the United States, but he had a profound impact on the American political process by persuading many conservative believers that they should be active participants in this process.

Further Reading
Boston, Rob. The Most Dangerous Man in America? Pat Robertson and the Rise of the Christian Coalition. Amherst, N.Y.: Prometheus Books, 1996.
Harrell, David Edwin. Pat Robertson: A Personal, Religious, and Political Portrait. San Francisco: Harper & Row, 1987.
Robertson, Pat, with Jamie Buckingham. The Autobiography of Pat Robertson: Shout It from the Housetops! Rev. ed. South Plainfield, N.J.: Bridge, 1995.
Timothy L. Hall

See also Conservatism in U.S. politics; Elections in the United States, 1988; Falwell, Jerry; Moral Majority; Religion and spirituality in the United States; Televangelism.
■ RoboCop
Identification Science-fiction action film
Director Paul Verhoeven (b. 1938)
Date Released July 17, 1987
RoboCop, a violent social satire on big-business capitalism, features a cyborg police officer seeking revenge while trying to regain his own humanity in gritty, near-future Detroit.

At the time of its release, Orion Pictures' RoboCop was one of the most graphically violent American films ever to make it to theaters. The Motion Picture Association of America originally gave the controversial film an X rating, but Dutch director Paul Verhoeven trimmed enough graphic violence to earn it a strong R rating. The film was a box-office success, grossing almost $54 million in the United States.

The film is set in a futuristic, dystopian Detroit where crime, drug abuse, and unemployment are pervasive. The city employs the megacorporation Omni Consumer Products (OCP) to take over the police department, replacing traditional patrolmen
and methods with new, high-technology weaponry to cleanse and eradicate "Old Detroit" and to create a utopia called "Delta City." Police officer Alex J. Murphy (Peter Weller) transfers to an exceptionally dangerous precinct in Old Detroit and is viciously killed by a gang of thugs on his first assignment. Although Murphy is declared dead, OCP finds him to be a "prime candidate" for its experiments, and it fuses his body with cybernetics to create the cyborg RoboCop. RoboCop inflicts swift justice on the lawbreakers of Detroit, and crime is brought nearly to a standstill. Soon, however, RoboCop begins dreaming of his former life and discovers that his killers are working for the president of OCP. He ultimately turns against both his murderers and his creators, killing them all, and regains a semblance of his former identity.

Verhoeven brought to RoboCop a European sensibility previously unseen in American action films. RoboCop is not only an extremely violent film but also an intelligent social satire, aimed especially at the overarching economic policies of the time, known as Reaganomics, which critics blamed for the recession of the early 1980's and for a drastic reduction in social services. The villains in the film, members and employees of OCP, represent big business, and the downtrodden denizens of Old Detroit clearly represent the middle and lower classes. RoboCop acts as a champion for the common people, overthrowing the evil empire of OCP, and as a liaison between technology and humankind. A current of criticism also runs through the film regarding new technology's tendency to dehumanize individuals and erode conventional social structures, of which RoboCop himself is a prime example.

Impact RoboCop raised the bar for shocking and graphic violence in American cinema, launched Verhoeven's career as an American director, and spawned two less successful and poorly received sequels, neither directed by Verhoeven: RoboCop 2 (1990) and RoboCop 3 (1993).

Further Reading
Duncan, Paul, and Douglas Keesey. Paul Verhoeven. Los Angeles: Taschen, 2005.
Van Scheers, Rob. Paul Verhoeven. Boston: Faber & Faber, 1997.
Alan C. Haslam

See also Action films; Blade Runner; Consumerism; Cyberpunk literature; Film in the United States;
Income and wages in the United States; Information age; Reaganomics; Science-fiction films; Special effects; Unemployment in the United States.
■ Robots
Definition Machines devised to carry out a series of actions automatically or with minimum external impulse

The advancements made with robots during the 1980's led to the development of robots that could replace humans in performing many routine and dangerous jobs with speed and precision.

By the late 1970's, new technologies were emerging that made the production of personal mobile robots feasible. The development of computer processing technology and storage in the 1980's was the vital link needed to implement simple and more complex actions in robots. The processing power necessary to develop artificial intelligence became available in smaller packages. Newly developed software interpreted signals from the sensors that controlled where a robot was going and what it was doing. Through this software, robots could interact to some extent with humans in a reasonable, predictable manner.

In 1983, a robot dubbed TOPO-I was released by Androbot. TOPO-I was designed as a mobile extension of the home computer. Standing three feet tall, it could be moved with a joystick or under software control. Improved versions of TOPO-I in the 1980's included a bidirectional infrared link to the host computer, some speech support, and an onboard vacuum cleaner. The Heathkit HERO robot of the mid-1980's contained sensors for sound, visible and infrared light, and motion detection. It utilized an 8-bit 6808 processor. A significant improvement in the HERO series was the HERO 2000, a multitasking robot that had a robotic arm and utilized eleven 8-bit peripheral microprocessors. Each microprocessor performed separate tasks and activities that were all coordinated by the main processor. The breakthrough robot of the 1980's was the Gemini. It stood four feet tall and weighed seventy pounds. Three onboard computers, all CMOS chip devices, provided artificial intelligence, voice and speech synthesis, and propulsion control. The robot could respond to voice commands and be navigated by using infrared beams. During the 1980's, the public perception of robots was heightened in numerous television episodes and films, including the Star Trek and Star Wars series, The Terminator (1984), and RoboCop (1987).

Impact Personal mobile robots built in the 1980's became the prototypes for the evolution of robots in the 1990's and into the twenty-first century. During the 1980's, the evolution of computer processing technology led to mobile robots that possessed artificial intelligence. Mobile robots can free workers from dirty, heavy, dangerous, and mundane labor. The implementation of robots by industrial companies provides improved management control, better productivity, and consistent production of high-quality products. The military uses robots to clear mine fields and survey battlefields. About a dozen small robots were used as part of the rescue and recovery operations after the 2001 World Trade Center disaster. Personal mobile robots are being designed to provide entertainment, education, communications, security, and other useful functions.
Further Reading
Gutkind, Lee. Almost Human: Making Robots Think. New York: W. W. Norton, 2006.
Patnaik, Srikanta. Innovations in Robot Mobility and Control. New York: Springer, 2006.
Spong, Mark W., and Mathukumalli Vidyasagar. Robot Modeling and Control. Hoboken, N.J.: John Wiley & Sons, 2006.
Alvin K. Benson

See also Apple Computer; Computers; Inventions; RoboCop; Science and technology; Science-fiction films; Space exploration.
■ Rock and Roll Hall of Fame
Definition A repository dedicated to the rich heritage of rock and roll
Date Created in 1983

The establishment of this institution exemplified the trend toward the preservation of heritage that marked the 1980's.

In 1983, Ahmet Ertegün, cofounder of Atlantic Records, decided to establish an organization that would recognize those who had created rock-and-roll music and those who had propelled it to the
height of popularity. After meeting with attorney Suzan Evans about the concept, Ertegün brought together a group of music industry professionals including Rolling Stone publisher Jann Wenner, attorney Allen Grubman, and a handful of record executives. After numerous discussions, a nominating committee was formed to select inductees into the Rock and Roll Hall of Fame. The criteria established by the group included three categories for induction: performer, nonperformer (including producers, journalists, and music industry executives), and early influences. Once the criteria were established, the search began for a home for a major museum that would include a library, archives, educational facilities, a performance venue, and a permanent museum collection of rock-and-roll memorabilia.

The Rock and Roll Hall of Fame, designed by I. M. Pei. (Jason Pratt/cc-by-a-2.0)

The first location was to be a brownstone in New York City, but
other cities began to submit requests to be considered for the honor. Philadelphia, New Orleans, San Francisco, Memphis, Chicago, and Cleveland all made offers to the Rock and Roll Hall of Fame Foundation. The first inductees were honored at a dinner at the Waldorf-Astoria Hotel in New York City on January 23, 1986. The inaugural class of the Rock and Roll Hall of Fame performers included Chuck Berry, James Brown, Ray Charles, Sam Cooke, Fats Domino, the Everly Brothers, Buddy Holly, Jerry Lee Lewis, Elvis Presley, and Little Richard. Robert Johnson, Jimmie Rodgers, and Jimmy Yancey were honored in the early influences category, and the first inductees in the nonperformer category were legendary producer Sam Phillips and disc jockey Alan Freed, who was credited with first using the term “rock and roll.” On May 5, 1986, the foundation announced that
Cleveland, Ohio, had been selected as the permanent home for the Rock and Roll Hall of Fame and Museum. Once the site had been selected, an exhaustive search was held for a designer, and world-renowned architect I. M. Pei was chosen. He created a building that reflected the energy of rock and roll. The hallmark of the building was a dramatic triangular glass "tent" that served as the main entrance to the museum. The groundbreaking ceremonies for the museum took place on June 7, 1993. The Rock and Roll Hall of Fame opened its doors on September 2, 1995, with artifacts of John Lennon donated by Yoko Ono serving as one of the primary collections.

1980's Inductees into the Rock and Roll Hall of Fame

1986
Performers: Chuck Berry; James Brown; Ray Charles; Sam Cooke; Fats Domino; The Everly Brothers; Buddy Holly; Jerry Lee Lewis; Elvis Presley; Little Richard
Early Influences: Robert Johnson; Jimmie Rodgers; Jimmy Yancey
Lifetime Achievement: John Hammond
Nonperformers: Alan Freed; Sam Phillips

1987
Performers: The Coasters; Eddie Cochran; Bo Diddley; Aretha Franklin; Marvin Gaye; Bill Haley; B. B. King; Clyde McPhatter; Ricky Nelson; Roy Orbison; Carl Perkins; Smokey Robinson; Big Joe Turner; Muddy Waters; Jackie Wilson
Early Influences: Louis Jordan; T-Bone Walker; Hank Williams
Nonperformers: Leonard Chess; Ahmet Ertegun; Jerry Leiber and Mike Stoller; Jerry Wexler

1988
Performers: The Beach Boys; The Beatles; The Drifters; Bob Dylan; The Supremes
Early Influences: Woody Guthrie; Lead Belly; Les Paul
Nonperformer: Berry Gordy, Jr.

1989
Performers: Dion; Otis Redding; The Rolling Stones; The Temptations; Stevie Wonder
Early Influences: The Ink Spots; Bessie Smith; The Soul Stirrers
Nonperformer: Phil Spector
Impact The Rock and Roll Hall of Fame Foundation created an enormously successful location for the preservation, commemoration, and promotion of rock and roll. It provides a place where the narrative of rock and roll's history can be told.

Further Reading
Juchartz, Larry, and Christy Rishoi. "Rock Collection: History and Ideology at the Rock and Roll Hall of Fame." Review of Education, Pedagogy, and Cultural Studies 19, nos. 2/3 (May, 1997): 311-332.
Manzel, Kevin. "Cleveland's New Museum Celebrates Rock and Roll." Historian 58, no. 1 (Autumn, 1995): 29.
Amanda Bahr-Evola

See also Architecture; Lennon, John; Music; Pei, I. M.; Pop music; Women in rock music.
■ Rose, Pete
Identification Major League Baseball player and manager
Born April 14, 1941; Cincinnati, Ohio

In 1985, Pete Rose broke Major League Baseball's career record for hits, but four years later he was banned from baseball for life for gambling.

The 1980's were a decade of triumph and tragedy for baseball great Pete Rose. As a member of the Philadelphia Phillies, he led the National League in hits in 1981 and played in the World Series in 1980 and 1983. The Phillies won the series in 1980, marking the third time in his career that Rose had played on a world championship team. As the decade opened, Rose, at the age of thirty-nine, began an assault on one of baseball's most hallowed marks: the career hit record held by National Baseball Hall of Famer Ty Cobb. Rose broke the National League career hit record, held by Stan Musial, in 1981. Playing for the Montreal Expos in 1984, Rose recorded his four thousandth career hit. Later that season,
he returned to his beloved Cincinnati Reds, with whom he had begun his Major League Baseball career, as a player-manager. Rose topped Cobb's mark of 4,191 hits on September 11, 1985, in Cincinnati. He retired as a player after the 1986 season with a total of 4,256 hits and a .303 career batting average. He seemed a sure bet for baseball's hall of fame.

After his playing career ended, Rose encountered problems as manager of the Reds. He did not bring a pennant to Cincinnati; in his first three full seasons as manager, the team finished in second place. During the 1988 season, Rose was suspended for thirty days after an argument and shoving match with an umpire. Even more serious problems surfaced during the next season. Rumors circulated that Rose had a significant gambling problem, that he had incurred substantial illegal gambling debts, and that he had even bet on baseball. Ever since the great World Series betting scandal of 1919, Major League Baseball had maintained a zero-tolerance policy toward illegal gambling. Rose denied the allegations and threatened to sue Major League Baseball for besmirching his character. In late August, 1989, baseball commissioner Bart Giamatti confronted Rose about his gambling issues. Giamatti agreed not to publicize the evidence that his office had gathered about Rose's gambling, provided that Rose drop his lawsuit and accept a lifetime ban from
Major League Baseball—meaning that Rose could never again play or manage and that he would not be eligible for membership in the National Baseball Hall of Fame. Although he claimed not to have gambled illegally, Rose agreed to the commissioner's deal and left the game that he loved.

Montreal Expo Pete Rose hits his four thousandth career hit, a double, while playing against the Philadelphia Phillies on April 13, 1984. (AP/Wide World Photos)

Impact The gambling allegations that surrounded Rose at the end of his career damaged the reputation of one of baseball's best and most colorful players. Baseball fans and players continue to debate whether Rose's lifetime ban from Major League Baseball—and from the National Baseball Hall of Fame—should continue. A week after imposing the ban, Giamatti died, but his successors have kept the lifetime ban on Rose in place.

Further Reading
Rose, Pete, and Rick Hill. My Prison Without Bars. Emmaus, Pa.: Rodale Press, 2004.
Shatzkin, Mike, ed. The Ballplayers. New York: Arbor House, 1990.
James Tackach

See also Baseball; Baseball strike of 1981; Sports.
■ Run-D.M.C.
Identification African American rap/hip-hop group
Date First single released in 1983

In addition to bringing hip-hop into the cultural mainstream, Run-D.M.C. is responsible for one of the most successful rap-rock crossover songs of the 1980's.

Rap/hip-hop juggernaut Run-D.M.C.'s lyricists, Joseph "Run" Simmons and Darryl "D.M.C." McDaniels, met while growing up in Hollis, Queens, and started performing together in high school during the late 1970's. Following the addition several years later of turntable wizard Jam Master Jay, a Brooklyn-born Hollis resident, Run-D.M.C. began recording. After signing with rap and hip-hop label Profile Records, Run-D.M.C. released a self-titled debut album in early 1984. The breakthrough single "It's Like That" helped propel sales of the album, which was certified gold. Although groups such as the Sugarhill Gang had pioneered rap, Run-D.M.C.'s
unique, interlaced lyrical style helped define the growing genre and established key vocal and musical components that became the backbone of early rap and hip-hop. Other groups and solo artists quickly followed, but Run-D.M.C. undoubtedly set the bar in the early 1980's. The group also separated itself from its contemporaries visually, with each member dressed in a black fedora hat, laceless white Adidas shoes, and a leather jacket—an ensemble that would quickly become the group's trademark wardrobe.

Less than a year after Run-D.M.C., the group's sophomore effort, King of Rock, was released, charting even higher and selling more copies than its predecessor. The trio was enjoying moderate success both on radio and on the fledgling cable network MTV, exposure that helped increase Run-D.M.C.'s fan base. That success paled, however, beside the superstardom the group would experience upon the release of its third album, 1986's triple-platinum Raising Hell. In addition to the popular singles "It's Tricky" and "My Adidas," the album's—and the group's—most successful hit was "Walk This Way," originally recorded during the 1970's by the veteran rock band Aerosmith. This was more than just a traditional cover song, however; Aerosmith's lead singer Steven Tyler and guitarist Joe Perry collaborated on the track and music video with Run-D.M.C., creating the first major rap-rock crossover. The success of the song was unparalleled; not only did it receive significant radio airplay, but the accompanying music video was also in heavy rotation on MTV. The album gave the group its highest commercial success to date, reaching number one on the Billboard Top R&B/Hip-Hop Albums chart.

While it seemed impossible to match the colossal success of Raising Hell, Run-D.M.C.'s fourth album, 1988's Tougher than Leather, was issued to critical and commercial acclaim. The album, benefiting from the hit single "Mary, Mary," eventually reached number two on the Billboard Top R&B/Hip-Hop Albums chart and was certified platinum. The decade ended with the group in the studio, recording 1990's Back from Hell.
Although the group’s members would achieve success in later decades both collectively and individually, Run-D.M.C.’s accomplishments during the 1980’s cemented the group’s status as rap/hip-
The Eighties in America
Ryan, Nolan
hop icons. The group’s popularity also brought rap and hip-hop into the cultural mainstream, one of Run-D.M.C.’s greatest accomplishments. Further Reading
McDaniels, Darryl, with Bruce Haring. King of Rock: Respect, Responsibility, and My Life with Run-D.M.C. New York: St. Martin's Press, 2001.
Ro, Ronin. Raising Hell: The Reign, Ruin, and Redemption of Run-D.M.C. and Jam Master Jay. New York: Amistad, 2005.
Matthew Schmitz

See also African Americans; Hip-hop and rap; MTV; Music; Music videos; Public Enemy.
■ Ryan, Nolan
Identification Professional baseball player
Born January 31, 1947; Refugio, Texas
Because of his record-breaking accomplishments as a baseball player, Ryan is recognized as one of the greatest pitchers in the history of Major League Baseball.

After graduating from high school in 1965, Nolan Ryan signed a contract to pitch for the New York Mets. After pitching for the Mets from 1966 to 1971, he played for the California Angels from 1972 to 1979, pitching two no-hitters in 1973, a third in 1974, and a fourth in 1975. Ryan became the first professional baseball player to sign a contract paying one million dollars per season when he joined the Houston Astros in 1980. Ryan, whose fastballs often exceeded 100 miles per hour, was dubbed the "Ryan Express" by the media and other players. In 1981, Ryan led the National League with a 1.69 earned run average, and he set the Major League Baseball (MLB) record for no-hitters when he pitched his fifth on September 26, 1981. He registered his second career postseason win with a victory over the Los Angeles
Dodgers and Fernando Valenzuela in the National League division series. On April 27, 1983, Ryan established the MLB career record for strikeouts when he recorded number 3,509, surpassing the mark long held by Walter Johnson. After battling injuries during two frustrating seasons in 1984 and 1985, Ryan returned to top form in 1986 and struck out 194 batters in 178 innings. He earned his 250th career victory on August 27, 1986. Although he had a poor 8-16 record in 1987, he struck out 270 hitters in 212 innings and became the only pitcher in MLB history to register 2,000 strikeouts in each of the American and National Leagues. At the end of the 1988 season, Ryan signed a contract to play for the Texas Rangers. On June 3, 1989, he pitched his eleventh career one-hitter. He became the only pitcher in MLB history with 5,000 strikeouts when he struck out Rickey Henderson on August 22, 1989. During the 1989 campaign, he recorded 301 strikeouts, which was the sixth time that he had more than 300 strikeouts in one season. While playing for the Rangers, Ryan pitched his sixth no-hitter in 1990 and recorded number seven in 1991.
Houston Astro Nolan Ryan pitches against the Chicago Cubs on August 23, 1984. Ryan struck out twelve batters during the game. (AP/Wide World Photos)
Impact After twenty-seven years as a major-league pitcher, Ryan retired in 1993 with an amazing fifty-three MLB records, including 5,714 strikeouts and seven no-hitters. He was selected as a Major League All-Star eight times. He posted a 324-292 win-loss record. His jersey number was retired by three teams—the Astros, Angels, and Rangers. In 1999, Ryan was elected to the National Baseball Hall of Fame, receiving the second-highest percentage of votes in history after Tom Seaver. That same year, Sporting News placed him on its list of the 100 Greatest Baseball Players. In 2003, he was inducted into the Hall of Fame of the Texas Rangers.
Further Reading
Anderson, Ken. Nolan Ryan: Texas Fastball to Cooperstown. Austin, Tex.: Eakin Press, 1999.
Kramer, Sydelle. Baseball's Greatest Pitchers. New York: Random House, 1992.
Ryan, Nolan, and Jerry B. Jenkins. Miracle Man: Nolan Ryan, the Autobiography. Dallas: Word Publishing Group, 2004.
Alvin K. Benson

See also Baseball; Brett, George; Gibson, Kirk; Hershiser, Orel; Rose, Pete; Sports; Valenzuela, Fernando.
S

■ St. Elsewhere
Identification Television drama series
Date Aired from October 26, 1982, to May 25, 1988

Unlike some previous medical shows that centered on patients, St. Elsewhere focused on doctors and nurses, who were portrayed as flawed and fallible. One of the major ensemble dramas of the decade, the show often emphasized the medical professionals' career and personal problems and portrayed the workplace as a surrogate family.

Much like Hill Street Blues (1981-1987), St. Elsewhere was an ensemble drama with a large cast. There were often four story lines in an episode, and some plots and subplots continued through several episodes. The show treated medical ailments not generally discussed on television or even in "polite" society, such as impotence and addiction. In December, 1983, it became the first prime-time drama to focus on an AIDS patient. The primary characters, each of whom functioned in various ways as a role model, were three veteran physicians: Dr. Donald Westphall, played by Ed Flanders; Dr. Mark Craig, played by William Daniels; and Dr. Daniel Auschlander, played by Norman Lloyd. The show featured a dozen more central characters, including nurses, first- and second-year residents, and other hospital staff. Additional characters for each episode, some of whom were recurring, were often played by actors celebrated for their work in film and television.

St. Elsewhere launched the careers of many major television and film actors and writers. Actors Mark Harmon, Howie Mandel, and Alfre Woodard went on to success in film and television, and both Denzel Washington and Helen Hunt won Academy Awards. Viewers were attracted to the realism of the show, set at a deteriorating hospital, St. Eligius, in Boston. Compelling plots often explored ethical dilemmas. The series was also characterized by dark humor that mixed the real with the surreal—and by a series of in-jokes and puns. In 1993, the editors
of TV Guide named St. Elsewhere the best drama of all time.

Impact St. Elsewhere had the good fortune to be produced at a time when networks were beginning to care as much about the demographics of their viewership as they did about sheer numbers. The advent of cable television and narrowcasting, in addition to more sophisticated audience analysis techniques, made targeted audiences with expensive tastes and disposable income desirable. The series had dismal ratings: It finished its first season ranked eighty-sixth out of ninety-eight prime-time shows. However, critics recognized the superior quality of its writing and acting, and it was renewed for a second season because its viewership was composed disproportionately of yuppies. In addition to its quality, St. Elsewhere's focus on the personal problems of the hospital staff appealed to the so-called me generation. Over its six years, the show won thirteen Emmy Awards but never reached higher than forty-ninth place out of about one hundred shows in the Nielsen ratings; it did, however, make a lot of money for the National Broadcasting Company (NBC). Advertisers seeking to reach wealthy baby boomers paid top dollar to air commercials during St. Elsewhere's time slot.

The final episode of St. Elsewhere, titled "The Last One," portrayed the entire six-year series as a fantasy, existing only in the imagination of an autistic child. This playful vision of television "reality" made media history.

Further Reading
Thompson, Robert J. Television's Second Golden Age. New York: Continuum, 1996.
Turow, Joseph. Playing Doctor: Television, Storytelling, and Medical Power. New York: Oxford University Press, 1989.
Marcia B. Dinneen

See also AIDS epidemic; Hill Street Blues; Television; Yuppies.
■ San Ysidro McDonald's massacre
The Event A seventy-seven-minute shooting rampage at a fast-food restaurant leaves twenty-one people dead
Date July 18, 1984
Place San Ysidro, California

Because of the high death toll, the familiarity of McDonald's, and the random nature of the attack—the killer had no grudge against the franchise or any of the customers, made no demands, and espoused no agenda—the San Ysidro McDonald's massacre stunned the nation, as it suggested a new kind of vulnerability.
A single gunman—a forty-one-year-old former security guard named James Oliver Huberty—killed twenty-one people and wounded nineteen others during his rampage at a McDonald's restaurant. The dead and wounded, Wednesday-afternoon patrons of the restaurant, included men, women, and children. Huberty, who had graduated with a degree in sociology from an Ohio Quaker college and worked as a welder for fourteen years, had drifted through various menial jobs before arriving in early 1984 in San Ysidro, California, two miles from the Mexican border. He worked as a condominium security guard but was fired ten days before the shootings. Concerned about his own depression and mood swings, Huberty contacted a mental health clinic but never received a call back.

On the morning of July 18, Huberty settled a minor traffic ticket. He took his wife and two children to a different McDonald's (they frequently ate there) and then to the San Diego Zoo. They left early because of the heat. After returning home, he casually informed his wife, "I'm going to hunt humans." He drove to the nearby restaurant, arriving around 4:00 p.m., with a nine-millimeter Browning automatic pistol in his belt and a twelve-gauge Winchester shotgun and a nine-millimeter semiautomatic Uzi carbine across his shoulders. Once inside, he ordered the stunned patrons, mostly Hispanic, to get down on the floor and began executing them, showering the restaurant with indiscriminate gunfire. When the police arrived, they assumed that there were several shooters (Huberty fired more than 250 rounds). Shortly after 5:00 p.m., an employee escaped out a back door and informed the special weapons and tactics (SWAT) commandos that there was only one shooter and no hostages. Everyone else was either wounded or dead. The SWAT team reacted quickly, and sharpshooter Chuck Foster killed Huberty with a single chest shot.

A blood-spattered woman is led away from the scene of the San Ysidro McDonald's massacre on July 18, 1984. (AP/Wide World Photos)

Impact In the wake of the massacre, McDonald's razed the building and gave the land to the city, which built a community college on the site after erecting a memorial. Investigators never fully accounted for the rampage. Huberty's widow filed a lawsuit against both McDonald's and the Ohio factory where Huberty had welded, claiming that monosodium glutamate in the food and airborne toxins at the factory had slowly poisoned him. Forensic pathologists, however, suggested Huberty might have been a paranoid schizophrenic.
The shootings were unprecedented in the United States; the closest parallel had been the 1966 University of Texas clock tower shootings. As a result of the incident, police agencies reconsidered their policy of using violence only as a last resort in hostage situations. California politicians launched early (and largely unsuccessful) attempts to ban assault rifles. McDonald's set a standard for corporations victimized by random crime: It settled victims' injury claims, covered funeral costs, and provided counseling. Mental health facilities reevaluated overworked clinics, and forensic psychologists examined Huberty's antisocial behavior and his wife's failure to respond to his chilling comment before leaving for the restaurant.
Fox, James Alan, and Jack Levin. Extreme Killing: Understanding Serial and Mass Murderers. Thousand Oaks, Calif.: Sage, 2005.
Ramsland, Katherine. Inside the Minds of Mass Murderers. Westport, Conn.: Greenwood Press, 2005.
Joseph Dewey

See also Atlanta child murders; Crime; Goetz, Bernhard; Lucas, Henry Lee; Night Stalker case; Post office shootings; Tylenol murders.
■ Sauvé, Jeanne
Identification First woman governor-general of Canada
Born April 26, 1922; Prud'homme, Saskatchewan
Died January 26, 1993; Montreal, Quebec

Sauvé was a trailblazing figure in Canadian politics. She was the first woman governor-general and the first woman member of Parliament from Quebec to be appointed to the federal cabinet.

Jeanne Sauvé was the daughter of Charles Albert Benoît and Anna Vaillant Benoît. She was a brilliant student at a convent school and at the University of Ottawa, and she became active as a teenager in a reformist Catholic students' movement, Jeunesse Etudiante Catholique. As the organization's president at the age of twenty, she moved to Montreal and was soon involved with a group of reform-minded intellectuals that included future prime minister Pierre Trudeau. In 1948, Jeanne Benoît married
Maurice Sauvé, who was later to serve as a federal cabinet minister in the 1960's. They went to Paris to study, and Jeanne worked in the youth section of the United Nations Educational, Scientific, and Cultural Organization (UNESCO); the couple returned to Montreal in 1952.

Sauvé worked as a broadcaster and journalist until 1972, when she was elected to the House of Commons as a Liberal member of Parliament from Ahuntsic, Montreal. She served in Prime Minister Trudeau's cabinet until 1979, first as minister of state for science and technology, then as minister of the environment, and later as minister of communications. As Speaker of the House of Commons from 1980 to 1984 and the member of Parliament for Laval-des-Rapides, Sauvé was criticized for early parliamentary mistakes, and as governor-general from 1984 to 1990, she was attacked for security controls she imposed on public access to the grounds of Government House, Rideau Hall.

Sauvé's most important achievement was her appointment as the first woman governor-general. In that position, her patriotism often caused controversy, especially among Quebec nationalists, who were frustrated by her conviction that Quebec could better achieve fulfillment by personal effort, persuasion, and reform rather than by anger, which she felt only destroyed political and social structures. Opponents disliked her farewell New Year's message to the nation as governor-general at the end of 1989, believing it was an improper interference in politics, while supporters found it an eloquent appeal to keep Canada whole. Although Sauvé had delivered the same message at her installation as governor-general in 1984 and on many other occasions, she was criticized for expressing those sentiments at the height of debate over the Meech Lake plan for constitutional reform. In 1990, Sauvé founded the Jeanne Sauvé Youth Foundation.

Impact Sauvé argued her case for a united Canada at every opportunity. Her political vision went beyond Quebec. She traveled widely and believed that if young people were just given a chance to communicate with other young people—East and West, Catholic and Protestant, French and English—the problems that arose from fear and a sense of foreignness would quickly be eliminated. Shortly before her term as governor-general expired, Sauvé established in her name a $10 million youth foundation to bring together young leaders from around the world.
Further Reading
Sauvé, Jeanne. Selected Decisions of Speaker Jeanne Sauvé, 1980-1984/Recueil de Décisions du Président Jeanne Sauvé, 1980-1984. Ottawa: House of Commons Canada, 1994.
Wallin, Pamela, and Tim Kotcheff. Jeanne Sauvé. Toronto: CTV Television Network, 1989.
Woods, Shirley E. Her Excellency Jeanne Sauvé. Toronto: Macmillan of Canada, 1986.
Martin J. Manning

See also Canada Act of 1982; Canada and the British Commonwealth; Canada and the United States; Meech Lake Accord; Trudeau, Pierre.
■ Savings and loan (S&L) crisis
The Event Deregulation leads to a national banking crisis
The S&L crisis of the 1980’s, during which one thousand savings and loan associations across the United States failed, was the nation’s largest-ever financial scandal and cost American taxpayers and depositors billions of dollars in bailouts, contributing to large budget deficits and possibly to the 1990’s recession. The events leading up to the S&L crisis began during the Jimmy Carter and Ronald Reagan administrations in the late 1970’s and the early 1980’s, when the government removed many of the federal regulations on banks in a laissez-faire approach designed to make banks more competitive on the open market. Until then, federal law had required savings banks to maintain maximum interest rates on savings accounts and had prohibited them from issuing checking accounts or credit cards. Savings banks also could not make commercial or nonresidential real estate loans. On the other hand, unlike savings banks, commercial banks could, when necessary, borrow from the Federal Reserve Bank. Under deregulation, savings and commercial banks became almost indistinguishable. One result of this situation was an immediate increase in savings and loan institutions, or S&L’s, which, under the supervision of the Federal Home Loan Bank Board and insured by the government’s Federal Savings and Loan Insurance Corporation (FSLIC), could for the first time freely venture into lucrative commercial real estate markets and issue credit cards. By 1980, there were
The Eighties in America
forty-six hundred such institutions in the United States. However, this trend quickly reversed when the S&L crisis began: By the end of the 1980's, there were only three thousand S&L's left, and five years later, that number had been reduced to less than two thousand.

The Crisis Looms Deregulation resulted in the growth of the U.S. economy during the 1980's, especially in the real estate sector. This growth enticed many of the underregulated S&L's to invest in high-risk, speculative ventures; it also tempted unscrupulous executives to defraud the regulatory agencies. Thus, when the real estate market faltered and fell, the S&L's found themselves in dire circumstances, because they owned real estate that was worth less than they had paid for it. Numerous bankruptcies ensued, and a great number of S&L depositors lost their money. Indeed, many lost their entire life savings.

In 1988, the Federal Home Loan Bank Board, whose function it was to supervise the S&L's, reported that fraud and insider abuse were the primary causes of the S&L failures. The head of Lincoln Savings of Phoenix, Arizona, Charles Keating, came to be known as the worst of the abusers. With full knowledge that Lincoln Savings was about to become insolvent, Keating removed $1 million from the S&L. He was eventually convicted of fraud, racketeering, and conspiracy and spent four and one-half years in prison. His conviction was then overturned, and he pleaded guilty to bankruptcy fraud to avoid a new trial. Before his conviction, however, Keating attempted to secure the aid of five U.S. senators in avoiding regulatory sanctions for his company. When the relationship between Keating and the senators became public, a national scandal ensued, and the senators became known as the Keating five.

Congress and various states attempted to respond to the S&L crisis during the early and mid-1980's, but their stopgap measures were insufficient. Eventually, in 1989, newly elected president George H. W. Bush engaged in a full-scale federal bailout of the industry. He estimated, to the shock of the country, that the government would have to spend between $50 billion and $60 billion. Congress enacted the Financial Institutions Reform, Recovery, and Enforcement Act (FIRREA). It ensured oversight of the S&L's and eliminated the Federal Home Loan Bank
Board, which had failed to effectively supervise the S&L industry. The government also created the Office of Thrift Supervision (OTS).

Three members of the Keating five, from left, Senators John Glenn, Dennis DeConcini, and John McCain, arrive at the Senate Committee on Ethics hearing room in November, 1990. The five faced charges of peddling their influence to help Charles Keating, the man at the heart of the S&L crisis. (AP/Wide World Photos)

Impact The 1980's were characterized by deregulation of industry in general, which had been a major plank in President Reagan's platform and was at the center of his economic philosophy. The deregulation of the banking industry that led to the S&L crisis, however, was begun by Reagan's predecessor, Jimmy Carter. The ensuing financial disaster became a major threat to the U.S. financial system. The crisis severely dampened the Republican Party's enthusiasm for deregulation, and it changed the terms of the debate about financial policy in the United States. It soon became clear that President Bush had grossly underestimated the cost of the S&L bailout. Economists have estimated the final cost of the crisis
at $600 billion. The savings and loan crisis, in turn, contributed to the severe budget deficits of the early 1990's, as well as to a major slowdown in both the finance and the real estate markets, and arguably to the 1990-1991 economic recession.

Further Reading
Adams, James R. The Big Fix: Inside the S&L Scandal—How an Unholy Alliance of Politics and Money Destroyed America's Banking System. New York: Wiley, 1990. The author, a well-known freelance journalist, examines the savings and loan crisis by looking at the history of American banking dating back to the Great Depression.
Barth, James, Susanne Trimbath, and Glenn Yago. The Savings and Loan Crisis: Lessons from a Regulatory Failure. New York: Springer, 2004. Claims to set the record straight about the poorly supervised banking practices of the 1980's that resulted
in the S&L crisis. Includes the contributions of a diverse group of former regulators and scholars.
Calavita, Kitty. Big Money Crime: Fraud and Politics in the Savings and Loan Crisis. Berkeley: University of California Press, 1999. Attempts to address the often confusing and conflicting accounts of the 1980's S&L crisis and posits effective arguments about its causes.
Talley, Pat L. The Savings and Loan Crisis: An Annotated Bibliography. Westport, Conn.: Greenwood Press, 1993. Annotated bibliography that includes more than 360 titles on the S&L crisis published between 1980 and 1992. Includes scholarly and popular articles and should appeal to anyone researching the crisis. Includes dissertations and both author and subject indexes.
M. Casey Diana

See also Bush, George H. W.; Business and the economy in the United States; Elections in the United States, 1988; Reagan, Ronald; Reaganomics; Scandals; Wall Street.
■ Scandals
Definition Public outcries in response to actual or perceived violations of law or morality, usually by public figures
While still recovering from the demoralizing effect of the 1970's Watergate scandal, the United States was rocked repeatedly in the 1980's by a wave of new scandals. These ongoing scandals deeply affected the nation's economy and government.

The 1980's was termed by some the "decade of decadence," and it was characterized by individualistic greed and selfishness that gave rise to the appellation "Me generation" and the famous "Greed is good" speech in Oliver Stone's Wall Street (1987). Actual or perceived greed and self-interest drove many of the political and economic scandals of the decade, including Abscam, the Iran-Contra affair, the savings and loan (S&L) crisis, and various insider trading scandals on Wall Street. These scandals not only brought into question the integrity of the institutions at the heart of the United States' power structure but also hampered the ability of those institutions to function.
Minor Political Scandals
In 1980, President Jimmy Carter lost his bid for reelection to Ronald Reagan. Carter was unpopular in 1980 as a result of domestic economic problems and his inability to bring home the Americans being held hostage in Iran. His reelection campaign was also hurt, however, by a scandal involving his flamboyant older brother, Billy Carter. Billy was revealed to be a paid agent of Libya who had received hundreds of thousands of dollars from dictator Muammar al-Qaddafi. Later, in 1983, it was revealed that President Reagan had "cheated" in his final debate with Carter by obtaining a copy of Carter's briefing notes for the debate. This scandal, referred to as "Debategate" in reference to the Watergate scandal, proved to be relatively minor compared to others that would affect Reagan and other national politicians. Also in 1983, a particularly lurid sex scandal resulted in a recommendation by the House Committee on Ethics that representatives Dan Crane and Gerry Studds be reprimanded for unsuitable relationships with teenage congressional pages. In 1987, presidential hopeful Senator Gary Hart lost his front-runner status after the Miami Herald reported on his relationship with model Donna Rice. After his admirable role in heading up the Tower Commission investigating the Iran-Contra affair, John Tower was nominated as secretary of defense in 1989, but a scandal involving habitual alcohol abuse, womanizing, and questionable financial ties to the defense industry destroyed his chances. Also in 1989, scandal forced Speaker of the House of Representatives Jim Wright to resign after an investigation found financial irregularities centering on improper gifts and the use of book sales to bypass limits on speaking fees, and a scandal involving Democratic Massachusetts representative Barney Frank hit the airwaves after Frank admitted his relationship with a male prostitute. These scandals, however, paled in comparison to the major scandals of the decade.
Abscam
In early 1980, news broke of a high-level Federal Bureau of Investigation (FBI) sting, code-named Abscam, that had begun in 1978. Undercover agents posing as Middle Eastern businessmen had offered government officials large sums of money in return for granting favors to a mysterious (and fictional) sheikh named Kambir Abdul Rahman. Ultimately, the sting, which was the first large-scale operation designed to trap corrupt public officials, resulted in
the conviction of a senator, six congressmen, the mayor of Camden, New Jersey, and members of the Philadelphia City Council. However, the controversy surrounding Abscam was generated not only by the corruption of the officials but also by accusations of entrapment on the part of the agents.

The Iran-Contra Scandal
During Reagan's second term, White House officials, at the suggestion of the Israeli government, covertly sold weapons to Iran in violation of the Arms Export Control Act (1976). In turn, $30 million in profits were used to fund the Contras, a group of right-wing guerrilla insurgents fighting the leftist Sandinista National Liberation Front in Nicaragua. The subterfuge was designed to circumvent the will of Congress, which had explicitly prohibited further support to the Contras. In November of 1986, a Lebanese newspaper exposed the operation and claimed that the arms sales to Iran were intended to influence Lebanon to release American hostages. President Reagan appeared on television and vehemently denied that the arms sale had taken place. A week later, he was forced to retract his statement, but he continued to deny that the sale of weapons had been intended to help secure the release of American hostages held in Lebanon. The scandal gained strength when Lieutenant Colonel Oliver North and his secretary Fawn Hall shredded incriminating documents and U.S. attorney general Edwin Meese III was forced to admit that profits from the arms sales were indeed directed to the Nicaraguan Contras. Faced with congressional wrath, in December of 1986 Reagan was forced to form a special investigative commission with former senator John Tower at its head. In 1987, the Tower Commission criticized Secretary of Defense Caspar Weinberger, National Security Adviser Admiral John Poindexter, and Poindexter's aide, Lieutenant Colonel North, as well as accusing the president of ineffectively supervising his aides.
The S&L and Insider Trading Scandals
The S&L crisis was accompanied by scandals when the depth of the greed and mismanagement that led to it was understood. By the early 1980's, the government had removed many federal regulations from the banking industry in order to make banks more competitive. This deregulation resulted in an increase in the number of S&Ls, as well as a broadening of the scope of their financial activities. S&Ls were allowed to invest in commercial real estate and to issue credit
cards for the first time, and they were eager to do so. The associations invested in high-risk properties, many of which lost their value. Bankruptcies ensued, and a great number of depositors lost their money—many lost their entire life savings. S&L head Charles Keating pleaded guilty to bankruptcy fraud after he removed $1 million from Lincoln Savings before the institution's imminent collapse. In addition, five United States senators were implicated in the national scandal for helping Keating. They became known as the Keating five.

In 1986, the Securities and Exchange Commission (SEC) investigated Dennis Levine, Ivan Boesky, and Michael Milken, among others, for willful violations of securities law and business ethics. The men had set up the greatest insider trading ring in history and in the process nearly destroyed Wall Street. Levine, a managing director at Drexel Burnham Lambert, led investigators to Boesky, who had made $200 million by betting on corporate takeovers after acquiring information from corporate insiders. In turn, Boesky informed on Milken, who had developed a fast-paced market for junk bonds.

Entertainment Scandals Although not nearly as serious as the nation's political scandals, show business scandals during the 1980's also captured the nation's imagination. In 1982, Saturday Night Live actor John Belushi died of an overdose of heroin and cocaine. A year later, the Beach Boys' drummer, Dennis Wilson, drowned in a drug-related accident, and in 1984, popular R&B singer Marvin Gaye was shot to death by his father during an argument. Actress Zsa Zsa Gabor was jailed in 1989 for slapping a police officer. The most scandalous crime of the decade, however, was the murders that same year of Kitty and Jose Menendez by their sons, Lyle and Erik Menendez, who shot their parents with shotguns while the couple watched television. The brothers were eventually found guilty of first-degree murder.

Impact The 1980 scandal that became known as "Billygate" helped Ronald Reagan gain the presidency, and the major scandals of the 1980's—Abscam, Iran-Contra, the S&L crisis, and the Wall Street insider trading scandals—resulted in public outrage that led to the enactment of legislation seeking to ensure that crimes such as these would not happen again. Deeply concerned about the possibility of law enforcement officials entrapping
individuals and damaging their reputations, the courts harshly criticized the FBI for its use of entrapment techniques in the Abscam scandal and in 1981 prompted Attorney General Benjamin Civiletti to issue The Attorney General's Guidelines for FBI Undercover Operations. The deceptiveness and dishonesty of the Iran-Contra scheme cast a dark shadow on the Reagan presidency and made the American people even more uneasy about their government. The S&L crisis, the largest financial scandal in American history, had an enormous impact on the U.S. economy. President George H. W. Bush estimated that government bailouts would cost the taxpayers $50 billion, but later accounts estimated that the crisis cost between $600 billion and $1.4 trillion. The Wall Street insider trading scandal led to a greater understanding that insider trading was not necessarily restricted to individuals, but that crime networks could also be established. The combined effect of all these scandals was to taint the federal government and corporations in the eyes of the American public, which came to expect a certain level of corruption from its elected officials and business leaders.

Further Reading
Barth, James, Susanne Trimbath, and Glenn Yago. The Savings and Loan Crisis: Lessons from a Regulatory Failure. New York: Springer, 2004. Written by a variety of banking industry regulators and business academics, this book illustrates how the inept banking practices of the 1980's led to one of history's greatest financial calamities. Davis, Lanny. Scandal: How "Gotcha" Politics Is Destroying America. New York: Palgrave Macmillan, 2006. In his highly praised book, Lanny Davis, special counsel to the Bill Clinton White House, argues that politics in America have become driven by vicious scandals involving partisan politicians, extremists, and the media, who are bent on destroying public officials. Chapter 4, "The Scandal Cauldron," is dedicated to the political scandals of the 1980's. Greene, Robert W. The Sting Man: Inside Abscam. New York: Dutton, 1981. Provides details of the FBI sting operation and how Melvin Weinberg brought about its successful conclusion. Kallen, Stuart. A Cultural History of the United States Through the Decades: The 1980's. San Diego, Calif.: Lucent Books, 1998. Discusses the major
scandals of the United States in the 1980's and the Iran-Contra affair in particular. Stewart, James B. Den of Thieves. New York: Touchstone Books, 1992. Stewart, a Pulitzer Prize-winning Wall Street Journal reporter, utilizes court documents, testimony, and interviews in this comprehensive account of the 1980's Wall Street insider trading scandals. Troy, Gil. Morning in America: How Ronald Reagan Invented the 1980's. Princeton, N.J.: Princeton University Press, 2005. Each chapter focuses on a year, from 1980 to 1989, during Ronald Reagan's campaign and presidency. The book details Reagan's reactions to the Abscam scandal, the Iran-Contra affair, the S&L crisis, and the Wall Street insider trading scandal. M. Casey Diana See also
Abscam; Bush, George H. W.; Congress, U.S.; Congressional page sex scandal of 1983; Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988; Hart, Gary; Iran-Contra affair; Junk bonds; Meese, Edwin, III; North, Oliver; Poindexter, John; Reagan, Ronald; Savings and loan (S&L) crisis; Tower Commission; Watt, James; Weinberger, Caspar; Williams, Vanessa; Wright, Jim.
■ Schnabel, Julian Identification American artist Born October 26, 1951; Brooklyn, New York
Schnabel experienced meteoric success in the New York art scene and became a lightning rod for art criticism. Julian Schnabel received an art degree from the University of Houston in 1973. By 1981, the brash and self-promoting artist had unprecedented parallel shows at the Mary Boone and Leo Castelli galleries in New York City. All of his works were sold before the shows opened. He exhibited extensively in America and Europe throughout the 1980's and had several "retrospectives" before the age of forty. Works that brought three thousand dollars at the beginning of the 1980's sold for upward of sixty thousand dollars only a few years later. Schnabel's paintings were very large, often ten feet by fifteen feet or more. They combined a return to figuration—often quoting religious or mythological
themes—with highly gestural paint application. Both of these characteristics were frequently "appropriated" from earlier works of art. Schnabel emphasized the material nature of his paintings with the supports on which he painted and by gluing objects to the surface. The "plate paintings" (in which he painted over broken crockery attached to a wooden base) became sensationalized in the popular press. These works first appeared at the end of the 1970's, after Schnabel had seen the architectural works of Antoni Gaudí in Spain in 1977, but they became well known in the early 1980's. His work is perhaps best compared with that of such European artists as Anselm Kiefer. New York art critics dubbed Schnabel a neoexpressionist, and many credited him with a "rebirth" of painting after the periods of minimalist and conceptual art. In critical writing of the 1980's, he was the focus of concerns about key art issues of the
decade: the role of appropriation, the relation of high art and mass culture, gallery business practices and art market hype, the remasculinization of the art scene, and the relation between artistic success and authenticity. In the popular press, Schnabel became the very image of the 1980's artist and of the financial excess of the decade. Impact Marketing of Schnabel's paintings brought him spectacular success and fueled a boom in the art market. For many critics and much of the public, such success called into question Schnabel's sincerity as a creative artist and highlighted the basically mercantile nature of the business of art. Full evaluation of Schnabel's work as a painter may fall to future generations. In later years, he became a successful screenwriter and filmmaker as well. Further Reading
Eccher, Danilo, curator. Julian Schnabel, 22 novembre 1996-30 gennaio 1997. Bologna, Italy: Galleria d'Arte Moderna, 1996. Hollein, Max, ed. Julian Schnabel: Malerei/Paintings 1978-2003, 29 January-25 April 2004. Frankfurt, Germany: Schirn Kunsthalle, 2004. Pearlman, Alison. Unpackaging Art of the 1980's. Chicago: University of Chicago Press, 2003. Jean Owens Schaefer See also
Art movements; Business and the economy in the United States; Consumerism; Neoexpressionism in painting.
■ School vouchers debate Definition
Controversy over a proposal to use tax dollars to subsidize private school tuition
A debate begun in earnest during the 1950’s found renewed vigor during the 1980’s, as education reformers proposed government-funded tuition payments that would divert public funding to private or parochial schools. Those in favor of vouchers argued that they would provide parents with the financial power to choose the best schools for their children. Those opposed to them argued that they represented an abandonment of already troubled public schools, which could ill afford to lose any federal funding. Although the idea of school vouchers did not gain substantial ground during the 1980’s, a confluence of political, social, and economic factors kept the idea alive through the decade.
Proponents of school vouchers saw these government-funded tuition grants as a means of providing choice and competition in education. They believed that the public school system was a socialized monopoly, that its problems resulted from this fact, and that competition with private schools could cure those problems. Opponents warned that diverting tax dollars away from public schools and into the hands of private providers was no way to correct the problems facing the American public school system. The debate was shaped by the fact that it was clearly not possible to fix the U.S. education system quickly, so even parents who believed in improving public schools for future generations might not want to send their children to those schools if they had not yet improved. Not a New Debate Although the term "voucher" may have been a new addition to the debate, the core idea was not new. In the founding days of the nation, Scottish economist Adam Smith, in his seminal work An Inquiry into the Nature and Causes of the Wealth of Nations (1776; commonly known as The Wealth of Nations), had called for the government to give money directly to parents. This money would be used to purchase educational services in order to prevent the development of a monopoly over the provision of such services. In 1955, Nobel Prize-winning economist Milton Friedman argued that the existence of the monopoly Adam Smith had predicted two centuries earlier was leading to inefficiencies and a lack of innovation. He believed the quality of education would improve if education was driven by market forces. It was no coincidence that the voucher idea resurfaced in the late 1950's and early 1960's. During the height of the Civil Rights movement, public schools were seen as avenues of opportunity and mobility and had become the major focal point of demands for change, community control, and racial equality. Thus, calls for privatization in education reemerged, both among those who wished to avoid the social and political unrest they perceived to exist in public schools, and among those seeking greater educational opportunities than they believed were available in those schools. A Flurry of Activity In 1983, the U.S. Department of Education released A Nation at Risk, which warned of the impending economic doom its authors forecast for the country as a direct result of a steady erosion of student achievement in American public
schools. The report produced a sense of urgency in education not experienced since the launch of the Soviet satellite Sputnik 1 in 1957. Thousands of initiatives were launched, as educators and elected officials responded to the perceived crisis. Among the resulting reforms were increases in teacher pay coupled with decreases in class size, tougher standards for teacher preparation programs and certification, revamped curricula, district consolidations aimed at efficiencies of scale in management, schools-within-schools programs that reduced the size of student bodies while allowing for larger building sizes, and the revamping of school calendars, including the introduction of year-round education. Missing from all of this activity, however, were the vouchers advocated by Friedman and others. Efforts to bring vouchers into the picture were blocked by school administrators, teacher unions, and liberal reformers unwilling to abandon the public school system to market forces. The fundamental issue in the debate was whether or not public tax dollars should be used to pay for a private education. Conservatives, who were committed to a free market approach to education, supported vouchers as a tool of school choice. In keeping with his free market economic program, dubbed "Reaganomics," President Ronald Reagan spoke consistently in favor of school vouchers, private tuition tax breaks, and other public subsidies for private tuition. Religious conservatives and Catholic advocates rallied behind the voucher idea, but the concept was tarnished when pro-segregationists began to support vouchers as a means of avoiding integrated schools. Federal courts ruled against vouchers in several cases, giving the impression that voucher supporters had a racist agenda. In addition, because vouchers might be used to pay for parochial, as well as secular, private schools, some opponents believed that they would violate the separation of church and state. Shift in the Late 1980's
Calls for privatization reemerged as more demands were placed on public education to address social inequalities related to race and class. Many social programs resulting from the Civil Rights movement and put in place during the late 1960’s and the 1970’s had focused on public schools. Federal, state, and local funds were directed at reducing the degree of inequality, and public schools became the focus of these efforts. During the Reagan era, most of these programs were ended.
The call for a voucher system continued, but the arguments shifted. During the 1980's, many in the American middle class continued to rely on public education. "White flight" from major urban centers into suburbia allowed public education to provide for middle-class needs, because the concentration of relative wealth in the suburbs meant that public schools located there were better funded than were schools in urban, working-class districts. As long as the educational needs of the middle class were met, the perceived need to privatize education was minimal. During the 1980's, the Reagan and George H. W. Bush administrations pushed for educational reform. Since their conservative, middle-class supporters continued to be served by public education, reform efforts shifted away from privatization. Still, the concept of a voucher program fit well into some versions of the political ideology advocating reduced government involvement in family life—despite the fact that it entailed more federal intervention in local affairs. Vouchers were seen as a means of enabling parents to send their children to whichever schools would best meet their children's needs, regardless of whether or not those schools charged tuition. However, during the 1980's, little action was seen on this front. Congress rejected Reagan administration plans to grant tuition tax credits, and the school choice movement remained active primarily among ultraconservatives. Impact Many opponents of voucher programs pointed to educational funding gaps as the principal reason for American educational woes. Public schools were thought to fail poor and minority students for a number of reasons, including the national structure for allocating public school resources between and among schools and school districts. Voucher programs, according to the critics, would have further exacerbated the situation. The worst schools, those most in need of resources, would be the least attractive to potential students. Therefore, they would be the schools most likely to suffer low enrollments and financial cuts if parents were able to use vouchers to send their children to private schools. Thus, it was argued, by channeling money away from the poorest public schools and instead providing public subsi-
dies to private schools, vouchers would help individual students at the expense of the overall system. The parents of those individual students, however, were likely to support the voucher idea, because their first loyalties were to their children, not to their public schools. Further Reading
Friedman, Milton. Capitalism and Freedom. Chicago: University of Chicago Press, 1962. Friedman's seminal work provides the definitive statement of his immensely influential economic philosophy. Gross, B., and R. Gross. The Great School Debate. New York: Simon & Schuster, 1985. Claims that the popular notion that current education reforms such as privatization, vouchers, and charter schools are responses to an identified crisis in public education requires further scrutiny. Kirkpatrick, David. School Choice: The Idea That Will Not Die. Mesa, Ariz.: Blue Bird, 1997. Detailed overview of the history of the school voucher concept, including an examination of the various iterations of voucher programs and plans. Levin, Henry. Privatizing Education. Boulder, Colo.: Westview Press, 2001. Explores the voucher debate from a perspective at once domestic and global, demonstrating how it is uniquely American while not necessarily based on educational philosophy. National Commission on Excellence in Education. A Nation at Risk. Washington, D.C.: U.S. Department of Education, 1983. Landmark study of American education during the early 1980's. Decries the state of education and warns of a "rising tide of mediocrity that threatens our very future as a Nation and a people." Ravitch, Diane. Left Back: A Century of Failed School Reforms. New York: Simon & Schuster, 2000. Provides an overview of a century of "progressive" reforms in the K-12 educational sector. Rick Pearce See also Consumerism; Education in Canada; Education in the United States; Magnet schools; Mainstreaming in education; Multiculturalism in education; Nation at Risk, A; National Education Summit of 1989; Reagan, Ronald; Reaganomics.
■ Schreyer, Edward Identification
Governor-general of Canada from 1979 to 1984 Born December 21, 1935; Beausejour, Manitoba Schreyer was Canada's ceremonial vice-regal head from 1979 to 1984, years that saw two general elections and the patriation of Canada's constitution. Edward Schreyer, a Manitoban who had served as premier of that province as a member of the leftist New Democratic Party, was appointed governor-general by Prime Minister Pierre Trudeau in 1979. Unusually, Schreyer was not a member of Trudeau's own party, the Liberals, but he had given key support to Trudeau during the 1970 crisis over invocation of the War Measures Act against Québécois sovereignists, and he was trusted by the prime minister. Trudeau's party was defeated by the Conservatives in 1979, and Joe Clark began a brief tenure as prime minister, but nine months later, a resurgent Trudeau led his party to victory and regained the prime ministership. As a result, Schreyer and Trudeau served jointly during most of Schreyer's tenure as governor-general. Schreyer was the youngest governor-general appointed up to that point, and was only the fifth to be Canadian-born. He was the first governor-general from Manitoba and the first of Ukrainian or German descent. Rideau Hall in Ottawa, where Schreyer and his wife, Lily, resided during Schreyer's term as governor-general, is nominally the official Canadian residence of the monarch of Canada—Queen Elizabeth II, during Schreyer's term—and Schreyer hosted Queen Elizabeth twice, in 1982 and 1983. The first visit was the most important, as the Queen was visiting as part of the patriation of Canada's constitution as proclaimed by the Canada Act of 1982, spearheaded by Prime Minister Trudeau. On April 17, 1982, the queen proclaimed the act, and Schreyer thus became the first governor-general to preside over a completely self-governing Canada. Schreyer participated energetically in the cultural role of the governor-general, which included sponsoring awards and prizes for achievement in the arts, sciences, and other areas of public life. Among the writers who received the Governor General's Award for Fiction during Schreyer's term were George Bowering, Mavis Gallant, and Guy Vanderhaeghe. Schreyer also concerned himself with the
culture of his own Ukrainian heritage, presiding over the founding of the Center for Ukrainian Studies at the University of Toronto in 1983. Impact Schreyer was a pivotal figure in the transition of the governor-general’s office from a symbol of authority and precedence to an expression of the multiplicity of Canadian life. Schreyer, in standing for the whole of Canada, made clear that the whole of Canada was an abstract concept that could be sustained only by the sum of its parts. This position paralleled a growing sense in the 1980’s that the nationalism of the 1960’s and 1970’s, which had replaced colonial allegiance to Britain with a monolithic vision of Canadian identity, had run its course. Canadians instead began to embrace a spectrum of national identities, while striving to find and subscribe to fundamental values held in common. Schreyer’s combination of energy, modesty, populism, and moderate liberalism made him an apt representative of Canada as it passed through the final years of the Trudeau era. Further Reading
Doern, Russell. Wednesdays Are Cabinet Days: A Personal Account of the Schreyer Administration. Winnipeg, Man.: Queenston House, 1981. McWhinney, Edward. The Governor General and the Prime Ministers. Vancouver: Ronsdale Press, 2005. Nicholas Birns See also Aboriginal rights in Canada; Clark, Joe; Education in Canada; Elections in Canada; Trudeau, Pierre.
■ Schroeder, Pat Identification
U.S. representative from Colorado from 1973 to 1997 Born July 30, 1940; Portland, Oregon Schroeder was a liberal representative willing to advocate causes that many of her Democratic male colleagues in the House of Representatives refused to consider. As a member of the House Committee on Armed Services, she regularly challenged the Reagan administration’s military policies. She considered running for president in 1988. Democrat Pat Schroeder was elected to Congress in 1972. Instead of seeking a seat on a committee dealing with women’s issues, she requested a seat on the
House Committee on Armed Services. The House leadership granted her request, and she became the first woman to serve on that committee. In the 1980's, she used her position on the committee to challenge the Ronald Reagan administration's defense policies. In addition to addressing defense issues, Representative Schroeder worked on issues relating to women and children, introducing bills to eliminate gender inequities in wages and promotions, to provide funds to open shelters for abused children, and to expand Head Start programs. As the chair of the National Task Force on Equal Rights for Women, she advocated federal subsidization of abortions. After a number of women's health clinics were violently attacked by antiabortion protesters in late 1984, Schroeder regularly appeared on television news programs to denounce the violence. Schroeder used her position on the Armed Services Committee to change the role of women in the military. Her Military Family Act of 1985 helped improve the situation of military families. In 1988, Schroeder was appointed chair of the Defense Burden Sharing Panel, a task force of the Armed Services Committee. She also introduced legislation to provide women a greater chance of participation in all areas of the military. In 1987, Schroeder spent five months traveling around the United States seeking support for a possible campaign for president of the United States in 1988. She withdrew from the race at an emotional press conference on September 28, 1987. She burst into tears as supporters chanted "Run, Pat, Run," and the moment led to her being lampooned on the satirical late-night television program Saturday Night Live. On the program, Nora Dunn portrayed Schroeder attempting to moderate a Democratic primary debate, while repeatedly bursting into tears.
Pat Schroeder tearfully announces on September 28, 1987, that she will not seek the Democratic presidential nomination. (AP/Wide World Photos)
Impact Schroeder worked to keep the interests of women and children at the forefront of debate in the U.S. House of Representatives. She was one of the cofounders of the Congressional Caucus for Women's Issues. She also challenged the Reagan administration, adding the well-known phrase "Teflon President" to the American political lexicon. Frustrated by her inability to make her charges stick, she once said of President Reagan, "He's just like a Teflon frying pan: Nothing sticks to him." Further Reading
Lowy, Joan A. Pat Schroeder: A Woman of the House. Albuquerque: University of New Mexico Press, 2003. Schroeder, Pat. Twenty-Four Years of House Work—and the Place Is Still a Mess: My Life in Politics. Kansas City, Mo.: Andrews McMeel, 1998. John David Rausch, Jr. See also Abortion; Congress, U.S.; Elections in the United States, 1988; Glass ceiling; Liberalism in U.S. politics; Military ban on homosexuals; Military spending; Reagan, Ronald; Women in the workforce.
■ Schwarzenegger, Arnold Identification Champion bodybuilder and actor Born July 30, 1947; Thal, Austria
Schwarzenegger began a career as an action film star in the 1980’s, performing in a series of successful films. By the end of the decade, he was poised to become a superstar. After winning his first Mr. Universe bodybuilding title in 1967, Arnold Schwarzenegger moved to the United States the following year to pursue business opportunities, including a movie career. His first and last movies of the 1970’s, Hercules in New York (1970) and The Villain (1979), were both commercial and critical failures. However, he won a Golden Globe as New Star of the Year—Actor for a supporting role as a bodybuilder in the film Stay Hungry
(1976), and he was featured in the documentary Pumping Iron (1977). In 1980, Schwarzenegger won his last major bodybuilding title and was cast as the husband of the title character in the television movie The Jayne Mansfield Story (1980). Since Mansfield's husband was born in Hungary and became a champion bodybuilder, Schwarzenegger was an obvious choice. Schwarzenegger finally found success as the male lead of a theatrical film in Conan the Barbarian (1982). A series of successful action movies, beginning with The Terminator (1984), made him an action star, although none of the films surpassed the milestone $100 million mark at the box office. His only film of the 1980's to achieve that level of success was a comedy, Twins (1988). In 1987, Schwarzenegger received a star on Hollywood Boulevard's Walk of Fame and was named Male Star of the Year by the National Association of Theater Owners. Forbes magazine listed him as one of the ten wealthiest entertainers in the United States in 1989. By that time, several of his films had achieved cult status, particularly The Terminator, which was initially only a modest success but would spawn two blockbuster sequels in later years. Schwarzenegger became an American citizen in 1983 and laid the groundwork for his political career when he spoke on Ronald Reagan's behalf at the 1984 Republican National Convention and campaigned for George H. W. Bush in 1988. He joined the United States' most prominent political family in 1986, when he married Maria Shriver, niece of President John F. Kennedy and Senators Robert and Edward Kennedy. Impact Schwarzenegger spent the 1980's building a following among die-hard science-fiction and action fans, earning a reputation as one of the major action heroes of the decade. Most of his films were only modest successes, but he proved to be a consistent leading man and an iconic figure. At the end of the decade, he was poised to achieve superstar status, and during the following decade, he starred in a steady stream of blockbusters. Further Reading
Arnold Schwarzenegger as Conan the Barbarian in 1984’s sequel, Conan the Destroyer. (AP/Wide World Photos)
Andrews, Nigel. True Myths: The Life and Times of Arnold Schwarzenegger. Secaucus, N.J.: Birch Lane Press, 1996. Flynn, John L. The Films of Arnold Schwarzenegger. Rev. ed. Secaucus, N.J.: Carol, 1996.
Leamer, Laurence. Fantastic: The Life of Arnold Schwarzenegger. New York: St. Martin’s Press, 2005. Thomas R. Feller See also Action films; Film in the United States; Science-fiction films; Terminator, The.
■ Science and technology Definition
The physical, biological, earth, and computer sciences and the practical technological innovations developed from scientific advances
In the 1980’s, advances in science and technology included the creation and spread of the Internet, biotechnology, and the concept of DNA fingerprinting. During the 1980’s, the implications of advances in science and technology, along with a series of highly visible public disasters, caused the public to question science and technology and led to the growth of the environmental movement, as well as numerous public policy changes designed to protect humans from toxic and radioactive waste. At the same time, advances in information technology changed everyday existence in multiple fields of employment, including education and business. Computers and Information Technology
Throughout the decade, computing and information technologies rapidly and radically transformed, starting in 1980, when Seagate Technology produced the first hard disk drive for computers. The disk could hold five megabytes of data. In 1981, International Business Machines (IBM) released its first personal computer, which ran on a 4.77 megahertz Intel 8088 microprocessor and MS-DOS system software. The company responsible for designing MS-DOS, Microsoft, was only six years old. Simultaneously, Adam Osborne released the first “portable” computer, the Osborne I. It weighed twenty-four pounds and cost $1,795. That same year, Apollo Computer released the first workstation, the DN100. In 1982, the popularity of personal computers rose dramatically with the introduction of the Commodore 64, which sold for $695 and came with 64 kilobytes of random-access memory (RAM). The Commodore 64 would become the best-selling single computer model of all time. Time magazine
named the computer its "Machine of the Year," saying "Several human candidates might have represented 1982, but none symbolized the past year more richly, or will be viewed by history as more significant, than a machine: the computer." At the beginning of the 1980's, ARPANET, the network that would eventually give rise to the Internet, was more than a decade old. In 1983, that network split into ARPANET and MILNET, creating a civilian branch as well as a military one. The split had been made possible by the introduction of the networking standard Transmission Control Protocol/Internet Protocol (TCP/IP) in 1980. By the mid-1980's, the expanding network of networks built around ARPANET had become known as the Internet, and in 1985 the National Science Foundation assumed financial responsibility for its major civilian backbone. The Internet would become a dominant force in both American and global culture, and science-fiction writer William Gibson coined the term "cyberspace" as he explored the possible futures created by advances in information technology. At the same time, in 1983, Microsoft announced Windows—though it would not ship until 1985—and released Microsoft Word. Some 450,000 floppy disks with demonstration copies of Word were distributed as inserts in PC World magazine. The Parallel Computing Initiative, funded by Livermore Laboratory, redefined high-performance computing, starting in 1989. Using massive, coordinated clusters of microcomputers, the project was able to outperform custom-designed supercomputers. As technology to link computers cooperatively advanced, computer clusters would become preferred over individual supercomputers for most processor-intensive high-performance computing tasks. It followed in the footsteps of research conducted by Daniel Hillis of Thinking Machines Corporation, whose machine used sixteen thousand processors and completed billions of operations per second. As the popularity of personal computers increased in the 1980's, issues regarding the social and ethical implications of information collection and management came to light. Such issues included privacy rights of consumers and citizens, whose personal information was contained in an ever-increasing number of databases as the decade progressed. The spread of computers also raised issues dealing with intellectual property rights, software piracy, and other forms of computer crime such as "hacking" into private databases and spreading computer viruses. The first such virus, written by Rich Skrenta,
was released in 1982. In 1986, the United States passed the Computer Fraud and Abuse Act; the first person prosecuted under the new law was college student Robert Tappan Morris, Jr., author of the self-replicating program that became known as the Morris worm. The son of a computer security expert, Morris said he had been motivated by boredom to create the worm and set it loose on the Internet, disrupting an estimated six thousand of the sixty thousand hosts then connected to the Internet. Biology, Genetics, and Medicine The face of the biological sciences changed over the course of the 1980's, particularly in the field of biotechnology. In 1980, Stanley Cohen and Herbert Boyer were granted a patent on a gene-cloning process that made it possible to produce human insulin from genetically modified (GM) bacteria. In a similar advance, a vaccine for hepatitis B was created through genetic modification. Genetic modification was not universally popular, and in reaction to its development, an anti-biotechnology movement formed under the leadership of Jeremy Rifkin, who argued against awarding patents for GM bacteria and opposed the transfer of genes from one species to another and the release of modified bacteria into the environment. Rifkin delayed the release of the first GM bacteria into the environment for four years. Eventually, in 1987, the bacteria were released and used to make potato plants more frost resistant. In 1984, Alec Jeffreys developed a deoxyribonucleic acid (DNA) fingerprinting technique in the course of his research into human genes. The previous year, Kary Mullis had begun research that led to the development of the polymerase chain reaction (PCR) technique, which allowed scientists to amplify the DNA from even a single cell, producing enough identical copies for analysis. In 1987, police began using DNA fingerprinting to investigate crime. In 1983, the virus that would come to be known as human immunodeficiency virus (HIV) was first isolated. The virus was linked to acquired immunodeficiency syndrome (AIDS). In 1985, the Food and Drug Administration (FDA) approved a blood test for HIV infection that could be used on blood supplies. In 1987, the FDA approved the anti-AIDS drug azidothymidine (AZT). The decade also witnessed the foundation being laid for one of the most prominent advances in human science, the Human Genome Project. Livermore
Laboratory and Los Alamos collaborated to build human-chromosome-specific gene libraries, developing advanced chromosome-sorting capabilities. The project's goal was to map the entire human genome; it was hailed as history's most ambitious biological project. Other advancements in medicine in the 1980's included the first use of artificial skin to treat burns (1981); the first heart-lung transplant (1981); the first use of a permanent artificial heart (1982); the first release of a patient with an artificial heart from the hospital (1985); and the development of Prozac, the first selective serotonin reuptake inhibitor, by the Eli Lilly Corporation (1987). One notable medical failure of the 1980's occurred in October, 1984, when Doctor Leonard L. Bailey used the heart of a baboon to replace the failing heart of an infant who became known to the media as Baby Fae. The transplant operation was both unsuccessful and possibly illegal. Although Baby Fae seemed to do well for a few days, her body rejected the new organ, and public scrutiny was drawn to the issue of human experimentation. Energy Crisis and Environmental Movement
Driven by the previous decade's energy crisis, scientists in the 1980's explored alternative sources of energy, including biodiesel, hydrogen fuel, and wind energy. A secondary impetus of this research was the rising risk of environmental pollution and an increasing public awareness of the dangers such pollution posed to human well-being. Antinuclear and environmental activists had also drawn significant attention to the difficulties of safely disposing of radioactive and toxic waste. The environmental justice movement spread in the United States and across the world, as it sought to combat the social structures that had thwarted environmental reforms in the past. One of the major impetuses of the movement was the Warren County polychlorinated biphenyl (PCB) controversy, in which the state of North Carolina created a landfill to contain soil contaminated by more than thirty thousand gallons of PCB-laden oil that had been illegally sprayed along the state's roadsides. The state chose to locate the landfill in a primarily African American county, leading to charges of environmental racism, particularly after it was revealed that the site was not hydrologically suitable for such disposal. Media coverage of environmental disasters continued to increase public awareness of the dangers
posed to the environment by new and existing technologies. It also increased the demand for new technologies that could help prevent further disasters, as well as for new scientific methods for measuring both the risks of disasters occurring and the precise effects of the disasters that did occur. In 1984, a Union Carbide pesticide plant released forty tons of methyl isocyanate into the atmosphere when a holding tank overheated, immediately killing nearly three thousand people in the Indian city of Bhopal and leading to an estimated fifteen thousand to twenty-two thousand subsequent deaths. Union Carbide worked with other companies to create the Responsible Care system, designed to commit chemical companies to act responsibly toward humans and the environment. In 1985, the city of Times Beach, Missouri, was completely evacuated. In 1982, the Environmental Protection Agency (EPA) had discovered dangerous levels of dioxin, which it called "the most dangerous chemical known to man," in the town's soil. The chemical was believed to be a by-product of the production of hexachlorophene. Also in 1985, a hole in Earth's protective ozone layer was discovered by researchers in the Antarctic, whose measurements indicated a steep drop in ozone levels over a span of a few years, far larger than any scientist had predicted. The main source of ozone depletion was determined to be the photodissociation of chlorofluorocarbon compounds (CFCs). In 1987, forty-three countries signed the Montreal Protocol, in which they agreed immediately to freeze CFC production at its current levels and to reduce their production levels by 50 percent by 1999. In 1986, a disaster at the Chernobyl nuclear power plant caused by the explosion of a reactor depopulated areas of the Ukraine and spewed radioactive material into the atmosphere, exposing parts of the Soviet Union, northern Europe, western Europe, and the eastern United States to radioactive fallout. The emphasis on finding clean forms of energy and ways to reduce hazardous waste in the 1980's sometimes created problems. Eagerness to find alternative energy sources led to results that were difficult to re-create. In March, 1989, researchers Stanley Pons and Martin Fleischmann announced the discovery of cold fusion at the University of Utah. Cold fusion, known to scientists as low-energy nuclear reactions, involved the creation of nuclear reactions near room temperature and pressure using simple, low-energy
devices. When two light nuclei were forced to fuse, they formed a heavier nucleus, releasing considerable amounts of energy. After a short period of popular acclaim and widespread media attention, the researchers were accused of being sloppy in their initial research when other laboratories were unable to reproduce their results. Efforts by the United States Department of Energy to reproduce the results were similarly unsuccessful, although the department would continue to study the possibility of cold fusion for the next seventeen years. Space Exploration Early in the 1980's, the U.S. space program seemed to be going strong. On November 12, 1980, the Voyager 1 spacecraft flew past Saturn and sent back the first high-resolution images of the planet. As emphasis shifted away from spacecraft for one-time exploration missions, the space shuttle program was created in order to employ reusable spacecraft for near-Earth missions. The following year, on April 12, the first space shuttle, Columbia, was launched. In 1984, astronauts Bruce M. McCandless II and Robert L. Stewart made the first untethered spacewalk. Early in 1986, the Voyager 2 space probe made the first close flyby of Uranus. However, public fears of "big science"—massive, expensive government programs that were perceived to divert money from social programs—led to the gradual downscaling of the National Aeronautics and Space Administration (NASA). When the space shuttle Challenger exploded seventy-three seconds after takeoff in 1986, killing the seven astronauts aboard, public confidence in the merits of such a program was further shaken. Other countries took up some of the slack: In February, the Soviet Union launched the space station Mir, the first consistently inhabited long-term research facility in space, and the Japanese spacecraft Suisei approached and analyzed Halley's comet. NASA did not resume space shuttle flights until 1988. The U.S. space program would receive an unexpected boost in the form of President Ronald Reagan's Strategic Defense Initiative (SDI), first proposed in 1983. Its intent was to use ground- and space-based antiballistic missile weapons systems to defend the United States from nuclear attack. SDI was in part spurred by advances in laser technology made by Livermore Laboratory. In 1984, the world's most powerful laser, nicknamed "Nova," became
operational. Ten beams produced up to 100 trillion watts of infrared power. The same year, the Strategic Defense Initiative Organization (SDIO) was established to oversee the program, led by Lieutenant General James Alan Abrahamson, formerly the head of NASA's space shuttle program. While SDI was never fully developed, the program's research would provide significant advances in antiballistic technology, and its scientists would explore weapons that included hypervelocity rail guns, space-based relay mirrors, neutral particle beams, and chemical and X-ray lasers. In 1988, Soviet and American scientists worked together to conduct measurements of nuclear detonations at testing sites in both countries in the Joint Verification Experiment (JVE). A result of the Reagan mantra, "trust, but verify," the scientists' efforts were intended to develop and improve verification technologies that would be used to monitor compliance with treaties such as the Threshold Test Ban Treaty (1974) and the Peaceful Nuclear Explosions Treaty (1976). Personal Technology
The 1980's saw the advent of many technological devices that would affect consumers. While the personal computer was one such advance, others included cell phones, compact discs (CDs), and car alarms. Cell phones began to be manufactured by large companies in the 1980's. Motorola introduced its DynaTAC phone to the public in 1983, a year after the Federal Communications Commission (FCC) authorized commercial cellular service. The phone weighed nearly two pounds and cost $3,995. In 1984, American Telephone and Telegraph (AT&T) was split into seven "Baby Bells" as a result of an antitrust suit brought by the Department of Justice. Each of the seven telephone companies created by the split had its own cellular business, creating difficulties in expansion as well as diminishing the marketing power available to each company to promote cellular phones. The FCC began awarding licenses for the mobile telephone by lottery, rather than by comparative hearings. Many lottery winners chose to sell their licenses to larger companies, which paid well for the acquisition. In 1987, the FCC declared that cellular licensees could use additional cellular services, allowing companies to begin to employ alternative cellular technologies. By the end of the decade, there were over a million cell phone subscribers in the United States.
Another major change in the technology available to private consumers came in the form of CDs, which were introduced commercially in 1982 when the first music album was released on CD, the Swedish pop group ABBA's The Visitors. CDs, co-invented by Philips Electronics and the Sony Corporation, would create a revolution in digital audio technology. The discs were enthusiastically received, initially by classical music enthusiasts and audiophiles, and later—as the price of CD players dropped—by music fans in general. CDs were originally marketed specifically to store sound, but their potential for storing other types of digitally encoded data soon became apparent. In 1985, Sony and Philips developed the first compact disc read-only memory (CD-ROM). It stored the entirety of Grolier's Electronic Encyclopedia—nine million words—in only 12 percent of the space available on the disc. One technology that received a legal boost in the late 1980's was the car alarm: The New York State legislature passed a law requiring insurance companies to offer a 10 percent discount to vehicle owners whose car was protected with such an alarm. When other states followed suit, the number of vehicles equipped with this technology rose sharply. In 1989, Silicon Graphics unveiled the technology of virtual reality—computer-generated, three-dimensional environments containing elements with which users could interact—at a trade show in Boston. First intended for tasks such as flight simulation, the technology was quickly seized upon by game designers as well. Impact While many advances in science and technology occurred in the course of the 1980's, the most significant was arguably the steady stream of improvements in information technology. The Internet would become a world-spanning entity allowing instantaneous transmission of information and spurring globalization. At the same time, advances in DNA and gene identification would affect fields ranging from agriculture to medicine. Advances in personal technology would create increasing acceptance and use of electronic devices by consumers, paving the way for the various handheld computing devices of the next two decades. Further Reading
Allan, Roy A. A History of the Personal Computer: The People and the Technology. London, Ont.: Allan, 2001. Sometimes quirky but usually informative
look at the figures and technologies of the personal computing movement. Benedick, Richard. Ozone Diplomacy. Cambridge, Mass.: Harvard University Press, 1991. Provides a detailed examination of the discovery of the ozone hole and the negotiations that led to the Montreal Protocol. Cook, Richard C. Challenger Revealed: An Insider's Look at How the Reagan Administration Caused the Greatest Tragedy of the Space Age. New York: Avalon, 2007. Traces the history of the space shuttle's development and deployment, describing the equipment malfunctions and internal NASA decision making that led to the crash. Erickson, Jim, and James Wallace. Hard Drive: Bill Gates and the Making of the Microsoft Empire. New York: John Wiley & Sons, 1992. A thorough examination of the company; provides analysis of Microsoft's initiatives and releases throughout the 1980's, including profiles of competitors. Fitzgerald, Frances. Way Out There in the Blue: Reagan, Star Wars, and the End of the Cold War. New York: Touchstone Press, 2000. Explores the history of the Strategic Defense Initiative and President Reagan's attempt to provide the United States with protection from nuclear attack. Gregory, Jane, and Steve Miller. Science in Public: Communication, Culture, and Credibility. New York: Plenum Press, 1998. Discusses the ways in which science came to the attention of the American public in the 1980's. Reilly, Philip R. Abraham Lincoln's DNA and Other Adventures in Genetics. Cold Spring Harbor, N.Y.: Cold Spring Harbor Laboratory Press, 2000. Includes explanations of early advances in genetics and DNA, including polymerase chain reaction and mutation analysis, and discusses applications of those techniques. Shilts, Randy. And the Band Played On: Politics, People, and the AIDS Epidemic. New York: St. Martin's Press, 1987. This exhaustive account of the spread of AIDS in the United States discusses how the search for a cure shaped medicine and epidemiology. Cat Rambo See also
AIDS epidemic; Apple Computer; Astronomy; Bioengineering; CAD/CAM technology; Cancer research; Challenger disaster; Cold War; Fetal medicine; Genetics research; Halley's comet;
Information age; Medicine; Microsoft; National Energy Program (NEP); Nuclear winter scenario; Ozone hole; Plastic surgery; Prozac; Reagan, Ronald; SETI Institute; Space exploration; Space shuttle program; Strategic Defense Initiative (SDI).
■ Science-fiction films Definition
Motion pictures that focus on the impact of actual or imagined science on society or individuals
Early science-fiction films were associated with unrealistic effects that often made them seem campy, especially to later audiences. By the 1980's, however, computer-assisted special effects made possible a new level of realism that fundamentally transformed the nature of science-fiction cinema. The films of the 1980's thrived on these new effects technologies, which allowed filmmakers to represent the impossible in a realist and compelling fashion. A new generation of computer-assisted special effects was pioneered in the late 1970's and showcased in such movies as Star Wars (1977), Superman: The Movie (1978), and Alien (1979). As a result, the 1980's began with audiences expecting a high level of sophistication from science-fiction films. Many such films were sequels to the groundbreaking work of the late 1970's, including two Star Wars sequels, The Empire Strikes Back (1980) and Return of the Jedi (1983), as well as Superman II (1980), Superman III (1983), Superman IV: The Quest for Peace (1987), and Aliens (1986). It was in the science-fiction genre that Hollywood's twin emerging preoccupations with sequels and effects-driven spectacle reached their height, greatly encouraged by the determination that each new movie's special effects should improve on the standard set by its predecessor. The success of these series served to demonstrate that the most apt models for cinematic science fiction were not literary texts but comic books, which similarly dispensed with both inner experience and explanations. Other notable contributions to the superheroic subgenre of cinematic science fiction included Flash Gordon (1980), RoboCop (1987), and Batman (1989). Animated movies made little progress, although the advent in the West of Japanese anime films, with Akira (1988), offered a pointer to the untapped potential of that supplementary medium.
Selected Science-Fiction Films of the 1980's (directors in parentheses)

1980: The Empire Strikes Back (Irvin Kershner); Superman II (Richard Lester); Flash Gordon (Mike Hodges); Altered States (Ken Russell); Battle Beyond the Stars (Jimmy T. Murakami)

1981: The Road Warrior (George Miller); Memoirs of a Survivor (David Gladwell); The Incredible Shrinking Woman (Joel Schumacher); Escape from New York (John Carpenter); Outland (Peter Hyams)

1982: Star Trek II: The Wrath of Khan (Nicholas Meyer); The Thing (John Carpenter); E.T.: The Extra-Terrestrial (Steven Spielberg); Blade Runner (Ridley Scott); Tron (Steven Lisberger)

1983: Return of the Jedi (Richard Marquand); Superman III (Richard Lester); Space Raiders (Howard R. Cohen)

1984: Star Trek III: The Search for Spock (Leonard Nimoy); The Last Starfighter (Nick Castle); 2010: The Year We Make Contact (Peter Hyams); The Terminator (James Cameron); Starman (John Carpenter); Dune (David Lynch); The Brother from Another Planet (John Sayles)

1985: Mad Max: Beyond Thunderdome (George Miller and George Ogilvie); Cocoon (Ron Howard); Back to the Future (Robert Zemeckis); Enemy Mine (Wolfgang Petersen); Explorers (Joe Dante); Trancers (Charles Band)

1986: Aliens (James Cameron); Star Trek IV: The Voyage Home (Leonard Nimoy); The Fly (David Cronenberg); Short Circuit (John Badham); Invaders from Mars (Tobe Hooper)

1987: RoboCop (Paul Verhoeven); Predator (John McTiernan); Batteries Not Included (Matthew Robbins); Innerspace (Joe Dante); Superman IV: The Quest for Peace (Sidney J. Furie)

1988: Akira (Katsuhiro Ōtomo); The Blob (Chuck Russell); They Live (John Carpenter); Cocoon: The Return (Daniel Petrie); Alien Nation (Graham Baker); Earth Girls Are Easy (Julien Temple)

1989: Batman (Tim Burton); Star Trek V: The Final Frontier (William Shatner); The Abyss (James Cameron); Honey, I Shrunk the Kids (Joe Johnston); Back to the Future, Part II (Robert Zemeckis); The Wizard of Speed and Time (Mike Jittlov)
Science-Fiction Franchises
The Star Wars and Star Trek movie franchises—the latter including Star Trek II: The Wrath of Khan (1982), Star Trek III: The Search for Spock (1984), Star Trek IV: The Voyage Home (1986), and Star Trek V: The Final Frontier (1989)—reinvigorated space opera as a cinematic subgenre. These series required large budgets, and budgetary constraints limited the ambitions of such second-rank contributions as The Last Starfighter (1984), while attempts at serious space fiction, such as 2010: The Year We Make Contact (1984), often failed to realize their aims. The new wave of monster movies produced in the wake of Alien was more consistently successful in its exploitation of new effects; notable retreads of earlier, low-budget films included The Thing (1982), The Fly (1986), and The Blob (1988), while significant new ventures in this vein included The Terminator (1984), Predator (1987), They Live (1988), and Tremors (1990). The Terminator was a low-budget film, but it was sufficiently successful to spawn an important series of higher-budget sequels, and it eventually became the archetype of a new wave of exaggerated action movies whose other exemplars included Mad Max 2: The Road Warrior (1981) and Mad Max: Beyond Thunderdome (1985). The implicit paranoia of monster movies expressed the Cold War fears of the decade. Another more hopeful type of alien-centered science fiction arose
during the 1980's, however, to resist the worldview of those films. These films offered more sympathetic accounts of nonhuman characters, while often complaining stridently about human tendencies toward intolerance and exploitation. They included Android (1981), Steven Spielberg's E.T.: The Extra-Terrestrial (1982)—the most successful film of the decade—Starman (1984), Cocoon (1985), Short Circuit (1986), Batteries Not Included (1987), and The Abyss (1989). The difficulties of adapting literary texts to the cinematic medium were amply demonstrated by the Paddy Chayefsky-based Altered States (1980), the Doris Lessing-based Memoirs of a Survivor (1981), the cinematic travesty of Frank Herbert's Dune (1984), and the inevitable remake in its title year of George Orwell's Nineteen Eighty-Four (1984). Although it bore little resemblance to its source text and was initially unsuccessful at the box office, Ridley Scott's Blade Runner (1982), based on Philip K. Dick's Do Androids Dream of Electric Sheep? (1968), would eventually be recognized as a major and significant work of cinematic science fiction. The stunning art direction of Blade Runner set new standards in the portrayal of fictional worlds, and the film's eventual success as a videocassette rental encouraged further interest in Dick's work. Dick continually questioned the stability of the experienced
world, and this concern of science fiction converged powerfully with film's ability to construct visceral illusions in such paranoid metaphysical fantasies as David Cronenberg's Scanners (1981) and Videodrome (1983). Inevitable delays in production, however, ensured that further Dick dramatizations were postponed until subsequent decades. The new special effects were also deployed in a new generation of futuristic satires, including The Incredible Shrinking Woman (1981), The Brother from Another Planet (1984), Repo Man (1984), and Honey, I Shrunk the Kids (1989). These relatively amiable examples were, however, outshone by the scathing Brazil (1985), whose ending was considered too harsh for a U.S. audience that showed a blatant preference for the exuberance of such comedies as Back to the Future
(1985), Innerspace (1987), Back to the Future, Part II (1989), and Bill and Ted's Excellent Adventure (1989). U.S. cinema did make some attempts to address actual trends in science and technology, particularly in computer-inspired movies such as Tron (1982) and WarGames (1983), but it demonstrated no conspicuous understanding of how those technologies actually functioned. By contrast, continued experimentation with the new special effects resulted in some remarkably sophisticated visual representations and strikingly iconic images. Although most of these were contained in big-budget movies such as Blade Runner, E.T., Brazil, and Batman, it remained possible for enterprising technicians to produce such unique ventures as Mike Jittlov's The Wizard of Speed and Time (1989).
Fans wait in line for the premiere of Return of the Jedi, the final film of the first Star Wars trilogy, in New York’s Times Square on May 25, 1983. (AP/Wide World Photos)
Impact It has been argued that, of all genres, science fiction is one of the most consistently allegorical of social and political concerns, and the science-fiction films of the 1950's were frequently interpreted as conscious or unconscious essays on Cold War paranoia. By the 1980's, these interpretations were well known, and science-fiction cinema, while it continued to engage in allegory, also began to comment upon it. Most of the decade's offerings were shaped by the desire of the studios to produce big-budget blockbusters that would draw extremely large audiences away from the competing technologies of television and videocassettes. This desire often meant that the most expensive films were the most simpleminded. However, even these films invented rich visual iconographies, making fantastic worlds believable on screen for the first time in decades. Although the decade was dominated by sequels, remakes, and imitations, it consolidated the innovations of the 1970's and paved the way for further sophistication of futuristic imagery.
Further Reading
Bukatman, Scott. Terminal Identity: The Virtual Subject in Postmodern Science Fiction. Durham, N.C.: Duke University Press, 1993. Study of the postmodern examination of identity common to much of the significant science fiction of the 1980's.
Hardy, Phil, ed. The Aurum Film Encyclopedia: Science Fiction. Rev. ed. London: Aurum Press, 1991. The chapter "The Eighties: Science Fiction Triumphant" offers a comprehensive chronological survey of titles, with elaborate and intelligent annotations.
Kuhn, Annette, ed. Alien Zone: Cultural Theory and Contemporary Science Fiction Cinema. London: Routledge, 1990. Collection of theoretical essays; its key exemplars include Blade Runner, The Thing, Videodrome, and Aliens.
Landon, Brooks. The Aesthetics of Ambivalence: Rethinking Science Fiction Film in the Age of Electronic (Re)Production. Westport, Conn.: Greenwood Press, 1992. Theoretical study that foregrounds the centrality of special effects as a driving force in the genre's evolution, with specific emphasis on the 1980's.
Rickett, Richard. Special Effects: The History and Technique. New York: Watson-Guptill, 2000. Topically organized study; many of its key examples are, inevitably, science-fiction movies, but the utility of the book is in filling in the background to the key innovations of the 1980's.
Sobchack, Vivian. Screening Space: The American Science Fiction Film. 2d ed. New York: Ungar, 1987. The updating chapter, "Postfuturism," describes the 1980's as a "Second Golden Age" of science-fiction cinema.
Brian Stableford
See also Action films; Aliens; Back to the Future; Blade Runner; Empire Strikes Back, The; E.T.: The Extra-Terrestrial; Film in the United States; Horror films; RoboCop; Sequels; Special effects; Spielberg, Steven; Terminator, The; Tron.
■ Scorsese, Martin Identification
American film director and film preservationist Born November 17, 1942; Queens, New York
Scorsese belonged to the generation of American auteurs who began making films during the 1970's, and he continued to hone his craft during the 1980's. He directed five very different major motion pictures during the decade, branding each one with his trademark style.
A sickly child from a tough Italian American neighborhood, Martin Scorsese spent much of his childhood going to church and to the movies, then spent a year in the seminary and completed two film degrees at New York University. These seminal experiences shaped Scorsese's fascination with guilt and redemption, with the enactment of masculinity, and with the aesthetic and emotional possibilities of film. In the 1970's, Scorsese made his name as a director with two powerful urban dramas, Mean Streets (1973) and Taxi Driver (1976). In the 1980's, he expanded his range of topics, if not his tone, for Scorsese protagonists continued to be men plagued with self-doubt. Thus, Scorsese's films rubbed against the grain of the success narratives typical of 1980's Hollywood.
Scorsese began the 1980's directing a film he thought would be his last. Ostensibly the biopic of a champion boxer, Raging Bull (1980) examines how self-loathing drives a man to violence against those he loves. Filmed in striking black and white, edited with great daring, and featuring a bravura performance by Robert De Niro, Raging Bull set a 1980's standard for serious American filmmaking.
Other commercially risky projects followed. The King of Comedy (1983), a dark comedy about celebrity culture, presents another obsessive personality: a crazed fan (again played by Robert De Niro) who kidnaps a television star, portrayed in a surprisingly restrained performance by Jerry Lewis. Next, Scorsese brought his usual paranoia to a common 1980's figure—the yuppie—when he dramatized a misplaced uptowner's terrifying night in lower Manhattan in After Hours (1985). Updating and expanding the 1961 Robert Rossen classic The Hustler, Scorsese's The Color of Money (1986) again centered on a male world of competition and exploitation. Paul Newman, who had played young pool shark "Fast Eddie" Felson in the original, reprised his role. Twenty-five years older, Felson in Scorsese's film acts as mentor to a cocky newcomer played by Tom Cruise. After numerous postponements, Scorsese worked in the late 1980's to complete his dream project, The Last Temptation of Christ (1988). The controversial film was, surprisingly, produced by Universal Pictures. Following the provocative novel by Nikos Kazantzakis, Scorsese and screenwriter Paul Schrader presented a human, sexual, confused Christ who is redeemed through suffering. The film sparked controversy, protests, and threats and was pulled from distribution by the studio. Scorsese ended the decade with "Life Lessons," a modest short about an artist and his obsessive loves that was released as part of New York Stories (1989), a three-part feature that also included shorts directed by Woody Allen and Francis Ford Coppola. Although Scorsese received two Oscar nominations and numerous directing awards in the 1980's, the Academy Award eluded him.
Impact Scorsese's ability to secure financial backing for innovative projects that resisted popular trends made him an influential bridge figure between mainstream and independent filmmaking in the 1980's. Moreover, along with Allen, Steven Spielberg, and George Lucas, Scorsese contributed to a significant reconception of the nature of film auteurism in general and American filmmaking in particular. In 1989, a poll of American and international film critics ranked Raging Bull the best film of the decade.
Further Reading
Nicholls, Mark. Scorsese's Men: Melancholia and the Mob. North Melbourne, Vic.: Pluto Press, 2004.
Stern, Lesley. The Scorsese Connection. Bloomington: Indiana University Press, 1995.
Thompson, David, and Ian Christie, eds. Scorsese on Scorsese. Boston: Faber & Faber, 1996.
Carolyn Anderson
See also
Academy Awards; Film in the United States; Last Temptation of Christ, The; Raging Bull; Spielberg, Steven.
■ Sequels Definition
Narrative work that represents the continuation of a story begun in an earlier work
During the 1980's, as blockbusters became increasingly important to Hollywood's financial model, movie sequels began to drive the film industry's profits. Because large-scale sequels tended to be easier to promote effectively as "event" films, they were among the most financially successful films of the decade, although they were often less successful with professional critics.
The 1980's opened with Superman II (1980), bringing Christopher Reeve back to the big screen for a sequel to the blockbuster Superman: The Movie (1978). The sequel proved popular enough to inspire two more, Superman III (1983) and Superman IV: The Quest for Peace (1987). The original movie had been a big-budget version of a comic book whose protagonist heretofore had been relegated to inexpensive movie serials, a television series, and the low-budget Superman and the Mole Men (1951). In 1989, Superman's comic-book cohort Batman received a similar big-budget movie treatment. That film's success not only led to Batman sequels but also legitimized big-budget adaptations of other comic-book characters, complete with their own sequels, a trend that would continue. Another 1980 sequel was The Empire Strikes Back, the second installment in George Lucas's trendsetting Star Wars trilogy. The final work of the trilogy, Return of the Jedi, followed in 1983. Three prequels would later be made as well. The Star Wars films were deliberately designed in the style of 1950's movie serials, in which a short chapter would be released each week
and each chapter would end with a cliffhanger to draw audiences back the following week. Star Wars did much the same thing, only over a period of years instead of weeks. Steven Spielberg launched another deliberately nostalgic series with Raiders of the Lost Ark (1981). That film, the first in the Indiana Jones series, borrowed from the vintage movie serials, but it and the other films in the series each featured a self-contained story arc. In fact, the series' second film, Indiana Jones and the Temple of Doom (1984), was a prequel rather than a sequel, taking place before the events portrayed in Raiders of the Lost Ark. The next film, Indiana Jones and the Last Crusade (1989), was a true sequel, taking place later than the first two. All of these movies showcased the latest in special effects. Such realistic effects in science-fiction and fantasy films were pioneered by Stanley Kubrick's 2001: A Space Odyssey (1968) and rejuvenated by Star Wars (1977). In 1984, a sequel to 2001 was released: 2010: The Year We Make Contact. Meanwhile, author Arthur C. Clarke, who had written the books on which both films were based, continued to produce several more literary sequels in that series. Many other movies that inspired sequels also featured heavy special-effects elements. James Cameron's The Terminator (1984), starring Arnold Schwarzenegger, would be followed by two sequels in subsequent decades. The comedic Ghostbusters (1984) was followed by Ghostbusters II (1989). The success of Star Wars inspired the creators of television's Star Trek series to create special-effects-driven movies, starting with Star Trek: The Motion Picture (1979) and continuing with four sequels during the 1980's. Other science-fiction sequels of the 1980's included follow-ups to Alien (1979), Cocoon (1985), and Back to the Future (1985). Indeed, so sure was Universal Pictures of the marketability of sequels to the latter film that Back to the Future, Part II (1989) and Back to the Future, Part III (1990) were filmed simultaneously for separate releases. The James Bond franchise had been a mainstay of the film sequel business since the 1960's, but the 1980's saw no fewer than five Bond films released. Sylvester Stallone, the star of the Rocky franchise, launched a new series featuring a troubled Vietnam veteran named John Rambo with First Blood (1982). Mad Max, the 1979 Australian movie that introduced Mel Gibson to American audiences as an ex-policeman in an apocalyptic future, produced two sequels in the 1980's:
Mad Max 2: The Road Warrior (1981) and Mad Max: Beyond Thunderdome (1985). Gibson would also co-star with Danny Glover in the buddy cop movie Lethal Weapon (1987), which also became a franchise. Die Hard (1988) launched a franchise starring Bruce Willis as a dogged New York detective with a knack for fighting terrorists. Clint Eastwood continued his Dirty Harry detective series with sequels into the 1980's. Beverly Hills Cop (1984), starring Eddie Murphy, was also successful enough to demand sequels. Police action sequels vied with the fantasy and science-fiction sequels that launched the trend. Some movies of the 1980's inspired only a single sequel, such as the outdoor adventure The Man from Snowy River (1982), as well as relatively low-budget comedies including Airplane! (1980), Arthur (1981), Look Who's Talking (1989), Porky's (1982), Revenge of the Nerds (1984), and Crocodile Dundee (1986). There was even a musical sequel, Grease 2 (1982). Sequels differ from earlier movie series, which ranged from B-Westerns to mysteries and comedies featuring the same actor or actors but with plots unrelated from one picture to the next. While some movies in series like the Andy Hardy and Thin Man films could be considered sequels in that the characters grow and change from film to film, others, such as the Charlie Chan and Bowery Boys series, as well as innumerable Westerns of earlier decades, simply told individual stories featuring the same characters. In this, they resembled television series of the 1950's through the 1970's, although in the 1980's some series began to feature significant character development over the course of the show.
Impact During the 1980's, Hollywood became defined by the drive to make a few blockbusters (large-budget films with extreme profits), rather than a greater number of cheaper, more modestly successful films. As a result, studios' interest in high-concept stories (that is, stories that could be easily understood and exhaustively summarized in one sentence) increased dramatically. High-concept films were the easiest to market to a mass audience, and there was no higher concept than "the sequel to Raiders of the Lost Ark," for example. At the same time, blockbusters often featured fantastic worlds portrayed by special effects. It was easier to mimic the overall look and feel of an earlier film and invest in more impressive effects than it was to create entirely new art, set,
and costume designs from scratch. This fact also drove the trend toward sequels. After the 1980’s, movie sequels became a mainstay of the entertainment industry. Further Reading
Budra, Paul, and Betty A. Schellenberg, eds. Part Two: Reflections on the Sequel. Toronto: University of Toronto Press, 1998. Compilation of essays on literary and cinematic sequels, including two essays on cinema of the 1980's and 1990's.
Nowlan, Robert A., and Gwendolyn Wright Nowlan. Cinema Sequels and Remakes, 1903-1987. Reprint. Jefferson, N.C.: McFarland, 2000. Scholarly overview of both sequels to and remakes of successful films.
Stanley, John. Creature Features: The Science Fiction, Fantasy, and Horror Movie Guide. New York: Berkley Trade, 2000. Thousands of capsule reviews of movies from this genre, including sequels.
Thomson, David. The Alien Quartet: A Bloomsbury Movie Guide. New York: Bloomsbury USA, 1999. Analyzes different aspects of the Alien films, including themes, directors, and the relation of each film to the others.
Paul Dellinger
See also
Academy Awards; Action films; Airplane!; Aliens; Back to the Future; Cruise, Tom; Empire Strikes Back, The; Epic films; Film in the United States; Ford, Harrison; Fox, Michael J.; Ghostbusters; Gibson, Mel; Horror films; Murphy, Eddie; Murray, Bill; Raiders of the Lost Ark; Rambo; RoboCop; Schwarzenegger, Arnold; Science-fiction films; Special effects; Spielberg, Steven; Terminator, The.
■ SETI Institute Identification
Institution established to search for signs of intelligent extraterrestrial life
Date Founded in 1984
The SETI Institute launched the most significant public project dedicated to searching for any evidence that intelligent life exists on other planets.
The stated mission of the SETI (Search for Extra-Terrestrial Intelligence) Institute is "to explore, understand and explain the nature and prevalence of life in the universe." The institute's most recognized
project is also named the Search for Extra-Terrestrial Intelligence. It is an attempt to detect any radio transmissions reaching Earth from elsewhere that might have been generated by alien civilizations. This project evolved from Frank Drake's original 1960 Project Ozma experiment, which used a radio telescope to examine the stars Tau Ceti and Epsilon Eridani near the 1.420 gigahertz marker frequency. Project Ozma represented humankind's first scientific attempt to detect extraterrestrial intelligence. The SETI Institute has evolved from its beginning as a systematic search for intelligent extraterrestrial radio sources into a multifaceted organization dedicated to gaining a better understanding of life in the universe. Inspired by its founder Frank Drake and fired by the charisma of astronomer Carl Sagan, the SETI Institute came to employ over one hundred scientists from a wide variety of disciplines at its Carl Sagan Center for the Study of Life in the Universe and the Center for the Search for Extra-Terrestrial Intelligence. Although searching for intelligent extraterrestrial life may be the SETI Institute's highest-profile project, other studies involve more fundamental inquiries into planetary formation and evolution. The institute's projects investigate how life began on Earth and how many other stars in the Milky Way galaxy may have planets that could support life. The SETI Institute's activities were also popularized by Carl Sagan's novel Contact (1985), as well as by its 1997 film adaptation.
Impact The SETI Institute continued to search for evidence of intelligent extraterrestrial life, employing new techniques and resources as they became available. Detection of radio signals from an extraterrestrial civilization is not an easy task. Radio astronomers first have to determine what types of star systems may have planets. Then they have to decide which radio frequency would be the most logical to listen to. Once these decisions have been made, they listen and wait. Computers evolved to make the job manageable, but even with advances in distributed computing, the task remained daunting.
Further Reading
Ekers, Ron, et al., eds. SETI 2020: A Roadmap for the Search for Extraterrestrial Intelligence. Mountain View, Calif.: SETI Institute, 2003.
Shostak, Seth, and Alex Barnett. Cosmic Company: The Search for Life in the Universe. Cambridge, England: Cambridge University Press, 2003.
Skurzynski, Gloria. Are We Alone? Scientists Search for Life in Space. Washington, D.C.: National Geographic, 2004.
Paul P. Sipiera
See also Astronomy; Cosmos; E.T.: The Extra-Terrestrial; Science and technology; Science-fiction films; Space exploration; Star Trek: The Next Generation.
■ sex, lies, and videotape Identification American film Director Steven Soderbergh (1963) Date Premiered at the Sundance Film Festival
January 20, 1989; general release August 18, 1989
Along with the films of Jim Jarmusch, Spike Lee, and Gus Van Sant, Soderbergh's sex, lies, and videotape heralded the birth of what came to be called American independent cinema. Soderbergh went on to enjoy a career bridging the mainstream/independent divide, directing both low-budget and studio films with great success.
The acclaim heaped upon sex, lies, and videotape, a low-budget film by an unknown director and without major stars, at the 1989 Sundance Film Festival is often credited with launching the independent film movement that would become more prominent in the 1990's. To raise the money to make his feature-film debut, Steven Soderbergh made Winston (1987), a twelve-minute short intended to be shown to potential investors. When it was completed, sex, lies, and videotape proved that American film could examine the mores of the time with subtle humor and understated insight and without the obviousness, didacticism, or sentimentality often seen in mainstream films.
Filmed in Soderbergh's hometown, Baton Rouge, Louisiana, sex, lies, and videotape depicts the unhappy marriage of the frigid Ann (Andie MacDowell) and the smarmy lawyer John (Peter Gallagher), who is having an affair with Ann's bartender sister, Cynthia (Laura San Giacomo). The arrival of John's former college friend, Graham (James Spader), sets in motion changes in the other characters' relationships. The primary catalyst for these changes is Graham's collection of videotapes, each recording an interview he has made with a woman about her sexual experiences.
John is typical of the 1980's achievers who care only for their work. Both he and Cynthia, a would-be artist, use each other without any emotional commitment. Ann feels a general sense of malaise, as captured in her sessions with her psychiatrist (Ron Vawter). Graham, who seems to have no direction in life, resorts to his taped interviews because of sexual impotence. In many senses, sex, lies, and videotape offers a younger perspective on the issues addressed in Lawrence Kasdan's The Big Chill (1983).
Soderbergh was clearly influenced by the films of Woody Allen and perhaps even more by foreign films, especially those from France, that were more
Director Steven Soderbergh holds the Palme d’Or he was awarded for sex, lies, and videotape at the 1989 Cannes Film Festival, as actress Jane Fonda looks on. (AP/Wide World Photos)
open than American films in dealing with sexual matters. The director, who was only twenty-six when the film premiered at the Sundance Film Festival, treated his characters—with the notable exception of John—with affection and compassion. He gave considerable latitude to his actors, who found unexpected humor in their characters’ aimlessness and self-absorption.
Impact The publicity surrounding sex, lies, and videotape gave the public more awareness of the Sundance Film Festival, then a relatively small venue, and of independent film in general. The film's success represented a breakthrough for Harvey and Bob Weinstein's production company, Miramax Films, which had previously distributed primarily foreign-language films. The film launched Soderbergh's career as a major film director. It also boosted the careers of its stars, all four of whom went on to significant careers in film, television, or theater. The film won the Sundance Audience Award, received the Palme d'Or as the best film at the Cannes Film Festival, and was nominated for an Academy Award for Best Original Screenplay.
Further Reading
Biskind, Peter. Down and Dirty Pictures: Miramax, Sundance, and the Rise of Independent Film. New York: Simon & Schuster, 2004.
Palmer, William J. The Films of the Eighties: A Social History. Carbondale: Southern Illinois University Press, 1993.
Smith, Lory. Party in a Box: The Story of the Sundance Film Festival. Salt Lake City: Gibbs-Smith, 1999.
Michael Adams
See also Big Chill, The; Camcorders; Do the Right Thing; Film in the United States; Generation X.
■ Sexual harassment Definition
Unwelcome sexual speech or behavior, engaged in by someone with institutional power over the recipient
During the 1980's, the U.S. Supreme Court and many lower courts ruled that sexual harassment was a form of sex discrimination. As a result, businesses and academic institutions became more aware of the issue and sought to educate students and employees about it.
Sexual harassment was not a new phenomenon in the 1980's. However, the behavior did not have a legal name until the late twentieth century. Additionally, the public became more familiar with the problem, as the courts established legal definitions and employers became sensitive to their responsibility to prevent the conduct.
Sex Discrimination The Civil Rights Act of 1964 made it illegal for employers and educational institutions to discriminate against a person with respect to "terms, conditions, or privileges of employment because of such individual's race, color, religion, sex, or national origin." In 1980, the Equal Employment Opportunity Commission (EEOC) issued guidelines stating that sexual harassment was a form of sex discrimination. The rules defined the prohibited activity to include unwanted sexual advances, requests for sexual favors, and verbal or physical conduct of a sexual nature. They stated that sexual harassment included giving or removing an economic quid pro quo. Quid pro quo harassment is the easiest type to identify. It generally involves the attempt by an employer or a supervisor to exchange rewards such as raises or promotions in return for sexual favors. Likewise, it may involve punishing an employee who refuses sexual contact by negative changes to the terms of his or her employment. The second type of sexual harassment consists of creating a hostile work environment. In this situation, there may be no negotiation for sexual favors, but rather the general work atmosphere is infiltrated with sexual content or references, affecting employees' ability to do their jobs.
Meritor Savings Bank v. Vinson In 1986, the Supreme Court heard a case brought by Mechelle Vinson against her employer, Meritor Savings Bank. Vinson claimed that over a period of five years she was continually subjected to fondling, demands for sexual intercourse, and even rape by her boss, Sidney Taylor. Vinson could not report the harassment to her supervisor, as he was the assailant. She also testified that she feared the loss of her job if she told other bank officials. Taylor and the bank management denied any wrongdoing. Taylor claimed the sexual contact was consensual. The bank asserted that Vinson had not suffered any economic disadvantage.
The Supreme Court held that the Civil Rights Act was intended to “strike at the whole spectrum of disparate treatment of men and women.” It agreed with the EEOC’s designation of two types of harassment, comparing a hostile work environment based on sex to a hostile environment for racial minorities. “Surely a requirement that a man or woman run a gauntlet of sexual abuse in return for the privilege of being allowed to work and make a living can be as demeaning and disconcerting as the harshest of racial epithets.” The Court defined a hostile environment as one in which the harassment was severe or pervasive enough to alter the conditions of employment. It left the issue of employer liability undefined, although it did indicate that at a minimum, employers should provide guidelines and grievance procedures to address incidents of sexual harassment. Widespread Harassment
Feminist scholars and legal experts argued that sexual harassment was incorporated into a social structure of unequal power between men and women. They contended that in a patriarchal society, the behavior had more to do with men asserting control over women than with sexual attraction. In that sense, harassment could be seen as part of a continuum of power and control that involved other crimes against women. Just as domestic violence and rape had often been ignored or distorted by the legal system, so sexual harassment was, until the 1980’s, treated as private conduct, “flirting,” or “romance on the job.” Some commentators noted that sexual harassment helped keep women subordinated in the workplace and that inappropriate sexual comments were designed to remind women of their inferior status. Numerous surveys showed that sexual harassment was prevalent in many women’s experience. Studies of women employed by the federal government conducted in 1981 and 1987 reported that 42 percent of women had experienced harassment on the job during the previous two years. Likewise, reports showed that 30 to 40 percent of women working in private business had been harassed. The most dramatic results came from the military, where 64 percent had been touched, pressured for sexual favors, or—in 5 percent of the cases—raped. Women in the military who reported harassment stated that reporting made their lives worse in every way. Sexual harassment was an issue in educational institutions, as well as in employment. Studies of
college campuses conducted by the National Association for Women Deans, Administrators, and Counselors in the 1980's found that 7 percent of women students said faculty members had made unwanted advances; 14 percent had been asked on dates by professors; 34 percent experienced leering; and 65 percent had been the target of sexual comments. Although a considerable number of them avoided certain classes because of faculty members' reputations, none of the women had reported the offensive conduct to the administration.
Impact Despite the legal protection against sexual harassment, women only infrequently told their supervisors or authorities about the behavior. Many claimed they feared they would not be taken seriously. Mandated corporate training in regard to sexual harassment became widespread, but it was not always taken seriously. Some comedians even lampooned the perceived oversensitivity that required such training to be instituted.
Subsequent Events The situation improved slowly and unsteadily in the next decade, particularly after high-profile harassment scandals in 1991 brought further impetus to the national conversation. In that year, Professor Anita Hill testified before a committee of the United States Senate that Clarence Thomas, a nominee for the Supreme Court, had repeatedly sexually harassed her when they worked together a decade earlier. A scandal also resulted from a 1991 convention of the Tailhook Association, an organization composed of naval aviators. Eighty-three women and seven men reported being sexually harassed or assaulted during the convention.
Atwell, Mary Welek. Equal Protection of the Law? Gender and Justice in the United States. New York: Peter Lang, 2002. Examines how gendered perspectives have been incorporated into the American legal system.
Forrell, Caroline A., and Donna M. Matthews. A Law of Her Own: The Reasonable Woman as a Measure of Man. New York: New York University Press, 2000. Study suggesting reforms that would take better account of women's experience in defining sexual harassment and other such legal terms.
MacKinnon, Catharine. "Toward Feminist Jurisprudence." In Feminist Jurisprudence, edited by Patricia Smith. New York: Oxford University Press, 1993. MacKinnon is largely responsible for defining and conceptualizing sexual harassment.
Rundblad, Georganne. "Gender, Power, and Sexual Harassment." In Gender Mosaics: Social Perspectives, edited by Dana Vannoy. Los Angeles: Roxbury, 2001. Analysis of sexual harassment emphasizing the extent to which it expresses social power struggles rather than (or alongside) sexual desire.
Mary Welek Atwell
See also
Feminism; Meritor Savings Bank v. Vinson; Supreme Court decisions; Women in the workforce; Women’s rights.
■ Shamrock Summit The Event
A meeting between U.S. president Ronald Reagan and Canadian prime minister Brian Mulroney
Date March 17, 1985
Place Quebec City, Canada
The amicable meeting between President Reagan and Prime Minister Mulroney underscored the growing closeness between their two North American countries.
During the ministry of Prime Minister Pierre Trudeau, which ended in 1984, Canada had often been at loggerheads with its neighbor and closest ally, the United States. When Brian Mulroney was voted into office on September 4, 1984, this situation changed. Mulroney firmly aligned Canada with the United States in geopolitical terms, abandoning Trudeau's fitful attempt to stake out a neutralist position between the Americans and the Soviet Union. Mulroney also redirected Trudeau's interest in developing countries to reaffirm economic and political ties with Canada's large, industrialized trading partners. Though Mulroney did not have a total ideological affinity with the conservative Ronald Reagan—he was much further to the left on issues concerning the welfare state and the environment, for instance—the meeting between the two leaders scheduled for March 17, 1985, in Quebec City was anticipated to be a positive one, and it exceeded expectations in this regard. The two leaders not only found common ground on policy positions but also formed a close personal bond.
The fact that March 17 was Saint Patrick's Day, dedicated to the patron saint of Ireland (whose symbol was the shamrock), underscored the two men's ethnic origins in the Irish diaspora. When Reagan and Mulroney joined in a duet of the song "When Irish Eyes Are Smiling," it was not only a moment of joviality but also an expression of shared identity, a solidarity of both ethnicity and moral temperament that welded the leaders in an affirmation of mutual beliefs. The meeting was also important for Mulroney's international image, as it was the first time most casual observers of world politics had heard of him. The Shamrock Summit, however, was not popular among many Canadian media commentators and those in the general population who held a strongly Canadian nationalist ideology. They believed that Mulroney had capitulated to the colossus to their south and had relinquished Canada's idealistic and peace-seeking approach to the international situation, as well as the nation's economic independence and the quality of life of its populace.
Impact Many observers, remembering that previous Canadian prime ministers such as Sir Wilfrid Laurier had involuntarily left office because they were perceived to be too pro-American, waited for the meeting to damage Mulroney's political viability. This, however, did not occur. Partly buoyed by Canada's economic boom in the 1980's, Mulroney remained popular enough to lead his party to victory in the next election, retaining the prime ministry in the process. The meeting also helped Reagan at a time when he was beginning to encounter some unusual political difficulties in the wake of his overwhelming reelection the year before, such as the controversy over his visit to the Nazi graves at Bitburg, Germany, two months later.
Martin, Lawrence. Pledge of Allegiance: The Americanization of Canada in the Mulroney Years. Toronto: McClelland and Stewart, 1993.
Simpson, Jeffrey. The Anxious Years: Politics in the Age of Mulroney and Chrétien. Toronto: Lester, 1996.
Nicholas Birns
See also
Canada and the United States; Canada-United States Free Trade Agreement; Mulroney, Brian; Reagan, Ronald.
■ Shepard, Sam Identification
American playwright, actor, director, and screenwriter
Born November 5, 1943; Fort Sheridan, Illinois
Considered one of the most influential playwrights of his generation, Shepard helped shape contemporary American theater. In addition to penning numerous plays, the award-winning dramatist became well known for his work as an actor, director, and screenwriter throughout the 1980's.
Sam Shepard began his career as an actor and playwright in the Off-Off-Broadway theaters of New York in the early 1960's. His work gained critical acclaim throughout the 1970's, earning him numerous awards, including a Pulitzer Prize for Buried Child (pr. 1978, pb. 1979) in 1979. Shepard continued to establish himself as one of the great American dramatists during the 1980's with the publication of True West (pr. 1980, pb. 1981), Fool for Love (pr., pb. 1985), and A Lie of the Mind (pr. 1985, pb. 1986). A theme central to Shepard's work is loss; his characters often suffer from feelings of alienation and search for connection and identity in an unstable world.
His plays also deal with the notion of masculinity and examine the social role of the American male. Shepard's works are also deeply rooted in the myth of the American West and the old frontier, a landscape that is at once hopeful and destructive, but is above all distinctly American.
During the 1980's, Shepard's career flourished, and he became well known as an actor, director, and screenwriter. Throughout the decade, Shepard performed in multiple films, and it was on the set of Frances (1982) that Shepard met his longtime partner Jessica Lange, with whom he would have two children. Shepard's rise to fame in film was bolstered by an Academy Award nomination for his performance in The Right Stuff in 1983. From then on, Shepard was known not only as a great American playwright but also as a prominent presence in American film. Shepard's popularity continued to increase, and in 1984, the Palme d'Or at the Cannes Film Festival was awarded to Paris, Texas. Shepard had written the film's screenplay, adapting his own book, Motel Chronicles (1982). He participated in many other films of the decade, including Resurrection (1980), Fool for Love (1985), Crimes of the Heart (1986), Baby Boom (1987), Far North (1988), and Steel Magnolias (1989).
Impact In the span of his long career, Shepard experienced the height of his popularity during the 1980's and was inducted into the American Academy of Arts and Letters in 1986. His works having achieved national acclaim, he secured a permanent place in American theater and became one of the most produced playwrights in America. His influence on stage and screen is evident in the numerous works of scholarship devoted to him, as well as in the popularity of his plays with modern audiences.
Roudane, Matthew, ed. The Cambridge Companion to Sam Shepard. New York: Cambridge University Press, 2002.
Tucker, Martin. Sam Shepard. New York: Continuum, 1992.
Wade, Leslie A. Sam Shepard and the American Theatre. Westport, Conn.: Greenwood Press, 1997.
Danielle A. DeFoe
See also
Academy Awards; Film in the United States; Mamet, David; Theater.
■ Shields, Brooke Identification American model and actor Born May 31, 1965; New York, New York
Shields was a 1980’s personality icon who gained fame as a result of her modeling and acting career. Because of her numerous magazine cover shots, she is considered to be one of the most photographed supermodels of the twentieth century.
During the early 1980's, it seemed that Brooke Shields was everywhere: She was on the cover of fashion magazines, the subject of newspaper and magazine articles, in advertisements, on talk shows, a costar of several films, and the subject of several books. Shields began her career as a child model, progressing to film roles in the late 1970's. Her controversial appearance as a child prostitute in the 1976 film Pretty Baby generated much media attention about her career, her mother, and her looks. She starred in several well-known teen films, including The Blue Lagoon (1980), costarring Christopher Atkins, and Endless Love (1981). She also appeared in fourteen controversial television commercials for Calvin Klein jeans that were censored by several networks. The jean commercials generated much notoriety for Shields with their sensual photography and her statement in one of the ads, "You know what comes between me and my Calvins? Nothing." Shields appeared in numerous Bob Hope television specials and several of the Circus of the Stars television shows. In 1982, a Brooke Shields doll was mass-produced in her likeness, as were several outfits for it. Because much of Shields's work seemed to exploit her beauty, critics were quick to accuse her mother and manager, Teri Shields, of pushing Brooke to pursue a modeling and acting career that was not in her best interests when she was too young to make her own decisions.
Shields was often noted as being likable and down-to-earth despite her child star status. She attended Princeton University from 1983 to 1987, graduating with honors and a degree in French literature. During her college years, she continued to act in films such as The Muppets Take Manhattan (1984) and Sahara (1984). After completing college, she appeared in several made-for-television movies. With few exceptions, many of her film and modeling appearances seemed to exploit her sensuality, yet she maintained a wholesomeness that endeared her to many. She received People's Choice Awards for Favorite Young Performer four years in a row, from 1981 to 1984.
Brooke Shields poses in a public service image in the mid-1980's. (Hulton Archive/Getty Images)
Impact Brooke Shields's career as a child star served as an example to others. Despite her fame and public attention, she and her mother both worked hard to ensure that her childhood and education would be as normal and mainstream as possible. Her fame and noted average lifestyle when not working, coupled with her above-average intelligence, beauty, and sensibility, made her an attractive role model to many.
Further Reading
Bonderoff, Jason. Brooke. New York: Zebra Books, 1981.
Italia, Bob. Brooke Shields. Minneapolis: Abdo & Daughters, 1992.
Simpson, Maria. "Does She or Doesn't She? Revisited: The Calvin Klein Jeans Ad." Etc.: A Review of General Semantics 38 (Winter, 1981): 390-400.
Susan E. Thomas
See also
Advertising; Film in the United States; Teen films.
■ Shultz, George P. Identification
U.S. secretary of state from 1982 to 1989
Born December 13, 1920; New York, New York
Although Shultz was a staunch anticommunist, he opposed the Reagan administration's covert operations that led to the Iran-Contra affair.
On July 16, 1982, President Ronald Reagan appointed George P. Shultz to replace Alexander Haig as U.S. secretary of state. Shultz had a distinguished academic career as a professor and dean of the well-known University of Chicago Graduate School of Business, which included on its staff many illustrious conservative economists. He also had an impressive background in government, having served on the Council of Economic Advisers of President Dwight D. Eisenhower (1953-1961) and as secretary of labor (1969-1970), Office of Management and Budget director (1970-1972), and secretary of the Treasury (1972-1974) under President Richard M. Nixon. This distinguished record served Reagan's need to find someone with impeccable credentials to replace Haig, but there was nonetheless some feeling that Shultz was more dovish than the rest of the Reagan cabinet. He had a long record as an anticommunist, helped develop the Reagan Doctrine, supported the Grenada invasion, opposed negotiations with Daniel Ortega's Sandinista government, and even advocated invading Nicaragua to get rid of Ortega. Still, he sparred with Caspar Weinberger, the secretary of defense, over the prodigious growth of defense spending, which he saw as reckless. He was also at odds with John Poindexter, the national security adviser, and Oliver North, one of Poindexter's staffers, over various proposed arms deals in the Middle East, especially with Iran. These deals involved the covert sale of arms to Iran, which just a few years earlier had held American hostages for more than a year, causing a national and international crisis. North and Poindexter used the proceeds from these sales illegally to fund the Contras, a rebel army in Nicaragua to which Congress had banned military aid. When these events became public, Shultz was untouched by the ensuing Iran-Contra scandal, because it was clear that he was opposed to selling weapons to Iran. During the first intifada in Palestine, Shultz attempted without success to convene an international conference to achieve a cease-fire and the creation of an autonomous region in the West Bank and the Gaza Strip.
Secretary of State George P. Shultz, right, reads a statement to the press regarding U.S. policy toward Central America in November, 1987. The statement represents a compromise with House Speaker Jim Wright, center. (AP/Wide World Photos)
Impact In addition to pursuing policies that contributed to the end of the Soviet Union, Shultz was significant in his resistance to the illegal activities of some of his colleagues in the Reagan administration. He left office on January 20, 1989, but continued to be a strategist for the Republican Party.
Further Reading
Madison, Christopher. "Shultz Shows How to Survive and Even Prosper in His High-Risk Post at State." National Journal 18, no. 7 (February 15, 1986).
Shultz, George. Turmoil and Triumph: My Years as Secretary of State. New York: Charles Scribner's Sons, 1993.
Richard L. Wilson
See also
Cold War; Foreign policy of the United States; Grenada invasion; Haig, Alexander; Iran-Contra affair; Israel and the United States; Middle East and North America; North, Oliver; Poindexter, John; Reagan, Ronald; Reagan Doctrine; Soviet Union and North America; Weinberger, Caspar.
■ Simmons, Richard Identification
American fitness expert, motivational speaker, talk show host, and author Born July 12, 1948; New Orleans, Louisiana Simmons’s personal approach toward helping overweight people changed the way people viewed diet and exercise in the 1980’s. He promoted not just a weight-loss program but also a positive journey to a healthier lifestyle. After personally experiencing several failed attempts to lose weight and keep it off, Richard Simmons opened Slimmons, an exercise club for people who were battling weight problems but who felt too embarrassed to join a gym. Simmons was concerned about the many dangerous ways in which people were trying to lose weight. After considerable research and consultations with doctors and nutritionists, he developed a diet plan called Deal-A-Meal. On his Emmy Award-winning talk show, The Richard Simmons Show, Simmons inspired others by sharing his own story with his audience and by presenting viewers who had lost weight using his healthy living, diet, and aerobics programs. Interaction with his viewers extended to making personal phone calls or visiting people who wrote to him about weight problems. He also worked to oversee the progress of those struggling with morbid obesity. As an additional aid in preparing nutritional meals, Simmons wrote The Deal-A-Meal Cookbook (1987) and Richard Simmons’ Never-Say-Diet Book (1980).
Since Simmons felt that exercise was a major part of losing weight, he developed the Sweatin' to the Oldies series of aerobic exercise videos featuring people in various stages of weight loss exercising with him to upbeat music. His trademark outfit of shorts and a tank top was first worn in these exercise videos. While serving as the chair for the Spina Bifida Association, Simmons saw the need for an exercise program for the physically challenged and developed the Reach for Fitness program. The exercises in this program were adapted so people with various types of physical and medical challenges could participate. Simmons further expanded his exercise videos to include seniors and featured his mother, Shirley, along with other celebrities' parents, in Richard Simmons and the Silver Foxes.
Impact Simmons's devotion to helping others by providing inspiration, education, and motivation encouraged people in all walks of life to change their eating habits, to follow nutritional guidelines, and to maintain a realistic exercise schedule. In addition, his followers developed a deeper awareness of health issues and the dangers of being overweight.
Simmons, Richard. Still Hungry After All These Years. New York: G. T., 1999.
Stearns, Peter N. Fat History: Bodies and Beauty in the Modern West. New York: New York University Press, 1997.
Elizabeth B. Graham
See also
Aerobics; Diets; Food trends; Home video rentals.
■ Sioux City plane crash The Event
A DC-10 jet endures a severe crash, but 185 people on board miraculously survive Date July 19, 1989 Place Sioux City, Iowa The Sioux City plane crash was one of the most famous air disasters of the 1980’s, because it was a miracle that anyone survived, because it was captured on film, and because it caught the popular imagination. It was later the subject of fictional motion pictures and television documentaries. One of the most amazing cases of survival in a commercial plane crash began with United Airlines
Flight 232 from Denver to Philadelphia with a stop in Chicago. The DC-10, a wide-bodied jet with three engines—one on each wing and one in the tail—took off a little after 2:00 p.m. central daylight time. The three-member flight crew consisted of Captain Alfred Haynes; First Officer William Records, the copilot; and Second Officer Dudley Dvorak, the flight engineer. About an hour after takeoff, a fan disk in the center engine broke in two, shattered the engine, blew through the engine case, and tore holes in the tail section of the plane. It punctured all three hydraulic systems, allowing the fluid to run out. None of this damage was known in detail at the moment of the incident, but the flight crew felt a sharp jolt running through the plane. The copilot, who was at the controls, noticed that the airliner was off course and tried to correct the course with the controls, but he found the plane unresponsive. The crew also discovered that the autopilot was off, that the engine in the tail indicated a malfunction, and that the three hydraulic systems had lost all pressure. Although the three systems were designed to back one another up, their collective failure meant that none of the usual wing or tail controls on the aircraft would operate, creating a severe emergency situation. The fan disk had broken and locked the controls while the plane was in a slight right turn. The hydraulic loss meant the plane was in a circular pattern while descending about fifteen hundred feet with each cycle. A DC-10 flight instructor, Dennis E. Fitch, was a passenger on board and offered to help. Eventually, he managed to stem the downward cycle a bit by running the two remaining engines at different speeds to steer the plane and gain or lose altitude. He also managed to lower the landing gear, but he was not able to restore the critical hydraulics. The crew notified air traffic controllers, who indicated that the closest sizable airport was at Sioux City, Iowa. An emergency landing was organized there in the roughly thirty minutes available between the airborne incident and the crash landing. The crew dumped the plane's excess fuel, and everyone took care to avoid allowing the plane to pass over populated areas. This was very difficult, as the plane could make only right-hand turns. On final approach, the crew realized they could not attempt landing on the airfield's longer runway and therefore notified the tower that they would try to reach a shorter runway on which the scrambled fire trucks had parked.
Fortunately, the trucks were able to vacate the runway in time, but this situation delayed their response to the fire that resulted from the plane's impact. A DC-10 would normally land at a speed of about 140 knots while descending at 300 feet per minute. The best the crew could do was to land their crippled plane at 240 knots while descending at 1,850 feet per minute. This resulted in a crash, but not as severe a crash as one might have expected for such a seriously crippled plane. A strong gust of wind blew the plane to the right of the runway and caused the right wing to hit the ground first, causing fuel to leak and ignite. The tail broke off first, and the rest of the plane bounced repeatedly, eventually rolling over on its back and sliding sideways into a cornfield next to the runway. Of the 296 people aboard, 185 survived. Most of the 111 deaths resulted from the impact, but some were the result of smoke inhalation as the fire engulfed the section above the wings. Most survivors had been sitting ahead of the wings, and some were lucky enough to walk out of the crashed plane and into the cornfield unharmed. A number of factors allowed for a better chance of survival than might have been expected from such a seriously damaged plane. The inherent crashworthiness of newer wide-bodied air transports played a part in the relatively high survival rate, as did the shallow angle of descent as a result of the crew's heroic efforts to land the plane as safely as possible. The incident occurred in daylight in good weather on the one day of the month when the Iowa Air National Guard was on duty at Sioux Gateway Airport. It also occurred at a time of day when extra personnel were available at both a regional trauma center and a regional burn center. The subsequent investigation revealed that the fan disk broke because of fundamental weakness in its design. The weakness was corrected as a result. Additional investigation indicated that the weakness was missed during maintenance checks. The critical hydraulic failure was remedied by installing special fuses to prevent fluid loss in all hydraulic systems.
Impact The Sioux City plane crash resulted in several engineering improvements for the DC-10 but had an even greater impact on procedures for training flight crews for emergencies. The Sioux City,
Iowa, emergency preparedness procedures were studied after the crash, so emergency responders could improve their readiness for crises. Further Reading
Faith, Nicholas. Black Box: The Air-Crash Detectives: Why Air Safety Is No Accident. Osceola, Wis.: Motorbooks International, 1997.
"How Swift Starting Action Teams Get off the Ground: What United Flight 232 and Airline Flight Crews Can Tell Us About Team Communication." Management Communication Quarterly 19, no. 2 (November, 2005).
Schemmel, Jerry. Chosen to Live: The Inspiring Story of Flight 232 Survivor Jerry Schemmel. Littleton, Colo.: Victory, 1996.
Trombello, Joseph. Miracle in the Cornfield. Appleton, Wis.: PrintSource Plus, 1999.
Richard L. Wilson
See also
Air India Flight 182 bombing; Cerritos plane crash; Pan Am Flight 103 bombing.
■ Sitcoms Definition
Comic television series in which regular characters and situations recur from episode to episode
Sitcoms in the 1980’s focused on humor, friendship, and relationships between friends or family members. They largely discarded the broad physical comedy popular in earlier years. Early situation comedies, or sitcoms, had filled “family viewing time” between 8:00 p.m. and 9:00 p.m. with lightweight, escapist stories and likable, somewhat predictable characters. When the more irreverent shows of cable television were introduced in the early 1980’s, network television programming changed in order to compete for an increasingly sophisticated audience. Fresh and different approaches were needed. Innovative prime-time soap operas, such as Dallas, and so-called reality shows appeared, convincing the networks to incorporate serialization and greater realism into their new sitcoms. With continuing story lines, more true-to-life plots, poignancy, and even occasional sad endings, the new shows remained funny but incorporated much less slapstick humor.
The Domestic Sitcom Almost since the advent of television, the networks had featured family shows such as The Adventures of Ozzie and Harriet and Father Knows Best. The 1980's family sitcom often retained the depiction of a nuclear family with both parents in the home. There were variations, however, such as One Day at a Time (1975-1984), Who's the Boss? (1984-1992), and Diff'rent Strokes (1978-1986), which portrayed divorced parents or orphaned children. Even these shows, though, used surrogate parental figures, such as a housekeeper or handyman, to maintain the two-parent model. The 1980's family sitcom retained another characteristic of earlier shows: The families always lived in a nice house or apartment with all the necessities for a comfortable lifestyle. There were several major differences between 1980's and traditional 1950's and 1960's sitcoms, however. Family breadwinners, whether male or female, often worked at unglamorous jobs for little pay. Although Good Times, which aired in the 1970's, depicted an indigent African American family, Roseanne (1988-1997) was unique in featuring a Caucasian working-class family in which both parents worked at a series of low-paying jobs and were often in dire financial straits. The show's humor often came from the parents' ability to laugh at their circumstances. The Tortellis (1987) featured another working-class family composed of a television repairman, his second wife, and his children from his first marriage. It was a spin-off from another successful show, Cheers (1982-1993), but it did not capture enough viewers' imagination. One Day at a Time portrayed a divorced mother with two teenage daughters, and its plots centered on single parenting and teenagers. Family Ties (1982-1989) was probably closest to the American notion of the "ideal" family with its middle-class parents and teenage kids with typical, and sometimes atypical, teenage concerns. In Diff'rent Strokes, a wealthy white man adopted the two young sons of his recently deceased African American housekeeper and took them home to live in his opulent Manhattan townhouse. The show became quite popular in spite of its improbable theme, and it produced a couple of spin-offs, one the highly successful The Facts of Life (1979-1988). The Cosby Show (1984-1992) was one of the most successful and influential sitcoms of the decade and perhaps of all time. Portraying a middle-class African American family with two professional parents and five generally well-behaved, intelligent, and respectful children
(ranging in grade level from college to kindergarten), it was a landmark show. Nothing like it had been seen on television before, and audiences loved it. Much of the series' appeal was probably due to the comedic acting of Bill Cosby, who played the father, but audiences could also relate to the parent-child struggles at the center of the show and appreciate the way the two parents stayed in control while clearly loving and cherishing their children. Married . . . with Children (1987-1997) rebelled against most other family sitcoms, putting forth a completely different view of American family life. A long-running show, it debuted on the new FOX network after being turned down by the other networks for being too different. It was: The father was crass, the mother was coarse, the pretty teenage daughter was a bimbo, and the teenage son, possibly the most intelligent of the bunch, seemed amoral. This depiction of a dysfunctional family was a hit.
The Singles Scene
The Singles Scene Another sitcom trend featured the affairs of unmarried people who were either too young to settle down, as in Happy Days (1974-1984) or The Facts of Life, or too caught up in their careers, as in Murphy Brown (1988-1998), M*A*S*H (1972-1983), and Bosom Buddies (1980-1984). While romance occurred in such shows, it was not usually a major concern. Cheers typified this genre with its ensemble cast of partnerless barflies hanging out at the bar "where everybody knows your name" and plotlines revolving around the camaraderie and activities of the bar's staff and regular customers. Designing Women (1986-1993) and The Golden Girls (1985-1992) followed the lives of single, mature women living or working together with infrequent interactions with men. Romance was occasionally part of the plot, but emphasis was placed more on the way the women enjoyed one another's company and reveled in their ability to make their lives satisfying and fulfilling.
Popular 1980’s Sitcoms Program
Airdates
Network
M*A*S*H
1972-1983
CBS
The Jeffersons
1975-1985
CBS
Three’s Company
1977-1984
ABC
The Facts of Life
1979-1988
NBC
Gimme a Break!
1981-1987
NBC
Family Ties
1982-1989
NBC
Newhart
1982-1990
CBS
Cheers
1982-1993
NBC
Mama’s Family
1983-1990
NBC
Punky Brewster
1984-1986
NBC
Kate and Allie
1984-1989
CBS
The Cosby Show
1984-1992
NBC
Who’s the Boss?
1984-1992
ABC
Night Court
1984-1992
NBC
227
1985-1990
NBC
The Golden Girls
1985-1992
NBC
Growing Pains
1985-1992
ABC
ALF
1986-1990
NBC
Designing Women
1986-1993
CBS
A Different World
1987-1993
NBC
Married . . . with Children
1987-1997
FOX
The Wonder Years
1988-1993
ABC
Roseanne
1988-1997
ABC
Murphy Brown
1988-1998
CBS
Seinfeld
1989-1998
NBC
Impact The 1980's sitcoms changed the way Americans viewed life in the United States by addressing more realistic concerns, which were delineated with humor and a frankness in plot, dialogue, and character portrayal far removed from the sugarcoated sitcoms of earlier years. With cable television in 50 percent of American homes by 1987 providing largely unregulated programming to an increasingly sophisticated audience, the network sitcoms in self-defense became less puritanical, conservative, and traditional. This situation generated increased controversy over the effect of sitcoms, and Hollywood in general, on the nation's values and moral character.
Further Reading
Brooks, Tim, and Earle Marsh. The Complete Directory to Prime Time Network and Cable TV Shows, 1946-Present. 7th ed. New York: Ballantine Books, 1999. Provides details about every prime-time television program shown from 1946 to 1998, with cast lists, airdates, and occasional critical insights into why a show succeeded or failed.
Reddicliffe, Steven, ed. "TV Guide": Fifty Years of Television. New York: Crown, 2002. Lavishly illustrated history of significant television programs. Includes photographs of leading actors and performers from the 1950's to the start of the twenty-first century and celebrity commentaries on certain televised events.
Staiger, Janet. Blockbuster TV: Must-See Sitcoms in the Network Era. New York: New York University Press, 2000. Analyzes and discusses why some sitcoms achieve much greater success than others, with particular focus on Laverne and Shirley and The Cosby Show.
Jane L. Ball
See also
African Americans; Cable television; Cheers; Cosby Show, The; Designing Women; Facts of Life, The; Family Ties; FOX network; Golden Girls, The; Married . . . with Children; M*A*S*H series finale; Television.
■ Skinheads and neo-Nazis
Definition Members of white supremacist movements
The skinheads developed into one of the largest and most violent white separatist movements in the United States in the 1980's, a decade characterized by an increase in hate groups nationwide.

The skinhead phenomenon had its origins in England in the early 1970's. The movement generally attracted white, urban, working-class youth between the ages of thirteen and twenty-five. These individuals were concerned about the economic and social obstacles they were encountering in Great Britain because of their limited education and competition from immigrants. Skinheads could be identified by their shaved or closely cropped hair, their tattoos, and their combat boots. Some skinheads were involved in racial attacks against Pakistani immigrants and homosexuals. Over time, the skinhead phenomenon spread from England to continental Europe, where it also attracted working-class youth. By the early 1980's, skinheads began to appear in the United States. While the American skinheads' appearance was similar to that of their European counterparts, their socioeconomic background was more diverse, with the movement comprising alienated middle-class and working-class youth. Many came from broken homes, and becoming a skinhead gave these youths a new identity and sense of belonging. The American skinhead movement was also more diverse in ideology. Some skinhead groups followed a white supremacist ideology, while others were nonracist. In fact, there were also African American skinheads. The racially oriented skinheads adopted an eclectic pattern of racial beliefs. Some followed orthodox Nazi ideology, while others adhered to a mixture of racial beliefs including populism, ethnocentrism, and ultranationalist chauvinism. The racial skinheads had a special war cry, "RAHOWA," which stood for "racial holy war." These skinheads targeted minority groups, including African Americans, Asians, and Hispanics. They also attacked homosexuals and homeless people. "White power" music was one of the major recruiting tools of the skinhead movement. The first white power band, Skrewdriver, was started by Ian Stuart Donaldson in England in 1977. Donaldson, who dropped his surname and became known as Ian Stuart, aligned himself with the neofascist British National Front in 1979. In the United States, skinhead music was linked to "Oi," a music form distinct from punk rock, hardcore, or heavy metal. These bands played a type of rock whose lyrics focused on bigotry and violence. In time, a number of skinhead bands emerged in the United States with names such as Angry White Youth, Extreme Hatred, Aggravated Assault, Aryan Thunder, Bound for Glory, RAHOWA, and New Minority. The skinhead movement was a decentralized movement with no hierarchy or central leadership. Many different skinhead groups operated in the United States, with the greatest concentration on the West Coast. In the 1980's, several neo-Nazi organizations began to try recruiting the racial skinheads into their organizations. The most notable attempt was by former Ku Klux Klan member Tom Metzger and his son, John Metzger. Tom Metzger, a television repairman from Fallbrook, California, was the founder and leader of a neo-Nazi organization called the White Aryan Resistance (WAR). Metzger began actively to recruit skinheads into WAR by portraying his organization as anti-authoritarian and pro-working class. In 1986, he founded the Aryan Youth Movement, a division of WAR that targeted skinheads for recruitment, and included an Aryan Youth Movement newspaper among his WAR publications. Metzger also held the first so-called hate rock fest, Aryan Fest, in Oklahoma in 1988. This event attracted skinheads from throughout the United States and served as a recruiting tool for Metzger and his organization. Within a few years, the Aryan Youth Movement successfully formed alliances with skinheads in a number of cities, including San Francisco, California; Portland, Oregon; Tulsa, Oklahoma; Cincinnati, Ohio; Detroit, Michigan; and New York City. Metzger's attempt to control the skinhead movement was curtailed following the murder of an Ethiopian immigrant by three skinheads in Portland, Oregon, in November, 1988. After the skinheads pleaded guilty to murder, the Southern Poverty Law Center brought a civil wrongful death suit against the Metzgers on behalf of the victim's family and won a $12.5 million verdict. This judgment ruined Metzger financially and effectively ended his recruitment of skinheads through the White Aryan Resistance. Impact
The racial skinhead movement in the United States attracted alienated youth during the 1980’s. In 1989, the Anti-Defamation League estimated there were three thousand activist skinheads in thirty-one states. Although the movement was small and decentralized, skinheads were responsible for a large number of violent acts. Many of these were crimes of opportunity that were carried out spontaneously by skinheads. From 1987 to 1990, skinheads were responsible for at least six murders in the United States. In addition, skinheads committed thousands of other violent crimes, including beatings, stabbings, shootings, thefts, and synagogue desecrations.
Further Reading
Dobratz, Betty A., and Stephanie Shanks-Meile. The White Separatist Movement in the United States. Baltimore: Johns Hopkins University Press, 1997. Analysis of the white separatist movement, including skinheads, based on interviews, movement-generated documents, and participant observation.
Hamm, Mark S. American Skinheads: The Criminology and Control of Hate Crime. Westport, Conn.: Praeger, 1993. Sociological analysis of the skinhead movement and hate crimes.
Moore, Jack B. Skinheads Shaved for Battle: A Cultural History of American Skinheads. Bowling Green, Ohio: Bowling Green University Popular Press, 1993. Examines the roots of the skinhead movement, both English and American, as well as the ideas, activities, modes of organization, and role of music in the movement.
Ridgeway, James. Blood in the Face: The Ku Klux Klan, Aryan Nations, Nazi Skinheads, and the Rise of a New White Culture. New York: Thunder's Mouth Press, 1990. Traces the evolution of the racial Right in the United States, with a focus on racial organizations and their activities in the 1980's.
William V. Moore
See also African Americans; Crime; Domestic violence; Gangs; Nation of Yahweh; Racial discrimination; Terrorism.
■ SkyDome
Identification Major League Baseball stadium
Date Opened on June 5, 1989
Place Toronto, Ontario
When SkyDome opened in 1989, it was the world’s first sports stadium with a retractable domed roof. When the Toronto Blue Jays baseball team entered the American League in 1977, they played their home games in Exhibition Stadium, an old football arena reconfigured for baseball. The team soon began plans for a new home that would become an architectural and technological wonder: the world’s first convertible indoor-outdoor sports stadium. In 1965, the Houston Astros had opened the Astrodome, a covered stadium that made every baseball game played in it an indoor event. The Blue Jays wanted a ballpark that could be closed to protect baseball fans from the often frigid early- and lateseason Canadian weather, but they also wanted to allow fans to enjoy baseball outdoors on sunny afternoons and warm evenings. To meet the team’s needs, architects Rod Robbie and Michael Allen of the Stadium Corporation of Toronto designed SkyDome, a sports arena with a retractable roof that could be left open during fair weather and closed during foul weather. SkyDome opened for play on June 5, 1989.
SkyDome’s roof consists of three interlocking panels that cover the baseball diamond and grandstand when the roof is in place. In twenty minutes, however, a series of gears and pulleys can be engaged to retract the panels toward the outfield perimeter of the stadium, exposing the entire field and more than 90 percent of the fifty-four thousand seats to the open air. Baseball players maintain that a batted ball travels farther when the stadium is enclosed because of a downdraft created by the retracted panels that rest beyond the outfield fences when the stadium is open. Although the roof can be left open to the sun and rain, SkyDome’s playing surface is covered with artificial turf rather than grass. An aerial view of SkyDome, with the retractable roof closed. (Lee M./ GFDL) Besides its retractable roof, SkyDome made another important contribuTackach, James, and Joshua B. Stein. The Fields of tion to sports stadium design. Along with the usual Summer: America’s Great Ballparks and the Players food-and-drink concession stands available in all maWho Triumphed in Them. New York: Crescent Books, jor sports arenas, SkyDome included a hotel, restau1992. rant, and health club, so fans could spend their enJames Tackach tire day—even their entire vacations—in SkyDome. Such auxiliary facilities began to appear in other See also Architecture; Baseball; Sports. large sports stadiums built after the opening of SkyDome. Impact The opening of SkyDome inspired the construction of retractable-dome stadiums in other cities hosting Major League Baseball teams. Moreover, it represented a significant architectural and engineering feat generally, inspiring innovations in the design of other major urban structures. Finally, the combination of several facilities, including a hotel, within SkyDome both anticipated and participated in the movement toward mixed-use and “destination” structures in general, which attempted to draw consumers to a single location featuring multiple types of attractions and spending opportunities. Further Reading
Gershman, Michael. Diamonds: The Evolution of the Ballpark—From Elysian Fields to Camden Yards. Boston: Houghton Mifflin, 1993. Lowry, Philip J. Green Cathedrals: The Ultimate Celebration of Major League and Negro League Ballparks. New York: Walker, 2006.
■ Slang and slogans
Definition Linguistic innovations
New slang and slogans in the 1980's stemmed from marketing and merchandising, social and political life, science, entertainment, and trends among young people.

During the 1980's, marketing agencies reflected in their slogans a no-nonsense, no-frills approach that corresponded to the minimalist movement in literature and music of the decade. Ad campaigns eschewed flowery language in favor of assertive bluntness with such slogans as "Just do it" (Nike) and "It works every time" (Colt 45). Coca-Cola's slogans of the decade were equally pithy: "Coke is the real thing" and "Coke is it." Even when marketers promoted indulgence, they did so tersely: "Reassuringly expensive" (Stella Artois) and "All the sugar and twice the caffeine" (Jolt Cola). Patriotism also featured in 1980's ad campaigns, most notably in the slogans "Made the American way" (Miller High Life) and "Pump your money back into Canada" (Petro Canada). The most famous slogan of the advertising world was "Where's the beef?," croaked by diminutive character actor Clara Peller in a fast-food commercial. A new word was coined to name a new type of television commercial in the 1980's, by combining two other words. Marketers began to package hour-length promotional films for television as if they were talk shows or news programs, often with a celebrity host. Programmers blended the first two syllables of the word "information" with the last two of "commercial" to yield the format's new name: infomercial.

Politics and Science Such blending of words was also common in political usage in the 1980's. When people with ties to the Reagan administration were found to be trading arms to Iran and funneling the profits to the Nicaraguan Contras, the last syllable of "Watergate," the premier political scandal of the 1970's, was blended with the name of the rebel cadre and the Middle Eastern country to provide names for the new scandal: "Irangate" and "Contragate." A blend of Reagan's name with the last two syllables of "economics" provided the press with a convenient word for the president's financial theories: "Reaganomics." "Just Say No," the slogan of an anti-drug campaign spearheaded by First Lady Nancy Reagan, was controversial, as it seemed to exemplify the vastly different approaches to social problems embraced by conservatives and liberals. The former applauded the slogan for its suggestion that the answer to America's drug problem was straightforward and involved individuals taking personal responsibility for their actions. The latter derided it as an evasion of the complexity of the drug issue and a refusal to see broad social inequities as contributing to young people's drug use. One new term that was often used in a jocular way stemmed from a criminal trend of the decade: frustrated postal workers shooting colleagues in a string of highly publicized incidents. Research indicated that stress and resentment lay behind the workers' rampages, so, for most of the 1980's, many Americans used the term "to go postal" to mean "to become violently angry." By the beginning of the following
decade, new policies at post offices had lessened tensions, and the term faded somewhat from the national lexicon. An increase in the number of women in the American workplace in the 1980’s resulted in new terms for issues confronting many female workers, such as “glass ceiling” (the invisible boundary that frustrated women’s attempts to rise to positions of corporate authority), “mommy track” (work options for women who chose to combine career with motherhood), and “biological clock” (a woman’s recognition of the limited time frame within which she could bear children). Both women and men of a certain class and lifestyle were often labeled “yuppies,” a term derived from the first letters of either “young urban professional” or “young upwardly mobile professional,” combined with the last syllable of “hippie.” Many of these terms related to one of the major coinages of the 1980’s: “political correctness,” an expression originally employed by progressives to poke fun at some of their own orthodoxies. The term was soon appropriated by opponents on the right, who removed the humor from the term while preserving the accusation. Science and technology contributed numerous neologisms during the 1980’s. Undeniably the grimmest was the acronym AIDS, for acquired immunodeficiency syndrome, the medical scare and scourge of the latter two decades of the twentieth century. To name a popular invention of the era, blending provided “camcorder,” a combination of camera and recorder. Conventional derivation (joining existing roots and stems) using the prefix “tele-” yielded “telecommuting” (working from home via telephone and computer), “teleconference” (a conference via phone or computer), “televangelism” (evangelism on television), and “telemarketing” (selling over the telephone). Entertainment As in previous decades, entertainment and trends among youth affected the national lexicon. Arnold Schwarzenegger’s line from The Terminator (1984), “I’ll be back,” replaced Douglas MacArthur’s “I shall return” as a common comic rejoinder at any unwanted departure. Probably the most-used catchphrase from television was Mr. T’s expression of contempt from the series The A-Team: “I pity the fool who. . . .” However, the most talkedabout linguistic trend of the 1980’s was that originated by “Valley girls,” a neologism itself that was in some ways a misnomer, as it did not strictly apply to
teens in the San Fernando Valley in California, nor necessarily only to females. Much of what came to be called “valspeak” was typical of many teenagers around North America and exhibited traits that had been common among youth for decades. However, the speech patterns associated with Valley girls were the focus of a hit single by Frank Zappa and his daughter Moon Unit Zappa, “Valley Girl.” As a result, the California sociolect (that is, a language variation based on social group rather than region) became famous and provided a name for the speech pattern. Although some phrases from “valspeak” became infamous (including “Gag me with a spoon” and “As if!”), the distinctive aspect of the speech pattern was its intonation—a vaguely Southwestern twang and a tendency to raise the voice at the end of every utterance, as if asking a question. Impact As with much linguistic innovation, most of the slang and slogans of the 1980’s disappeared after a few years. However, certain trends of the era continued in the speech of young people in following decades, especially the tendency toward clipping (“ex” for ex-partner or spouse, for example) and the upraised inflection at the ends of statements.
Further Reading
Bryson, Bill. Made in America. New York: Perennial, 1994. Highly accessible and thorough history of American English.
In the 1980's. http://inthe80's.com/glossary.shtml. Excellent on-line compendium of 1980's slang.
Morris, William, and Mary Morris. Harper Dictionary of Contemporary Usage. New York: Harper & Row, 1985. Usage dictionary from the 1980's that reflects attitudes toward language change at the time.
Thomas Du Bose
See also Advertising; AIDS epidemic; Biological clock; Glass ceiling; Infomercials; Just Say No campaign; Mommy track; Mr. T; Peller, Clara; Political correctness; Post office shootings; Reaganomics; Science and technology; Terminator, The; Valley girls; Yuppies.

■ Smith, Samantha
Identification Ten-year-old American girl who wrote a letter to Soviet leader Yuri Andropov in 1982
Born June 29, 1972; Houlton, Maine
Died August 25, 1985; Lewiston-Auburn, Maine

The correspondence between the young American and the leader of the Soviet Union became a well-publicized symbol of attempts to improve relations between the two countries.

In 1982, young Samantha Smith, afraid of nuclear war, asked her mother to write a letter to the new leader of the Soviet Union, Yuri Andropov. Her mother, Jane Smith, replied that Samantha should be the one to write the letter. In December, therefore, Smith wrote to Andropov, congratulating him on becoming the head of the Soviet Union and asking him if he wanted nuclear war and why he wanted to conquer the United States.
A 1985 Soviet stamp commemorating Samantha Smith.
A Historic Correspondence

In December, 1982, Samantha Smith sent this letter to Soviet leader Yuri Andropov:

Dear Mr. Andropov: My name is Samantha Smith. I am ten years old. Congratulations on your new job. I have been worrying about Russia and the United States getting into a nuclear war. Are you going to vote to have a war or not? If you aren't please tell me how you are going to help to not have a war. This question you do not have to answer, but I would like to know why you want to conquer the world or at least our country. God made the world for us to live together in peace and not to fight.

She received a reply from Andropov on April 26, 1983, in a letter excerpted here:

You write that you are anxious about whether there will be a nuclear war between our two countries. And you ask are we doing anything so that war will not break out. Your question is the most important of those that every thinking man can pose. I will reply to you seriously and honestly. Yes, Samantha, we in the Soviet Union are trying to do everything so that there will not be war on Earth. This is what every Soviet man wants. This is what the great founder of our state, Vladimir Lenin, taught us. . . .

In America and in our country there are nuclear weapons—terrible weapons that can kill millions of people in an instant. But we do not want them to be ever used. That's precisely why the Soviet Union solemnly declared throughout the entire world that never—never—will it use nuclear weapons first against any country. In general we propose to discontinue further production of them and to proceed to the abolition of all the stockpiles on earth.

It seems to me that this is a sufficient answer to your second question: "Why do you want to wage war against the whole world or at least the United States?" We want nothing of the kind. No one in our country—neither workers, peasants, writers nor doctors, neither grown-ups nor children, nor members of the government—want either a big or "little" war.

We want peace—there is something that we are occupied with: growing wheat, building and inventing, writing books and flying into space. We want peace for ourselves and for all peoples of the planet. For our children and for you, Samantha.
Excerpts of Samantha's letter were published in the Soviet daily Pravda, but when the girl did not hear from Andropov himself, she wrote to Anatoly Dobrynin, the Soviet ambassador in Washington, D.C., asking if Andropov would reply. The Soviet leader did reply in April, 1983, assuring Samantha that his country did not want nuclear or any other kind of war, nor to conquer the world or the United States; it wanted only peace and friendly relations. He ended his letter by inviting Samantha and her family to visit the Soviet Union. Thus, in the summer of 1983, the family traveled to the Soviet Union for two weeks. They visited Moscow, Leningrad, and the children's summer camp Artek, near Yalta. Samantha was impressed with what she learned there and with the friends she made among Soviet children. Afterward, she attended the Children's International Symposium in Kobe, Japan. There, she gave an account of her letters and her trip to the Soviet Union. She proposed an international granddaughter exchange, in which the grandchildren (or nieces and nephews) of world leaders would visit the countries of their adversaries and live with the leaders of those countries for two weeks. Samantha became such a popular and world-renowned figure that she was cast in a television series, Lime Street, to be produced in 1985. However, on August 25 of that year, she tragically died in an airplane crash. Samantha and her father were on a small commuter plane from Boston to Auburn, Maine, when the pilot, following a nonstandard radar vector on the approach, crashed into some trees, killing all six passengers and two crew members aboard the plane. In October, 1985, the Samantha Smith Foundation was established to promote international understanding. A statue of Smith stands in front of the State Cultural Building in Augusta, Maine.
Impact Samantha’s courage and childlike approach to Andropov became a symbol of hope for the peaceful resolution of world problems. Six years after her death, the Cold War ended. Her direct impact on the Cold War is difficult to assess, but she was warmly remembered in both the Soviet Union and the United States, as well as around the world. Further Reading
Galicich, Anne. Samantha Smith: A Journey for Peace. Minneapolis: Dillon Press, 1987. Account of Smith's life for juvenile readers.
Samantha Smith. http://www.samanthasmith.info/index.htm. Web site dedicated to Smith that includes a historical time line and copies of her correspondence with Andropov.
Smith, Samantha. Journey to the Soviet Union. Boston: Little, Brown, 1985. Smith's own account of her trip to the Soviet Union.
Frederick B. Chary
See also Cold War; Day After, The; Foreign policy of the United States; Military spending; Reagan's "Evil Empire" speech; Soviet Union and North America; Strategic Defense Initiative (SDI).
■ Smoking and tobacco
Definition Production, consumption, and health consequences of cigarettes, cigars, pipes, and chewing tobacco
During the 1980’s, an estimated 390,000 people in the United States died from complications due to cigarette smoking. Despite accelerated efforts by the tobacco industry to market tobacco products, the emergence of antismoking efforts—including significant antitobacco legislation— resulted in a steady decline in U.S. smoking throughout the decade. In 1978, Joseph Califano, head of the Department of Health, Education, and Welfare under President Jimmy Carter, proposed several actions to fight cigarette smoking. These actions included raising taxes on cigarettes, eliminating smoking on airplanes and in restaurants, and ending government subsidies to tobacco growers. With little support for the proposals from others in the Carter administration and strong opposition from the tobacco industry, many of these proposals were blocked. The plan, however,
laid the essential groundwork for a more successful campaign against smoking that emerged in the 1980’s. One of the most important steps in the campaign against smoking and tobacco was the appointment of C. Everett Koop as U.S. surgeon general in 1981. Koop emerged as a powerful antismoking advocate, authoring reports on environmental tobacco smoke, nicotine addiction, and the negative health consequences of smoking for women. At a national conference of antismoking groups, delegates developed the “Blueprint for Action,” which outlined the necessary steps to build a more aggressive antismoking movement. The following year, the American Cancer Society, the American Lung Association, and the American Heart Association formed a strong coalition called Smoking or Health and launched a rigorous antismoking lobbying campaign in Washington, D.C. In 1985, the American Medical Association called for a complete ban on cigarette advertising and promotion. During the same year, the city of Los Angeles banned smoking in most public places and in many businesses. Regulating Smokeless Tobacco
Although it was slow in coming, strong evidence had accumulated by the mid-1980’s that smokeless tobacco also presented significant health risks, particularly that of oral cancer. The evidence mounting against smokeless tobacco worried public health officials, because smokeless tobacco use had been rising among young boys, in part because they thought it would not cause cancer or be addictive. Smokeless tobacco products had not been required by Congress to carry warning labels; however, the state of Massachusetts enacted legislation to require such labels, and twenty-five other states decided to follow suit. The federal government enacted the Smokeless Tobacco Health Education Act of 1986, which required three rotated warning labels on smokeless tobacco packages. The law also banned advertising smokeless tobacco products on electronic media and required warning labels on all packaging and all advertising except billboards.
The Tobacco Industry Seeks New Markets
Efforts to educate the public about the risk factors for cancer and cardiovascular disease contributed to an overall decline in nicotine dependence. By the end of the 1980's, more than half of all American men and more than half of white adults who had ever smoked cigarettes had stopped. The poor, less educated, and minority groups did not experience the same degree of reduction in smoking rates as did white, middle- and upper-class Americans. Moreover, most new users of cigarettes were female, reversing earlier trends. The decline in smoking rates within the United States led tobacco companies to expand into new, global markets. Four of the six multinational tobacco conglomerates were based in the United States. In the middle years of the 1980's, the U.S. government—in cooperation with the United States Cigarette Export Association (USCEA)—worked to promote the international sale of tobacco products, especially in Asia. In Asian countries, import quotas, high taxes, and other restrictions were alleged to limit unfairly U.S. tobacco firms' access to the markets they sought. In the face of U.S. threats, however, countries such as Japan, Taiwan, South Korea, and Thailand removed many of their restrictions on tobacco imports, leading to a more than 75 percent growth in cigarette trading in these markets and a rapid rise in smoking rates in those nations. Moreover, as smoking among U.S. adults declined, tobacco companies also targeted children as potential new consumers. Joe Camel, for example, was adopted as the official mascot of Camel cigarettes, and the character—who had been conceived thirty years earlier—was redesigned to be both more cartoonish and more "cool," in order to appeal to a young demographic. In a series of ads that first appeared in 1987, Joe Camel appeared as a cool party animal, sporting a cigarette, sunglasses, and a tuxedo, and with adoring young women nearby. In the wake of this advertising campaign, the market share of Camel cigarettes among teenagers increased more than twentyfold. Impact The 1980's witnessed the emergence of new leaders in Congress with the political skills to guide antitobacco legislation through both the Senate and the House of Representatives. For example, the Comprehensive Smoking Education Act (1984)—which required that four strongly worded warnings be rotated on cigarette packages and advertisements and that the warnings also be displayed prominently on all advertisements—contributed to the decline in smoking in the United States. In 1988, Congress banned smoking on domestic air flights of less than two hours in duration. The ban on smoking in airplanes was later expanded to all domestic U.S.
commercial air travel lasting six hours or less. Reports from the National Research Council and the Office of the Surgeon General promoted the view that passive smoking presented health risks to nonsmokers. They warned that nonsmokers living with smokers had an increased risk of lung cancer, and children living with smoking parents had an increased risk of developing respiratory problems. However, significant legislation to create smoke-free environments would not emerge until decades later. Further Reading
Kluger, Richard. Ashes to Ashes: America's Hundred-Year Cigarette War, the Public Health, and the Unabashed Triumph of Philip Morris. New York: Alfred A. Knopf, 1996. Examines the role of the tobacco industry in promoting cigarette consumption even as the mounting medical evidence pointed to adverse health consequences.
Pampel, Fred. Tobacco Industry and Smoking. New York: Facts On File, 2004. Easy-to-read compilation of the important dates and events in cigarette history.
Snell, Clete. Peddling Poison: The Tobacco Industry and Kids. Westport, Conn.: Praeger, 2005. Good discussion regarding how the tobacco industry has marketed its products to children and youth. Includes interesting discussion of marketing Joe Camel to children.
Wolfson, Mark. The Fight Against Big Tobacco. New York: Aldine de Gruyter, 2001. Documents the governmental and grassroots efforts to limit tobacco use in the United States.
Mary McElroy
See also Advertising; Business and the economy in the United States; Cancer research.
■ Soap operas
Definition Television serial melodramas
Reflecting the Reagan era of consumerism and excess, soap operas in the 1980's were heavy on glamour and outrageous plots, fueling the fantasies of audience members who were facing the economic realities of the decade.

Daytime and prime-time soap operas, or soaps, peaked in the 1980's. High ratings led producers to shoot on location, emphasize adventure and boardroom stories, and spend lavishly on costumes and sets. The "supercouple" phenomenon—which had its origin in the 1970's—was fully developed, and "love in the afternoon"—originally a network advertising catchphrase—became a generic term. Celebrity fans of daytime soaps began making cameos on their favorite serials. Prime-time soaps lured many former movie stars to play key roles and made effective use of big-budget cliffhangers. Both daytime and nighttime soap operas featured males and females equally, but the characters remained predominantly white. As the decade closed, daytime serials responded to criticisms of their overt sexual content and made attempts to address contemporary social issues, such as AIDS, lesbianism, cocaine addiction, and interracial marriage.

Daytime Soap Operas General Hospital's Luke and Laura story line began in 1979, and by 1980 the couple was extremely popular. During the summer of 1980, the two were on the run from the mob; the plot presented contemporary twists on old film staples, including a waltz in Wyndham's department store reminiscent of 1930's musicals and an adapted "Walls of Jericho" scene from It Happened One Night (1934). The couple's popularity increased until Luke and Laura's wedding photo appeared on the cover of Newsweek in 1981, as their plotlines generated more mainstream press coverage for General Hospital than any daytime soap had previously received. The wedding of Luke and Laura was also daytime television's single highest-rated event. Other soap operas also thrived in the shadow of General Hospital's success, as ratings for the genre as a whole climbed. Advertising rates were strong, and production costs (even with the location shoots) were reasonable, resulting in major profits for the networks. For example, the Soap Opera Encyclopedia notes that during the Luke and Laura heyday and for a few years afterward, General Hospital "alone accounted for one-quarter of ABC's profits." The star-crossed romance plot was nothing new, but it received renewed emphasis during the 1980's. In addition to Luke and Laura on General Hospital, similar story lines involved Greg and Jenny on All My Children, Tom and Margo on As the World Turns, and Bo and Hope and Steve and Kayla on Days of Our Lives. A number of stars noted that they were fans of the soap opera genre and appeared on their favorite shows; examples include Carol Burnett on All My Children and Elizabeth Taylor on General Hospital.
Daytime soaps also began to feature millionaire families at the center of their narratives. Moguls had already been present in 1970's narratives, and a wider spectrum of economic classes was represented in 1980's daytime soaps than in prime-time narratives, but nonetheless, increasing numbers of corporate capitalists became major soap opera characters, including James Stenbeck and Lucinda Walsh on As the World Turns, Victor Kiriakis on Days of Our Lives, and Adam Chandler on All My Children.

Prime-Time Soap Operas In an effort to rival the success of the 1978 Columbia Broadcasting System (CBS) hit Dallas, the American Broadcasting Company (ABC) presented Dynasty in 1981. Like Dallas, the latter series dealt with a feuding family living together in one giant mansion and a working-class outsider who had joined the family through marriage. Dynasty, however, was more elegant, especially after it removed many of its more working-class supporting characters after the first season and brought in Joan Collins to portray the devious Alexis Carrington. In addition to these two shows, other popular prime-time soaps included Knots Landing (a spin-off of Dallas) and Falcon Crest. In the early to mid-1980's, these four shows were near the top of the ratings. With bigger budgets than their daytime counterparts, nighttime soaps had casts that included former movie stars (such as Jane Wyman in Falcon Crest and John Forsythe in Dynasty), expensive costumes, and cliffhangers with action-film violence and special effects: Both Dallas and Dynasty had scenes in which oil rigs exploded, and one Dynasty season-ending cliffhanger featured a scene in which Moldavian rebels attacked a wedding party with automatic weapons. Some of these gimmicks were more successful than others. The "Who shot J. R.?" cliffhanger on Dallas in the summer of 1980, using no special effects and minimum violence, was a publicity phenomenon. Many critics felt in retrospect that the Moldavian massacre on Dynasty in 1985 signaled the beginning of the end for that series. Nighttime soaps did not broach contemporary social topics as often as did their daytime counterparts. Dynasty did feature a major gay character, Steven Carrington, but even though Steven had scenes with male lovers, they were fairly chaste in comparison to the several sex scenes that he had with female partners. Major characters on both Dallas and Knots Landing struggled with alcoholism, but nighttime narratives usually centered on melodramatic power struggles and glamour.

Impact Daytime and prime-time soap operas in the 1980's relied heavily on romance, action, and glamour, representing a significant shift away from earlier eras in the genre. Escapist fare, soap operas mainly served as glossy entertainment, but the beginnings of more issue-oriented storytelling in daytime soap operas—especially in terms of homosexuality, interracial relationships, and AIDS—picked up momentum as the decade moved forward.
Anger, Dorothy. Other Worlds: Society Seen Through Soap Opera. Peterborough, Ont.: Broadview Press, 1999. Explores American and British soap operas and discusses their social significance. Includes an appendix, "Soaps' Most Daring Stories [and the ones where they chickened out]."
Frentz, Suzanne, ed. Staying Tuned: Contemporary Soap Opera Criticism. Bowling Green, Ohio: Bowling Green State University Popular Press, 1992. Collection of essays on daytime soap operas, discussing topics such as college students' viewing habits, early AIDS story lines, supercouples, and more.
Schemering, Christopher. The Soap Opera Encyclopedia. New York: Ballantine Books, 1988. Excellent concise synopsis of the genre as a whole, as well as summaries of every soap opera (both daytime and prime-time) in the history of the genre, including such forgotten curiosities as the Christian Broadcasting Network's early 1980's entry Another Life.
Julie Elliott
See also
Dallas; Dynasty; General Hospital; Television.
■ Soccer
Definition International team sport
Professional soccer suffered a setback in the United States in the 1980’s, when the major league folded in 1984. The Canadian national team, however, enjoyed a brief highlight by winning a regional international championship in 1985 and participating in the 1986 World Cup. Throughout the decade, though, soccer remained a marginal spectator sport in North America.
In the United States, the 1980's opened promisingly for soccer fans, with the North American Soccer League (NASL) drawing an average of fourteen thousand supporters for its games. Employing expensive, well-known foreign players, the league modified international soccer rules to encourage more goals and to prohibit ties, which were common in other nations. When top striker Giorgio Chinaglia scored two goals to lead the New York Cosmos to a 3-0 win over the Fort Lauderdale Strikers in the 1980 Soccer Bowl, the American Broadcasting Company (ABC) covered the game, which was watched by fifty thousand fans in the stadium. Despite this modest success, high spending on star players' salaries, rapid expansion, and a lack of American-born players increasingly endangered the viability and popularity of the sport. At the end of the season, three NASL teams folded. The national team did qualify for the 1980 Olympic Games with two wins, one draw, and a loss against opponents Costa Rica and Suriname, but the U.S. boycott of the Olympics that year deprived the team of its chance to prove its international mettle. Two years later, the team failed to qualify for the 1982 World Cup after losing to Mexico 5-1. By 1981, the NASL saw an exodus of its European star players, and game attendance fell, adding financial strains to the clubs. ABC canceled its contract, lowering the visibility of the sport. Increasingly, the Major Indoor Soccer League (MISL) was called upon to pick up the slack, together with college-level competition. In 1982, Chinaglia's goal won the New York Cosmos its fifth league title in an NASL reduced to fourteen teams. Lack of quality players, fewer fans, and a salary war with the MISL weakened the league. On the other hand, U.S. collegiate soccer teams enjoyed growth, especially women's teams. In 1983, the idea to let the national team compete in the NASL as Team America failed abysmally, as players showed little inclination to leave their clubs for it. While the Tulsa Roughnecks beat the Toronto Blizzard 2-0 to win the championship, the league experienced a financial crisis. In the 1984 Olympics, the U.S. national team defeated Costa Rica 3-0 in Stanford, California, on July 29 in front of seventy-eight thousand spectators, the biggest soccer audience ever in the United States. Italy then defeated the team 1-0, and Egypt tied it 1-1, preventing the United States from advancing beyond the first round. The same year, the NASL folded, and
the United States failed to qualify for the 1986 World Cup. From 1985 to 1988, U.S. professional soccer was played primarily indoors. Slowly, regional outdoor leagues emerged. On July 4, 1988, the United States won the right to host the 1994 World Cup. In the 1988 Olympics, the national team competed valiantly, tying Argentina 1-1 and the Republic of Korea 0-0 before succumbing to the Soviet Union 2-4 and losing the right to advance. Rebuilding the national team paid off in 1989, when the United States beat Trinidad and Tobago 1-0 during the Confederation of North, Central American and Caribbean Association Football (CONCACAF) Gold Cup competition. The victory, thanks to striker Paul Caligiuri's goal, qualified the United States for the 1990 World Cup. With professional outdoor soccer enjoying a renaissance due to regional leagues, collegiate soccer on the rise, and the women's national team emerging, U.S. soccer looked with hope to the next decade.

U.S. national soccer team striker Paul Caligiuri. In 1989, Caligiuri scored the winning goal in a CONCACAF match against Trinidad and Tobago, thereby qualifying his team to compete in the World Cup. (AP/Wide World Photos)

Canada
Professional soccer in Canada experienced a false start when the Canadian Professional Soccer League folded after its first year in 1983; before the league collapsed, the Edmonton Brickmen won the championship match against the Hamilton Steelers 2-0. In contrast, at the 1984 Olympics, the Canadian team acquitted itself well. It tied Iraq 1-1 after Gerry Gray had given Canada the lead and came back from a 0-1 defeat against Yugoslavia to beat Cameroon 3-1 with two goals by Dale Mitchell and one by Igor Vrablic. In the next round, Mitchell again put Canada ahead, but Brazil tied the match by the end of regulation play and won 5-3 on penalty kicks. The decade's highlight for Canadian soccer was winning the CONCACAF Gold Cup in 1985. After tying Costa Rica 1-1 at home, the team defeated Honduras 1-0 in Tegucigalpa, tied Costa Rica 0-0 abroad, and achieved a 2-1 victory against Honduras at home. This victory not only won Canada the cup but also qualified the team to play for the World Cup for the first time in its history. At the 1986 World Cup competition in Mexico, however, Canada failed to score a single goal, losing 0-1 to France and 0-2 to Hungary and the Soviet Union. A post-cup match-fixing scandal in Singapore led to one-year suspensions of four players, including Vrablic. In 1987, the Canadian Soccer League (CSL) was founded; it survived for five years. The Hamilton Steelers participated in all three finals held in the 1980's but lost to the Calgary Kickers in 1987 and the Vancouver 86ers in 1988 and 1989. The CSL eventually merged with the American Professional Soccer League in 1993.
Impact During the 1980's, soccer endured a rollercoaster experience in North America. After auspicious signs in the early 1980's, the fiscal irresponsibility of U.S. clubs—coupled with a lack of American players, no local roots, and diminishing television coverage—caused the NASL to fold by 1984. However, the large turnout for soccer matches during the 1984 Olympic Games in Los Angeles persuaded international soccer authorities to allow the United States to host the 1994 World Cup. This decision corresponded with a renaissance of professional U.S. soccer, as regional leagues grounded in their communities began to thrive. Moreover, the sport became increasingly popular among young people and college students in the United States, bringing about a resurgence by the late 1980's. In Canada, the successes of the 1984 Olympics and the 1985 CONCACAF Gold Cup win notwithstanding, soccer had difficulty attracting significant audiences. The efforts of the CSL ultimately failed and led to a merger with the U.S. professional league in the next decade.
Further Reading
Hunt, Chris, ed. The Complete Book of Soccer. Richmond Hill, Ont.: Firefly Books, 2006. Comprehensive compendium of the sport.
Markovits, Andrei, and Steven Hellerman. Offside: Soccer and American Exceptionalism. Princeton, N.J.: Princeton University Press, 2001. Study of the U.S. attitude toward the world's most popular sport.
Szymanski, Stefan, and Andrew Zimbalist. National Pastime: How Americans Play Baseball and the Rest of the World Plays Soccer. Washington, D.C.: Brookings Institution, 2005. Comparison of different nations' relationships with their sports and athletes.
R. C. Lutz
See also
Olympic Games of 1980; Olympic Games of 1984; Olympic Games of 1988; Sports.
■ Social Security reform
Definition Attempts to maintain the financial solvency of the U.S. government's Old-Age, Survivors, and Disability Insurance program
The 1980’s saw the first major change in the Social Security program: Congress raised Social Security taxes and limited benefits in order to prevent the program from falling into bankruptcy. Created in 1935, the Social Security program provides insurance for people with permanent disabilities, those reaching old age, and their survivors. It is paid for by taxing wages. The program ran into financial trouble during the 1970’s. The twin economic problems marring the 1970’s, rising inflation and high unemployment, led to dramatically higher Social Security costs coupled with declining revenues. As the decade wore on, Social Security came
closer to bankruptcy. Demographics also conspired against Social Security, as life expectancies began to exceed the retirement age by a decade or more. Instead of retirees claiming benefits for a few months or years, many would receive Social Security payments for decades, putting further pressure on the program. By 1980, those familiar with the program knew that significant changes would have to be made to prevent bankruptcy. Both the president and members of Congress faced difficult decisions about how to reform a program that had defied all previous attempts to change it. Social Security came to be known as the "third rail" of American politics: Like the electrified third rail of a subway system, the program was thought of as killing the career of any politician who touched it. Thus, few were willing to attempt to change or reform it. The resilience of the program was especially apparent during the Ronald Reagan years: Reagan believed in shrinking the size and fiscal responsibilities of the federal government, but Social Security escaped the budget cuts of his administration, which weakened many other social welfare programs. The inauguration of President Reagan had marked a sea change in the American perspective on social welfare policies generally. Every such program experienced climbing costs, and the effectiveness of each one was called into question. The programs thus became a target of budget cutters seeking to balance the government budget and change social policy. The only major program to escape major cuts was Social Security, the largest and most expensive in terms of benefits paid and recipients. As Social Security floundered toward bankruptcy, reform seemed inevitable. Reagan's early attempts to change the system proved controversial, however. His proposed reforms included reducing the amount of money received by those taking early retirement and drawing Social Security before the age of sixty-five, who at the time received 80 percent of full benefits. Reagan proposed reducing the benefit level of those retiring early to 55 percent of full benefits. This proposal would have both reduced the amount of money paid to early retirees and reduced the number of people choosing to retire early. Moreover, those workers choosing to defer their retirement to age sixty-five would both defer the day when they began to draw benefits and continue to pay more money into the Social Security trust fund as they continued to earn taxable wages. The president's plan received little
support, and it was unanimously rejected by the Republican-controlled Senate. After this defeat, Reagan sought political cover, delegating the problem to a bipartisan commission that was to propose changes that the president could reject if he deemed them too controversial. Headed by economist Alan Greenspan, who would later become chairman of the Federal Reserve, the Social Security Commission faced the twin problems of rising benefit payments to retirees and declining revenues from workers' taxes. Composed of labor, business, and political leaders, the commission considered many solutions, all with potentially disastrous consequences for any politician who proposed them. After several months of debate and hearings, the commission offered a plan and several suggestions for stabilizing Social Security. The first was to raise revenue by increasing the Social Security taxes paid by workers and businesses. The commission also proposed adding federal and state employees to the Social Security system, which would raise the number of workers paying into the system but also the number of potential retirees who could draw benefits in the future. A more controversial possibility, raising the retirement age to sixty-seven, was offered as a suggestion rather than a proposal, as members of the commission could not agree to it. The commission's work, though, would be wasted if Congress did not pass its recommendations. Inclusion of political leaders, including the self-proclaimed defender of Social Security, Florida representative Claude Pepper, eased the political process. Social Security reform bills following some of the commission's proposals and suggestions began to be introduced in Congress. The first bill sought to raise Social Security taxes: It passed easily, as House members realized the boon to the federal purse created by a tax increase. The second major proposal came from the commission's suggestion to raise the retirement age. The bill, offered by Texas congressman J. J. Pickle, proposed gradually raising the retirement age to sixty-seven. Starting in the year 2000, with gradual increases until the year 2020, retirees would have to work longer before receiving full retirement benefits. Moving the retirement age from sixty-five to sixty-seven in this fashion proved politically palatable, as it would affect only future workers, including many who were not yet of voting age, while not
affecting those currently approaching or already having reached retirement. Constituents between the ages of sixty and sixty-seven might otherwise have fought aggressively against the change. Pickle’s proposal passed narrowly. The rise in the retirement age had little immediate impact on the solvency of Social Security, but it was intended to stabilize the system in the future, as more money poured into the system while the number of retirees was reduced. The full reform bill easily passed the Senate and was signed by the president on April 20, 1983. The involvement of Republicans and Democrats, labor and business, had allowed the reforms to be passed with a minimum of partisan rancor. Impact The Social Security Commission and the legislation it spawned solved the immediate problem of the program heading toward bankruptcy, but it did not solve the longer-term problems of the program, including an aging population, longer life expectancy for retirees, and the baby-boom generation, whose size would entail significant drains upon the program’s coffers when baby boomers began to retire in 2013. The decision to raise both taxes and the retirement age may have only pushed the harder decisions into the future, though Congress and the president demonstrated that changes in Social Security could be achieved using a bipartisan approach. Further Reading
Beland, Daniel. Social Security. Lawrence: University of Kansas Press, 2005. Fast-paced book that describes how Social Security evolved from a limited old-age program to a larger social welfare program, as well as summarizing attempts to reform it.
Berkowitz, Edward. Robert Ball and the Politics of Social Security. Madison: University of Wisconsin Press, 2003. Describes the life and efforts of the head of the Social Security Administration during the 1960's and early 1970's.
Koitz, David. Seeking Middle Ground on Social Security Reform. Stanford, Calif.: Hoover Press, 2001. Analyzes the various approaches to changing the Social Security system in order to solve the financial and other problems with the system.
Douglas Clouatre
See also
Business and the economy in the United States; Conservatism in U.S. politics; Congress, U.S.; Demographics of the United States; Elections in the United States, 1980; Income and wages in the United States; Reagan, Ronald; Reagan Revolution; Reaganomics.
■ Soviet Union and North America Definition
Relations between the Union of Soviet Socialist Republics and the United States and Canada
The 1980’s opened with the Soviet Union a bitter enemy of the United States and Canada. By the end of the decade, North American leaders had come to an accommodation with Soviet leader Mikhail Gorbachev, and the Soviet state was on the verge of collapse. The 1980’s witnessed the most dramatic shift in relations with the Soviet Union since Washington had recognized Moscow in 1933, perhaps since the Russian Revolution itself. It started with the 1980 election of Ronald Reagan, a bitter foe of the Soviet Union, to the U.S. presidency and ended with the collapse of the Soviet state. Since 1917, with the formation of Soviet Russia, relations between the countries had run the gamut, from military invasion by American forces in 1918 and 1919 to a firm alliance against Nazi Germany in World War II. After that war, however—beginning in the late 1940’s—the two governments were locked in an adversarial Cold War. This relationship, too, fluctuated, from confrontational during the the last days of Joseph Stalin (whose regime ended when he died in 1953) through the era of peaceful coexistence under Nikita Khrushchev (1953-1964) and détente under Leonid Brezhnev (1964-1982) in the 1960’s and 1970’s. Relations were complicated with the successful communist revolution in China in 1949. Until 1971, ChineseAmerican relations were even worse than those with the Soviet Union; Washington refused to recognize the government in Beijing. However, a dramatic reversal under president Richard M. Nixon and Secretary of State Henry Kissinger led to a new ChineseAmerican accord and placed pressure on Moscow, whose own relations with its Chinese communist neighbor-ally were worse than either nation’s with the United States—even exploding into border clashes in 1969.
The Years of Confrontation
A deterioration in U.S.-Soviet relations occurred near the opening of the decade when the Soviet Union helped to overthrow the government in Afghanistan and Soviet troops entered the Central Asian country in late December, 1979, to prop up a more friendly government. Washington reacted with condemnation, and President Jimmy Carter "punished" Moscow by refusing to allow American athletes to participate in the Moscow Olympics of 1980. In 1984, Moscow retaliated by boycotting the Summer Olympics, held in Los Angeles. Carter also announced an embargo on selling grain to the Soviets at a time when they desperately needed it. Canada joined in the Olympic boycott and for a while withheld grain from Moscow, promising not to fill the gap caused by Washington's embargo when it resumed sales. Much more serious for the Soviets was the long war in which they became involved in Afghanistan, where many casualties sapped Moscow's resources. The war in Afghanistan, in fact, was a major cause of the Soviet Union's demise. Throughout the decade, the new U.S. president, Ronald Reagan, supported the anti-Soviet insurgency against the pro-Soviet Afghan government and the Red Army fighting in the country. In the early years of the 1980's, Washington and Moscow confronted each other all over the world—in Central Asia, East Asia, Latin America, and Africa. Strategic arms talks begun in the previous years came to a standstill. Reagan wanted to build even more weapons, hoping that dragging the Soviet Union into an arms race it could not afford would force the Soviets to make concessions. Shortly before Reagan won the presidency, Conservative leader Margaret Thatcher had become prime minister of the United Kingdom. Together, Reagan and Thatcher put up a solid anticommunist front against Moscow. In addition to opposing the Soviets in Afghanistan, Reagan led a campaign against Soviet activity in Poland, where the independent union Solidarity was outlawed and where, in 1981, General Wojciech Jaruzelski imposed martial law. Reagan and Thatcher blamed Moscow. Reagan approached the Soviet Union with strident confrontation. In a speech delivered on March 8, 1983, he declared the country an "evil empire." He funded anti-Soviet movements in developing nations such as Afghanistan and Nicaragua, where the
insurgent Contras battled the left-leaning government of Daniel Ortega. To some degree these efforts backfired. The fundamentalist Islamic Taliban replaced the Soviet-backed government in Afghanistan, which would lead to a brutally repressive regime that later caused problems for Washington. In Nicaragua, the backing of the Contras involved illegal activity that erupted into the Iran-Contra scandal of 1986. At one point Reagan even joked about his hostility toward the Soviets. While preparing for his weekly radio address to the nation on August 11, 1984, instead of the usual "one, two, three" testing to check the line, the former actor made the statement:
My fellow Americans, I'm pleased to tell you today that I've signed legislation that will outlaw Russia forever. We begin bombing in five minutes.
The joke, intended to be made in private, leaked, causing an international scandal. Reagan supported a plan to establish a space- and surface-based antiballistic missile system called the Strategic Defense Initiative (SDI), or the "Star Wars" program. Although this controversial program was never adopted, the U.S. government did fund massive new weapons systems. Reagan pushed for the deployment of Pershing and cruise missiles in Western Europe, for increased allocations in European allies' military budgets, and for their adopting his anti-Soviet policies—with mixed results. Reagan also objected to European Community (EC) contracts with the Soviet Union for building natural gas pipelines. For a while in 1982, Washington banned the use of U.S. technology in such projects, lifting the ban later that year when an agreement with the EC on trade policies was reached. The United States also resumed grain sales to the Soviet Union and initiated Strategic Arms Reduction Treaty talks (START), first presented by Reagan in Geneva on June 29, 1982. These disarmament negotiations continued off and on. Washington pushed for the so-called zero option, which linked the removal of Pershing missiles from Europe to the Soviets' reduction of intermediate-range nuclear forces (INFs) and mobile missile launchers from the East. Andropov and Chernenko
As the decade proceeded, the Soviet Union underwent major changes. On November 10, 1982, Brezhnev died after suffering a long illness. A struggle for leadership between
Soviet liberals and old-guard bureaucrats continued while two more elder statesmen, Yuri Andropov and Konstantin Chernenko, successively assumed the Soviet leadership role. Andropov tried to reestablish the era of détente, but Reagan was adamant against any easing of tensions. After receiving a letter from a New England child, Samantha Smith, asking why he was opposed to peace, Andropov responded by saying that he and the Soviet people were in fact eager for peace between the two countries and invited the girl and her family to the Soviet Union for a visit. He tried to divide Western Europe from the United States, with the particular goal of preventing the deployment of U.S. Pershing missiles on the Continent. However, during his tenure, on September 1, 1983, Soviet warplanes shot down a Korean Air Lines commercial aircraft, flight KAL 007, while it was flying near the border between Soviet and Korean airspace; 269 civilian passengers and crew members died. Moscow maintained that the aircraft had entered Soviet airspace. The Soviets believed that the plane was on a spying mission, in a deliberate attempt on the part of the United States to provoke the Soviets. Andropov died in 1984 after only sixteen months in office. Chernenko succeeded him but died the following year. Gorbachev In the subsequent power struggle, the reformer Mikhail Gorbachev succeeded to the Soviet leadership position. A year later, in April of 1986, the Soviet Union suffered a catastrophe when the nuclear plant at Chernobyl in the Ukraine suffered a meltdown, causing explosions and sending radioactive contamination as far as Belarus. After the Soviets briefly denied the accident, Gorbachev realized that he had to appeal to the West for help. Afterward, Gorbachev introduced the reform policies glasnost (openness) and perestroika (economic reconstruction). Unlike liberalization attempts in previous Soviet periods, Gorbachev's policies were genuine and effective. Eventually, even democratic elections took place in the Soviet Union. Furthermore, Gorbachev began a real policy of cooperation with the West. Gorbachev became a folk hero around the world, a phenomenon called "Gorbymania." Even Reagan came to appreciate his efforts, and in a series of summit meetings in Geneva (1985), Reykjavik (1986), Washington, D.C. (1987), and Moscow (1988) the two leaders put forward meaningful programs, although disagreements still existed. In 1987,
the two countries reached an arms reduction treaty. In 1988, Moscow left Afghanistan. Ultraconservative supporters of the president attacked him for his about-face with regard to Moscow, but Reagan counterattacked with the same vitriol he had once reserved for the Soviets. The last piece of the puzzle was the dismantling of the Soviets' Eastern European empire. Reagan chided Gorbachev for not permitting the final democratic reforms in his "satellite countries." On June 12, 1987, Reagan stood at the Berlin Wall—the symbol of the division between East and West, erected in 1961—and shouted, "Mr. Gorbachev, tear down this wall!" Gorbachev responded by telling his communist allies that Moscow would not interfere in their internal affairs. One by one, the countries of Eastern Europe broke with their communist leadership and introduced democratically elected governments.
Canada Canadian relations with the Soviet Union after World War II mirrored those of the United States, as both countries were members of the North Atlantic Treaty Organization (NATO). In the early years the Liberal Party prime minister Pierre Trudeau pushed the United States for a more conciliatory tone. He disagreed with Reagan about linking disarmament to Soviet behavior. In 1984, however, the Conservative Brian Mulroney replaced Trudeau and adopted as harsh a stance as that of Reagan. Mulroney was even more reluctant than the American president to come to terms with Gorbachev in the last years of the decade. In 1988, Canada expelled some Soviet diplomats accused of military and industrial espionage. However, both the Soviets and the Canadians attempted to exploit differences between Ottawa's and Washington's policies. Bilateral talks between Ottawa and Moscow began in 1984 and led to an agreement in 1989 on cooperation in the Arctic and the north, with Canada providing much technical assistance.
Impact The 1980’s witnessed a remarkable reversal of history centered on Soviet relations with the West and particularly the United States. The forty-year-
U.S. president Ronald Reagan and Soviet leader Mikhail Gorbachev relax together during their first summit meeting in Geneva, Switzerland, in November, 1985. (Ronald Reagan Presidential Library)
The forty-year-old Cold War was coming to an end as Gorbachev introduced liberalization into the Soviet Union in part because of pressure from the United States. Subsequent Events
By the end of the decade the whole system was on the verge of collapse, and on December 26, 1991, after a failed attempt by communist hard-liners that summer to overthrow Gorbachev, the Soviet Union dissolved into its constituent republics.
Further Reading
Boyle, Peter G. American-Soviet Relations: From the Russian Revolution to the Fall of Communism. New York: Routledge, 1993. A scholarly monograph by a respected historian and specialist on international relations. Bibliography. Garthoff, Raymond L. The Great Transition: American-Soviet Relations and the End of the Cold War. Washington, D.C.: Brookings Institution, 1994. A meticulously documented analysis by a diplomat and author of several important books on Soviet-American relations. Gorbachev, Mikhail. Perestroika and Soviet-American Relations. Madison, Conn.: Sphinx Press, 1990. A collection of Gorbachev's speeches and interviews to Western audiences from late 1987 through 1989. Halliday, Fred. From Kabul to Managua: Soviet-American Relations in the 1980's. New York: Pantheon Books, 1989. A critical analysis of the two superpowers by a controversial socialist professor from the London School of Economics. Hill, Kenneth L. Cold War Chronology: Soviet-American Relations, 1945-1991. Washington, D.C.: Congressional Quarterly, 1993. An annotated chronology of the events from World War II until the end of the Soviet Union in 1991. Ishaq, Mohammed. The Politics of Trade Pressure: American-Soviet Relations, 1980-88. Brookfield, Vt.: Ashgate, 1999. A monograph centering on economic issues between Moscow and Washington. LaFeber, Walter. "The Two—or Three?—Phases of U.S.-Soviet Relations, 1981-1986." In Crisis and Confrontation: Ronald Reagan's Foreign Policy, edited by Morris H. Morley. Totowa, N.J.: Rowman & Littlefield, 1988. An examination of the policies and changes in Soviet-U.S. relations in the first half of the decade by one of America's most distinguished historians of the country's international affairs.
Nossal, Kim Richard. "The Politics of Circumspection: Canadian Policy Towards the USSR, 1985-1991." International Journal of Canadian Studies 9 (Spring, 1994). A scholarly examination of Canadian attitudes toward the Soviet Union after the advent of Gorbachev. Stein, Janice Gross. The Odd Couple: Analytical Perspectives on Canada's Relationship with the Soviet Union. Toronto: Centre for Russian and East European Studies, University of Toronto, 1986. A short academic analysis prepared for a conference on Canadian-Soviet relations. Trofimenko, G. A. Lessons of Peaceful Coexistence: Fifty-Five Years of Soviet-American Diplomatic Relations. Moscow: Novosti Press Agency, 1988. An analysis by a leading Russian academic specializing in American studies. Frederick B. Chary See also
Berlin Wall; Cold War; Foreign policy of Canada; Foreign policy of the United States; Goodwill Games of 1986; Intermediate-Range Nuclear Forces (INF) Treaty; Miracle on Ice; Olympic boycotts; Olympic Games of 1980; Olympic Games of 1984; Reagan’s “Evil Empire” speech; Reykjavik Summit; Smith, Samantha; Strategic Defense Initiative (SDI).
■ Space exploration Definition
Use of satellites and other spacecraft to gather scientific information about space and other planets
During the 1980’s, U.S. space vehicles frequently exceeded all expectations, gathered a wealth of new information about planets and distant astronomical phenomena, and fulfilled the goal of visiting all the major planets in the solar system. The 1980’s witnessed a shift in emphasis in the U.S. space program. The National Aeronautics and Space Administration (NASA) turned away from human space flights of exploration, such as the moon voyages that had dominated attention in the 1970’s, in favor of unpiloted probes and satellites. Crewed spaceflight, by contrast, was dedicated to delivery of cargo, maintenance of satellites, and performance of scientific experiments in low Earth orbit through the Space Transportation System. This program’s in-
inaugural flight was made by the space shuttle Columbia on April 12, 1981. However, the disastrous explosion of Challenger during launch on January 28, 1986, grounded the four-shuttle fleet for two years and resulted in even more emphasis being placed upon the use of uncrewed vehicles. Satellite Observatories
NASA put into Earth orbit four U.S.-produced astronomical observatories, two of which studied the Sun and two of which surveyed distant phenomena in the universe. Solar Maximum Mission was launched on February 14, 1980, and, after repair by a space shuttle crew in 1984, collected data on solar flares until 1989. Orbited on October 6, 1981, Solar Mesosphere Explorer monitored fluctuations in the Sun's production of ultraviolet light for five years. The Infrared Astronomical Satellite (IRAS), the first of its kind, reached orbit on January 26, 1983, in a joint project with the United Kingdom and the Netherlands. During its ten months of operations, IRAS examined more than 96 percent of the heavens, cataloging thousands of previously unknown galaxies and star-birthing gas clouds. It also discovered five new comets in the solar system. The Cosmic Background Explorer (COBE) entered orbit on November 18, 1989, and for four years mapped infrared background radiation to learn about the origin of the universe.
The Voyager spacecraft undergoes vibration testing. (NASA-JPL)
Landers and Deep Space Probes The decade opened with six of NASA's most renowned missions under way. Two landers were on Mars, taking photographs of the surface and gathering data on Martian soil and atmospheric conditions. Both had been launched in 1975; Viking 1 was operational until November 13, 1982, and Viking 2, until April 12, 1980. In addition, four probes were speeding deep into the solar system. Pioneer 10, launched in 1972, and Pioneer 11, launched in 1973, were far past their chief planetary objectives but continued transmitting to Earth valuable data about interplanetary magnetic fields, dust, and cosmic rays. The most productive and far-ranging probes, however, were Voyager 1 and Voyager 2, both launched in 1977 on a "Grand Tour" of the solar system's largest planets. From November 12 until December 15, 1980, Voyager 1 photographed the moons, rings, and atmosphere of Saturn and took magnetic and temperature readings. It revealed greater complexity in the rings than had previously been realized, found three new moonlets, and detected a thick, hydrocarbon-rich atmosphere on the moon Titan, a possible venue for life. Voyager 2 began collecting data on Saturn on June 5, 1981, and by the time it flew beyond instrument range three months later, it had taken high-resolution photographs of four moons and further studied Saturn's ring system. Voyager 2 went on to Uranus, becoming the first craft to visit that planet. The flyby there, lasting from November 4, 1985, to February 25, 1986, produced photographs of the planet's thick clouds, showed that it rotated at 98 degrees from its orbital plane, examined its nine rings, and discovered ten new, small moons. The probe's next encounter (and another first) was with Neptune, making its closest approach
to the planet on August 25, 1989. Voyager 2 discovered the Great Dark Spot, a hole in Neptune’s cloud cover; spotted six new moonlets; and scrutinized its largest moon, Triton, detecting active volcanism. Two additional highly successful probes were launched in 1989. Magellan was released into Earth orbit by a space shuttle on May 4 and then blasted off toward Venus, and on October 18, Galileo was also released from a space shuttle and sent on its way to Jupiter. Impact Once Voyager 2 passed Neptune in 1989, all of the solar system’s major planets had been visited by U.S. probes, a triumph for NASA quite in addition to the wealth of photographs and data the Voyager probes returned. These successful programs encouraged the agency to focus on deep space probes through the next two decades. The COBE infrared observatory produced data that gave crucial support to a cosmological theory that the early universe experienced a spurt of “inflationary” growth. The theory later became accepted among cosmologists. Subsequent Events
The Voyager probes soon passed Pioneer 10 to become the most distant human-made objects and throughout the 1990's continued returning data about the outermost reaches of the solar system. Magellan reached Venus on August 10, 1990, and began mapping its surface with radar. After photographing two asteroids en route, Galileo entered orbit around Jupiter on December 7, 1995, and was operational for eight years.
Further Reading
Evans, Ben, with David M. Harland. NASA's Voyager Missions: Exploring the Outer Solar System and Beyond. Chichester, England: Springer, 2004. Covers planetary discoveries of the two Voyager probes, and the historical background to them, in detail; describes the design, launch, and flights of the probes themselves. With an abundance of black-and-white and color photographs. Godwin, Robert, and Steve Whitefield, eds. Deep Space: The NASA Mission Reports. Burlington, Ont.: Apogee Books, 2005. Contains original articles, overviews, and technical descriptions of the Pioneer and Voyager missions written during the programs, with many illustrations. A resource for space exploration enthusiasts and amateur historians.
Neal, Valerie, Cathleen S. Lewis, and Frank H. Winter. Spaceflight: A Smithsonian Guide. New York: Macmillan, 1995. This pleasantly written, nontechnical general history of American spaceflight includes a chapter on planetary probes and offers many dramatic photographs and graphics. Tobias, Russell R., and David G. Fisher, eds. USA in Space. 3 vols. 3d ed. Pasadena, Calif.: Salem Press, 2006. Comprehensive collection of nontechnical articles that detail the history of American space programs and individual voyages. With black-and-white photographs and graphics. Roger Smith See also Astronomy; Challenger disaster; Garneau, Marc; Ride, Sally; Science and technology; SETI Institute; Space shuttle program.
■ Space shuttle program Definition
Program using reusable manned spacecraft
The space shuttle provided the National Aeronautics and Space Administration with the means to conduct varied missions for scientific, governmental, military, and commercial customers. President Richard M. Nixon approved the space shuttle program in April, 1972. Capable of hauling cargo and personnel to low Earth orbit and returning to a runway landing, the Space Transportation System (STS) promised launch service cost reduction and reusability. In the realm of American crewed flight, after the Apollo program, only three Skylab missions and a joint U.S.-Soviet docking (Apollo-Soyuz Test Project) were conducted during the remainder of the 1970's. Shuttle development suffered budget constraints and technical problems. The first orbiter, Columbia, had been scheduled for a March, 1978, launch, but that date slipped repeatedly. Columbia's First Missions In 1981, the nation turned its attention toward Kennedy Space Center (KSC) for the first crewed National Aeronautics and Space Administration (NASA) spaceflight since 1975. Aboard Columbia on STS-1 were veteran astronaut John W. Young and rookie astronaut Robert L. Crippen, Jr. On April 10, a computer timing
discrepancy scrubbed Columbia's launch, but on April 12 the space shuttle was successfully launched. For thirty-six orbits, Young and Crippen evaluated systems, executed thruster firings, and held public affairs events. After two days, Young gently landed Columbia on Edwards Air Force Base's dry lake bed. Columbia launched again on November 12. STS-2 included testing of the Remote Manipulator System (RMS), a mechanical arm designed to manipulate cargo around Columbia's payload bay. Data were collected with an imaging radar system that filled up the bay. Columbia returned to Edwards Air Force Base after two days because of a fuel cell problem. Columbia launched on March 22, 1982, for the seven-day STS-3 test flight, which included RMS arm operations and demonstrations of the shuttle middeck's utility to support research. The RMS arm grappled the Plasma Diagnostics Package, an instrument that recorded magnetohydrodynamic environments around Columbia. Unacceptable weather delayed Columbia's reentry. Columbia was diverted to Northrup Strip at White Sands Missile Range, New Mexico.
The space shuttle Columbia lifts off on its first mission on April 12, 1981. (NASA CORE/Lorain County JVS)
Columbia's final test flight launched on June 27. Both solid rocket boosters (SRBs) were lost when parachutes failed to deploy properly. That had no effect on the shuttle's reaching orbit. Aboard was the classified military payload Cirris. STS-4 demonstrated that NASA could fly shuttles under Department of Defense (DOD) secrecy regulations. The mission ended after seven days, providing a patriotic setting for a Fourth of July celebration. President Ronald Reagan and shuttles Enterprise and Challenger were in place by Edwards Air Force Base's concrete runway to greet the returning astronauts. Reagan declared the STS fleet operational and signaled NASA's 747 carrier aircraft to taxi down the runway, beginning Challenger's delivery flight to KSC. Columbia's STS-5 mission launched on November 11 with the first four-person crew and two communications satellites. Those satellites were housed within protective enclosures until deployment and equipped with solid-fueled Payload Assist Modules (PAMs) to boost them to geosynchronous positions. An attempted space walk (or extravehicular activity, EVA) was thwarted by space suit problems. Columbia landed at Edwards Air Force Base after five days. Challenger and Spacelab 1 Challenger first launched on April 4, 1983. Aboard was a four-person crew and a Tracking and Data-Relay Satellite (TDRS), the first component in NASA's network of geostationary communications satellites designed to establish continuous communications during shuttle missions. TDRS needed the more powerful Inertial Upper Stage (IUS) solid-fueled booster for its boost to geostationary altitude. STS-6's TDRS suffered an IUS motor failure. Over the course of many weeks, thrusters eventually nudged TDRS-A into operational position. Astronauts tested EVA translation methods within Challenger's payload bay. The shuttle landed at Edwards Air Force Base after five days. Challenger's STS-7 mission began on June 18. Among this initial five-person crew was the first American woman in space, Dr. Sally Ride. Two commercial satellites were deployed.
Space Shuttle Missions in the 1980's
Mission Name | Dates | Astronauts
STS-1 Columbia | April 12-14, 1981 | John W. Young and Robert L. Crippen, Jr.
STS-2 Columbia | November 12-14, 1981 | Joseph H. Engle and Richard H. Truly
STS-3 Columbia | March 22-30, 1982 | Jack R. Lousma and C. Gordon Fullerton
STS-4 Columbia | June 27-July 4, 1982 | Thomas K. Mattingly II and Henry W. Hartsfield, Jr.
STS-5 Columbia | November 11-16, 1982 | Vance D. Brand, Robert F. Overmyer, Joseph P. Allen, and William B. Lenoir
STS-6 Challenger | April 4-9, 1983 | Paul J. Weitz, Karol J. Bobko, Donald H. Peterson, and F. Story Musgrave
STS-7 Challenger | June 18-24, 1983 | Robert L. Crippen, Jr., Frederick H. Hauck, John M. Fabian, Sally Ride, and Norman E. Thagard
STS-8 Challenger | August 30-September 5, 1983 | Richard H. Truly, Daniel C. Brandenstein, Dale A. Gardner, Guion S. Bluford, Jr., and William E. Thornton
STS-9 Columbia | November 28-December 8, 1983 | John W. Young, Brewster H. Shaw, Owen K. Garriott, Robert A. Parker, Byron K. Lichtenberg, and Ulf Merbold
STS 41-B Challenger | February 3-11, 1984 | Vance D. Brand, Robert L. Gibson, Bruce McCandless II, Ronald E. McNair, and Robert L. Stewart
STS 41-C Challenger | April 6-13, 1984 | Robert L. Crippen, Jr., Francis R. Scobee, George D. Nelson, James D. A. van Hoften, and Terry J. Hart
STS 41-D Discovery | August 30-September 5, 1984 | Henry W. Hartsfield, Jr., Michael L. Coats, Judith A. Resnik, Steven A. Hawley, Richard M. Mullane, and Charles D. Walker
STS 41-G Challenger | October 5-13, 1984 | Robert L. Crippen, Jr., Jon A. McBride, Kathryn D. Sullivan, Sally Ride, David C. Leestma, Marc Garneau, and Paul D. Scully-Power
STS 51-A Discovery | November 8-16, 1984 | Frederick H. Hauck, David M. Walker, Anna L. Fisher, Dale A. Gardner, and Joseph P. Allen
STS 51-C Discovery | January 24-27, 1985 | Thomas K. Mattingly II, Loren J. Shriver, Ellison S. Onizuka, James F. Buchli, and Gary E. Payton
STS 51-D Discovery | April 12-19, 1985 | Karol J. Bobko, Donald E. Williams, M. Rhea Seddon, Jeffrey A. Hoffman, S. David Griggs, Charles D. Walker, and Sen. E. Jake Garn
STS 51-B Challenger | April 29-May 6, 1985 | Robert F. Overmyer, Frederick D. Gregory, Don L. Lind, Norman E. Thagard, William E. Thornton, Lodewijk van den Berg, and Taylor G. Wang
STS 51-G Discovery | June 17-24, 1985 | Daniel C. Brandenstein, John O. Creighton, Shannon W. Lucid, John M. Fabian, Steven R. Nagel, Patrick Baudry, and Sultan Salman Al-Saud
STS 51-F Challenger | July 29-August 6, 1985 | C. Gordon Fullerton, Roy D. Bridges, Jr., F. Story Musgrave, Anthony W. England, Karl G. Henize, Loren W. Acton, and John-David F. Bartoe
STS 51-I Discovery | August 27-September 3, 1985 | Joseph H. Engle, Richard O. Covey, James D. A. van Hoften, John M. Lounge, and William F. Fisher
STS 51-J Atlantis | October 3-7, 1985 | Karol J. Bobko, Ronald J. Grabe, David C. Hilmers, Robert L. Stewart, and William A. Pailes
STS 61-A Challenger | October 30-November 6, 1985 | Henry W. Hartsfield, Jr., Steven R. Nagel, James F. Buchli, Guion S. Bluford, Jr., Bonnie J. Dunbar, Reinhard Furrer, Ernst Messerschmid, and Wubbo J. Ockels
STS 61-B Atlantis | November 26-December 3, 1985 | Brewster H. Shaw, Jr., Bryan D. O'Connor, Mary L. Cleave, Sherwood C. Spring, Jerry L. Ross, Rodolfo Neri Vela, and Charles D. Walker
STS 61-C Columbia | January 12-18, 1986 | Robert L. Gibson, Charles F. Bolden, Jr., Franklin R. Chang-Diaz, Steven A. Hawley, George D. Nelson, Robert J. Cenker, and Congressman Bill Nelson
STS 51-L Challenger | January 28, 1986 | Francis R. Scobee, Michael J. Smith, Judith A. Resnik, Ellison S. Onizuka, Ronald E. McNair, Gregory B. Jarvis, and Christa McAuliffe
STS-26 Discovery | September 29-October 3, 1988 | Frederick H. Hauck, Richard O. Covey, John M. Lounge, George D. Nelson, and David C. Hilmers
STS-27 Atlantis | December 2-6, 1988 | Robert L. Gibson, Guy S. Gardner, Richard M. Mullane, Jerry L. Ross, and William M. Shepherd
STS-29 Discovery | March 13-18, 1989 | Michael L. Coats, John E. Blaha, James P. Bagian, James F. Buchli, and Robert C. Springer
STS-30 Atlantis | May 4-8, 1989 | David M. Walker, Ronald J. Grabe, Norman E. Thagard, Mary L. Cleave, and Mark C. Lee
STS-28 Columbia | August 8-13, 1989 | Brewster H. Shaw, Richard N. Richards, James C. Adamson, David C. Leestma, and Mark N. Brown
STS-34 Atlantis | October 18-23, 1989 | Donald E. Williams, Michael J. McCulley, Franklin R. Chang-Diaz, Shannon W. Lucid, and Ellen S. Baker
STS-33 Discovery | November 22-27, 1989 | Frederick D. Gregory, John E. Blaha, F. Story Musgrave, Manley L. Carter, Jr., and Kathryn C. Thornton
Source: National Aeronautics and Space Administration.
Although scheduled to attempt the first landing at KSC, Challenger was forced to divert to Edwards Air Force Base after six days. STS-8 lifted off on August 30, performing the shuttle's first night launch. The crew included the first African American astronaut, Guion Stewart Bluford, Jr. A Payload Test Article was used to assess the RMS arm's ability to handle massive objects. An Indian communications satellite was dispatched to geosynchronous orbit. After six days, Challenger landed at night at Edwards Air Force Base. On November 28, Spacelab 1 launched aboard Columbia. The European Space Agency (ESA) built and provided Spacelab to NASA in exchange for
sending astronauts into space. The pressurized module housed inside Columbia's payload bay allowed scientists to perform research programs. Ten days after launch, with seventy-two separate science projects concluded, Columbia landed at Edwards Air Force Base. STS-9 marked John Young's last command. 1984 Missions
Challenger’s STS 41-B flight began on February 3, 1984, and included two satellite deployments and tests of a Manned Maneuvering Unit (MMU) that could independently take astronauts several hundred feet away from the shuttle. During two EVAs, the MMU was put through its paces. Astro-
Astronaut Bruce McCandless II flew it nearly one hundred meters from the space shuttle. Each satellite's PAM failed, stranding these satellites in useless orbits. Engineers began devising rescue plans for these expensive communications satellites. Challenger achieved the first landing at KSC on February 11. STS 41-C was the first satellite repair mission. The Solar Max observatory had malfunctioned, but its problems were understood. Challenger launched on April 6, and, after rendezvousing, astronaut George D. Nelson flew the MMU to Solar Max but could not dock. Commander Crippen then maneuvered Challenger within thirty-five feet of the observatory, and the RMS arm grappled it. Solar Max was secured to a work platform where astronauts repaired it. The observatory was released to continue its research. Discovery's first flight (STS 41-D) occurred on August 30. The space shuttle carried three satellites and a large folded solar array designed for a space station. The latter contained no functional solar cells, as this was an array deployment dynamics test. Discovery landed at Edwards Air Force Base after six days. Challenger launched on October 5. STS 41-G included deployment of an Earth resources satellite, the first time that seven people were simultaneously launched into space, and the first space walk performed by an American woman, Kathryn D. Sullivan. Astronauts Sullivan and David Leestma demonstrated satellite refueling methods. Challenger landed at KSC after eight days. Discovery launched on November 8. STS 51-A deployed two satellites like those on STS 41-B and captured the wayward STS 41-B satellites, Westar 6 and Palapa B-2. Astronauts Dale Gardner and Joseph P. Allen used the MMU to dock and capture the satellites, strapping them in Discovery's payload bay for return to Earth at KSC after eight days. 1985 Missions
The STS 51-C mission was the first fully classified DOD shuttle flight. Inside Discovery on January 24, 1985, was a classified electronic intelligence (ELINT) payload. After deployment, it suffered an IUS malfunction, but the ELINT satellite did collect reconnaissance data. After three days, Discovery returned to KSC. Senator Jake Garn of Utah was an observer on STS 51-D, which launched on April 12. Two satellites were deployed from Discovery's payload bay. However, the Syncom IV satellite failed to activate after deployment. Astronauts attached a makeshift device to the RMS arm's end effector during an unscheduled space walk.
They dragged this device against a satellite lever designed to start a timer for solid-fueled rocket ignition; the attempt failed. Discovery landed at KSC after six days. Challenger's STS 51-B/Spacelab 3 mission began with an April 29 liftoff and lasted seven days. Scientists inside Spacelab performed fifteen primary experiments in materials science, life sciences, fluid physics, atmospheric physics, and astronomy. On June 17, Discovery flew its fifth mission, STS 51-G. NASA's crew hosted a Saudi prince, Sultan Salman Al-Saud, as Saudi Arabia had partially paid for the mission. During the mission, astronauts deployed three satellites, including an Arabsat communications satellite. Challenger's STS 51-F/Spacelab 2 mission experienced an ascent mishap on July 29. One main engine shut down prematurely. The space shuttle executed an abort-to-orbit profile, entering a lower altitude than preferred. Challenger's payload bay carried a sophisticated pointing system for a suite of four solar astrophysics telescopes. The shuttle's thrusters raised the orbit slightly, and research was carried out despite nagging pointing system problems. Discovery flew STS 51-I, launching on August 27. Three satellites were deployed during the seven-day mission. A spacewalking repair of the Syncom IV satellite succeeded, and the satellite's solid rocket motor fired to boost it into an operational position. Discovery landed at Edwards Air Force Base. STS 51-J saw the first flight of Atlantis, launched on October 3. During the four-day classified DOD mission, a military communications satellite was deployed. Atlantis returned to Earth at Edwards Air Force Base. The seven-day-long STS 61-A/Spacelab D-1 research mission was the first to involve control from an international (German) control center. Three ESA payload specialists flew among the first eight-person crew. Challenger returned to Earth at Edwards Air Force Base. Atlantis's STS 61-B mission launched on November 26. Three satellites were deployed, and astronauts tested orbital construction techniques. Atlantis landed at Edwards Air Force Base after seven days. Robust 1985 space shuttle operations stretched program assets thin. Pressure existed to increase flight rates to make the shuttle financially self-sufficient, but problems surfaced during the early years of operations. One recurring, serious problem involved SRB joint O-ring erosion. Joint design
required modification, which would have halted flight operations. 1986 Missions and the Challenger Disaster
Columbia’s six-day STS 61-C mission launched on January 12 with Congressman Bill Nelson of Florida aboard. This flight included investigations of Halley’s comet. Columbia landed at Edwards Air Force Base. STS 51-L was to deploy TDRS-B. President Reagan had directed NASA to search for a Teacher in Space participant, and Christa McAuliffe of New Hampshire was selected to join six other astronauts. Challenger experienced several weather delays and technical difficulties before its launch on January 28. Media and educational institutions focused on the historic Teacher in Space’s journey. An SRB joint O-ring failed at ignition, leading to a catastrophic vehicle breakup seventy-three seconds after liftoff. All crew members onboard were killed. A presidential commission issued a final report critical of how NASA left problems unaddressed while schedule pressures kept shuttles flying. A safer SRB joint design was developed and tested into the summer of 1988. Shuttle payload manifests had been severely disrupted, but NASA focused on safety; sadly, that lesson had not been learned until seven astronauts lost their lives and a $2 billion orbiter was destroyed.
Return to Flight
On September 29, 1988, NASA stood poised for Discovery's STS-26 launch. Discovery carried another TDRS. Five astronauts deployed TDRS-C, paid tribute to Challenger's crew, and landed at Edwards Air Force Base after four days. Atlantis's classified STS-27 mission launched on December 2. Discovery lifted off on March 13, 1989, to begin the five-day STS-29 mission. After deploying TDRS-D and completing secondary experiments, the space shuttle landed at Edwards Air Force Base. Subsequent missions included STS-30, which carried the Magellan probe, sent to map Venus, and STS-34, which dispatched the Galileo probe to Jupiter. Classified DOD missions included Columbia's STS-28 and Discovery's STS-33.
Impact Space shuttle operations commenced in 1981 and in four years ramped up to nearly a flight per month. Fleet operations and the shuttle workforce experienced tremendous pressure. Vehicle problems went unaddressed as schedule pressures mounted, resulting in the Challenger launch accident of January, 1986.
After redesigns, the shuttle fleet resumed flight operations in September, 1988. Flight rates never again matched that of 1985, but the shuttle’s function shifted from a commercial satellite delivery system to a research platform for Spacelab missions, a transport vehicle for Phase One astronauts going to Russia’s Mir space station, and a workhorse for International Space Station (ISS) construction. Subsequent Events
Challenger, Discovery, Atlantis, and Endeavour followed Columbia in turn. As a result of thermal protection system damage encountered during launch, Columbia and another seven-person crew were lost in February, 2003. After careful review, President George W. Bush directed NASA to complete the ISS by 2010 and then retire the shuttle fleet. Shuttles would be replaced by an Apollo-like Crew Exploration Vehicle, and astronauts would again leave low Earth orbit, where for thirty years the shuttle had been constrained to operate.
Further Reading
Harland, David M. The Story of the Space Shuttle. New York: Springer-Praxis, 2004. Covers shuttle origins through the post-Columbia accident period. Jenkins, Dennis R. Space Shuttle: The History of the National Space Transportation System—The First One Hundred Missions. New York: D. R. Jenkins, 2001. A technical text detailing flight operations. Reichhardt, Tony. Space Shuttle: The First Twenty Years—The Astronauts' Experiences in Their Own Words. London: Dorling Kindersley, 2002. As this title suggests, the text provides numerous first-person accounts of space shuttle experiences. David G. Fisher See also Challenger disaster; Garneau, Marc; Reagan, Ronald; Ride, Sally; Science and technology; Space exploration.
■ Special effects Definition
Images and sounds in motion pictures that are created or manipulated through means other than filming or recording the thing being represented
During the 1980’s, both the film and special effects industries experienced a remarkable revival. The art and technology of special effects became increasingly sophisticated and
spectacular, resulting in large budgets and huge box-office blockbusters. Beginning in the late 1890's, the father of special effects, Georges Méliès, used effects such as stop-action and double exposure, perspective tricks, and mirrors to create illusions in film. As filmmaking evolved during the 1900's, the art and science of special effects also grew to include techniques such as cel animation, prosthetic makeup, model making, claymation, matte painting, and finally computer technology. Facing financial difficulties in the 1960's, major film studios eliminated special effects departments to reduce overhead costs. However, for Stanley Kubrick's 1968 movie 2001: A Space Odyssey, artist Douglas Trumbull invented the "slit scan" technique to create the famous "stargate" light sequence. The film's special effects helped make it a phenomenal box-office success. Partly as a result, in the 1970's a new generation of film directors began to revive special effects, setting the tone for the 1980's. Steven Spielberg's Jaws (1975) became the first blockbuster of the post-studio era, making over $100 million. His dazzling Close Encounters of the Third Kind (1977) employed some of the effects specialists who had worked on 2001. George Lucas's Star Wars (1977) used the first computer-linked camera control system. The success of these movies helped guide the studios to change their business models. Over time, they ceased to focus on making a steady series of medium-budget, relatively successful films. Instead, they began to focus more of their energies on creating fewer but more lucrative films, blockbusters with massive budgets that sought to realize even more massive profits. Most such blockbusters were effects-driven, relying on spectacle rather than narrative as their primary draw. During the 1980's, filmmakers met the public's growing expectation for impressive special effects and large-budget action and science-fiction films. In 1982, Disney produced Tron, an adventure film set largely in virtual reality that featured the first extensive use of computer-generated imaging (CGI). Other milestones in special effects history included Blade Runner (1982), which envisioned Los Angeles in 2019; The Last Starfighter (1984), the first film to use CGI instead of traditional, physical models of spaceships; and Aliens (1986), which used in-camera visuals.
In the 1980’s, Lucas’s Industrial Light and Magic (ILM) special effects studio worked on over fifty major films, became the world’s most successful special effects company, and produced landmark effects, especially CGI. The company added impressive CGI effects in the climactic scene of Spielberg’s Raiders of the Lost Ark (1981). In Star Trek II: The Wrath of Khan (1982) the “Genesis sequence” was the first completely computer-generated sequence in a major motion picture and the first use of a fractal-generated landscape. In two further collaborations with Spielberg, ILM produced the effects for E.T.: The ExtraTerrestrial (1982)—the most successful film of the decade—and Young Sherlock Holmes (1985), for which the company created the first fully computer generated character, the “stained-glass man.” Also in 1985, ILM created alien life-forms and a spaceship for Cocoon. Labyrinth (1986) opened with a digital owl, the first CGI animal. For the MetroGoldwyn-Mayer and Lucasfilm coproduction Willow (1988), ILM performed the first digital morphing (the seamless evolution of one image or character into another). A Disney production, Who Framed Roger Rabbit (1988), combined hand-painted imagery with computer animation. Back to the Future, Part II (1989) used a computer-controlled camera to create stunning split-screen photography. In The Abyss (1989), ILM made the first computer-generated 3dimensional (3-D) character, the alien pseudopod, which swam in the first digitally animated water. Raiders of the Lost Ark, Cocoon, Who Framed Roger Rabbit, and The Abyss won Academy Awards for Best Visual Effects. In 1986, Lucas sold ILM’s computer graphics department to Apple Computer founder Steve Jobs, and Pixar was born. Pixar produced Luxo Jr. (1986), a two-minute short film about a pair of desk lamps. This short was the first fully computer-animated film. Tin Toy (1988) was the first computer animation to win an Academy Award—for Best Animated Short Film—and was the inspiration for Toy Story (1995). Impact Special effects in the 1980’s had profound cultural, economic, and commercial impacts. The film and special effects industries were both revitalized, as audiences came to expect spectacular special effects and high-budget action films. During this decade, the average cost to produce a film rose from $10 million to $23 million, but movie theater reve-
revenues were higher than ever. In 1989, these revenues totaled $5.03 billion. The industry also reaped profits from movie-related merchandise, such as books, clothing, toys, and computer games. There was also a renaissance in the art and technology of special effects. Research and technological development intensified to meet increasing audience expectations. The leading studio, Industrial Light and Magic, continued to provide groundbreaking effects for films, music videos, television commercials, and theme-park attractions. Further Reading
McCarthy, Robert. Secrets of Hollywood Special Effects. Boston: Focal Press, 1992. Comprehensive, in-depth source on special effects techniques, including case studies and over two hundred illustrations. Discussions of wire flying and levitation, rain and water, snow, steam, smoke, fire, and chemical effects. Pinteau, Pascal. Special Effects: An Oral History—Interviews with Thirty-Eight Masters Spanning One Hundred Years. New York: Harry N. Abrams, 2004. Fascinating revelations about special effects techniques used in film, television, and theme parks. Includes over one thousand photographs and illustrations. Rickitt, Richard. Special Effects: The History and Technique. New York: Billboard Books, 2000. A beautifully illustrated, comprehensive history, including interviews with hundreds of special effects masters, a helpful glossary, and a section on special effects landmarks. Vaz, Mark Cotta, and Patricia Duignan. Industrial Light and Magic: Into the Digital Realm. New York: Ballantine Books, 1996. Behind-the-scenes account of ILM's accomplishments from 1986 through the mid-1990's. Includes over six hundred illustrations and a foreword by Steven Spielberg. Alice Myers See also
Academy Awards; Action films; Aliens; Back to the Future; Blade Runner; Empire Strikes Back, The; E.T.: The Extra-Terrestrial; Film in the United States; Ford, Harrison; Raiders of the Lost Ark; Sciencefiction films; Spielberg, Steven; Tron; Who Framed Roger Rabbit.
■ Spielberg, Steven Identification
American film director and producer Born December 18, 1946; Cincinnati, Ohio In many ways, the 1980's was cinematically Spielberg's decade. Not only did all three of his Indiana Jones movies appear during the decade, but also E.T.: The Extra-Terrestrial became the decade's single most successful film, and The Color Purple relaunched Spielberg's career as a serious, albeit at times controversial, filmmaker. Steven Spielberg's Jaws (1975) began a transformation in Hollywood, as the blockbuster film changed the studios' expectations and strategies in producing and marketing motion pictures. It immediately defined Spielberg as a major force within the industry, although he ended the 1970's with one of his rare failures, 1941 (1979). In 1981, Spielberg's Raiders of the Lost Ark, the first film in the Indiana Jones series, introduced the daring, resourceful hero. Indiana Jones and the Temple of Doom (1984) and Indiana Jones and the Last Crusade (1989) also pitted the heroic individual American against the forces of darkness, a conflict in which the United States was triumphant. These nostalgic films, set in the 1930's and deliberately designed to be stylistically reminiscent of old Saturday afternoon serials, capitalized on the 1980's nostalgia for earlier, idealized decades. They resonated with similar themes of nostalgia and of the role of the United States in the battle between good and evil that were central to Ronald Reagan's presidency. By most filmmakers' standards, all three Indiana Jones films were popular and financially successful, but the second failed to live up to the extremely high expectations that had attached to every Spielberg project by the time it was released. Spielberg's reputation was furthered by the huge impact of his science-fiction film E.T.: The Extra-Terrestrial (1982), which gave American culture the phrase "E.T. phone home" and expanded the market for a peanut-butter candy. In the movie, Spielberg explored the impact of an alien on conventional, placid middle-class suburban life, a theme he had touched on previously in an earlier science-fiction movie, Close Encounters of the Third Kind (1977). Spielberg's critical reputation increased considerably with the release of The Color Purple (1985), set in the segregated South and based on the celebrated
novel by the African American author Alice Walker. The film traced the devastating effects of segregation and employed an almost entirely African American cast. He followed with Empire of the Sun (1987), based on the novel by J. G. Ballard and set in the late 1930's in Shanghai, during the Japanese military takeover. Empire of the Sun followed the adventures of a young boy in an internment camp during World War II and the emotional effects wrought on him by his imprisonment. Both films garnered critical attention that Spielberg's films had not previously received.
Steven Spielberg in 1985. (AP/Wide World Photos)
Impact
Steven Spielberg helped reenergize American films of the 1980’s with his technical skill, his reverence for the traditions of Hollywood, and his innovative reworking of standard themes of American cinema. Further Reading
Brode, Douglas. The Films of Steven Spielberg. New York: Carol, 1995. McBride, Joseph. Steven Spielberg: A Biography. New York: Simon & Schuster, 1997. Perry, George. Steven Spielberg. New York: Thunder’s Mouth Press, 1998. Silet, Charles L. P., ed. The Films of Steven Spielberg: Critical Essays. Lanham, Md.: Scarecrow Press, 2002. Charles L. P. Silet See also
Academy Awards; Action films; Color Purple, The; Epic films; E.T.: The Extra-Terrestrial; Ford, Harrison; Jewish Americans; Raiders of the Lost Ark; Science-fiction films; Twilight Zone accident.
■ Sports Definition
Athletic contests, both team and individual
In the final full decade before the cable and Internet revolutions, sports fans relied on “old” media (newspapers, radio, and television) to learn about the highlights from the sports world, and in the 1980’s there were many.
In the 1980’s, going to bed before a favorite team’s game had ended often meant that discovering the final score was difficult. There was no Internet accessible from home computers. One likely could not turn on the television and check ESPN; cable at this point was only just fully integrating itself into most American homes. The 1980’s were a time in which information about sports stars, games, and highlights had to be garnered from a printed newspaper or from the radio or a local television station. The Internet, the iPod, and other sources that allow for instantaneous tracking and relaying of information were not yet available. Today, it is taken for granted that a game, regardless of how unimportant it is, can be found on cable television. It also is widely known that anytime an athlete or team approaches an important milestone that ESPN or some cable entity will be airing it live. Furthermore, information about teams, players, and events can now easily be gathered “in real time” from various news agencies. In short, immediate information is available. In such an environment, athletes can become instant stars and events can become instant classics. Whether the athlete or the game actually deserves such a label often is doubtful. While many people often did not learn about stars and classic moments as quickly or have the certainty of seeing them live as they do today, many events, people, and moments captivated the nation during the 1980’s.
The Olympics and the Miracle on Ice The brightest moment of the 1980's might very well have taken place just two months into the decade. In 1980, no professional athlete (as defined by the West) could take part in the Olympics. The absence of paid athletes allowed for the larger-than-life story that was the triumph of the U.S. men's hockey team, which won the gold medal in Lake Placid. Although the U.S. team had no professionals, many of its members would go on to play in the National Hockey League (NHL). Along the way, the Americans defeated the vaunted Soviets, a team considered the best in the world and one that, many in the West believed, included professionals because of the subsidies provided to its players by their government. The Soviet Union insisted that it was never in violation of Olympic rules. Just six years later, the distinction between "amateur" and "professional" was a moot point; the terms had been deleted from the Olympic Charter, the document that outlines the rules and regulations of the Olympics. Among the first sports to benefit from the inclusion of professionals was tennis, which after a sixty-four-year hiatus returned to the Olympic program in 1988. In Seoul, one of the best tennis players in the world, Steffi Graf, won a gold medal. Politics also overshadowed the Olympic movement during the 1980's. President Jimmy Carter led a boycott of the 1980 Moscow Olympics, in protest of the Soviet Union's military invasion of Afghanistan in December, 1979. Though widely criticized both at home and especially abroad, Carter's boycott call was endorsed by a number of Western nations. Soviet and East German athletes, in the absence of many of their Western colleagues, dominated the Games, winning a combined 321 medals. The image of the Olympics as a place for athletes to gather in friendly competition free of international politics had been shattered. Four years later, the Soviets led a boycott of their own and refused to send their athletes to Los Angeles, the site of the 1984 Summer Games. Thirteen Eastern Bloc nations joined the Soviets, who said the boycott was necessary because the United States could not guarantee the safety of Soviet athletes while they were on American soil. American athletes, especially swimmers such as Mary T. Meagher, and track-and-field stars, most notably Carl Lewis, took advantage of the missing Eastern Bloc athletes to win numerous Olympic medals. The entire country seemed to revel in Olympic fever.
Although the Americans, the Soviets, and their allies were reunited at the 1988 Summer Olympics in Seoul, they actually had the chance to compete against each other two years earlier. In the mid-1980's, Ted Turner, founder of the Cable News Network (CNN), began what he called the Goodwill Games, which brought together athletes from the Eastern and Western Blocs in an athletics program that included multiple Olympic sports. The first Goodwill Games took place in 1986 and were hosted by Moscow. The games were broadcast in the United States on one of Turner's own cable networks. The location allowed American audiences a glance inside the Soviet Union, a somewhat rare opportunity during the Cold War. The Goodwill Games continued into the early twenty-first century, but their express purpose—to allow athletic competition to trump international politics—was never more important than in 1986. Football
Football In the final seconds of the 1982 National Football Conference (NFC) championship game at Candlestick Park in San Francisco, Dwight Clark of the San Francisco 49ers ran to the back of the end zone and jumped as high as he could. When he came down, a football that had been thrown by quarterback Joe Montana was in his hands. “The Catch,” as it has become known, not only allowed the 49ers to defeat the Dallas Cowboys and advance to the franchise’s first Super Bowl but also ensured that the 49ers would become the NFC’s glamour team for the remainder of the decade and beyond. Though the Cowboys were considered “America’s Team,” the 49ers would go on to win 159 regular-season games and five Super Bowls over the next fourteen years. If the 49ers were the NFC’s premier team during the 1980’s, the closest equivalent in the American Football Conference (AFC) was the Denver Broncos. Although the Oakland/Los Angeles Raiders had a strong reputation and Super Bowl victories in 1981 and 1984, during the decade the Broncos won more regular-season games (ninety-three to eighty-nine) and appeared in more Super Bowls (three to two) than the Raiders. The Broncos lost those three championship games by a combined score of 136-40, highlighting an era in which NFC teams, including the 49ers, were simply better than their AFC counterparts. The Broncos (who went on to win two Super Bowls in the 1990’s) built their team around
one man—Hall of Fame quarterback John Elway, who in 1983 was drafted by the then-Baltimore Colts, a team for which Elway refused to play. He was soon dealt to Denver. In January, 1987, the legend of Elway fully developed. Though his team trailed 20-13 in the final six minutes of the AFC championship game in Cleveland, Elway led his team on an improbable fifteen-play, 98-yard touchdown drive that tied the score and forced overtime. In the extra period, he again drove the Broncos down the field, using nine plays to cover sixty yards. A short field goal followed, and the Broncos stunned the Cleveland Browns with a 23-20 victory. Fans in three National Football League (NFL) cities were stunned—for a different reason—during the 1980’s. They watched helplessly as their teams moved to new cities: the Oakland Raiders to Los Angeles in 1982, the Baltimore Colts to Indianapolis in 1984, and the St. Louis Cardinals to Phoenix, Arizona, in 1988. More franchises—including the Raiders, who returned to Oakland—would relocate in the 1990’s.
Baseball In 1981, the Major League Baseball (MLB) season was damaged by a strike that lasted about two months and canceled more than seven hundred games. Many fans blamed the owners, although it was the players who decided to walk out. The strike also overshadowed the remarkable rookie season of Los Angeles Dodgers pitcher Fernando Valenzuela. He began his first year in the majors with an 8-0 record and an earned run average (ERA) of 0.50. He finished the season with a 13-7 record, a 2.48 ERA, and seven shutouts, winning both the Rookie of the Year and Cy Young awards. The Dodgers also won their first world championship since 1965, defeating the New York Yankees in six games. An even more improbable Dodgers team won the World Series in 1988, when it defeated the Oakland Athletics. Los Angeles pitcher Orel Hershiser completed an amazing season, which included setting a major-league record for consecutive shutout innings (fifty-nine), by being named the League Championship Series and World Series Most Valuable Player (MVP). Fittingly, he threw a complete game in the decisive game 5, which the Dodgers won 5-2. In 1989, Nolan Ryan, baseball’s all-time leader in career strikeouts with 5,714, recorded 301 strikeouts, marking the sixth and final time he reached the 300 plateau in a single year. Roger Clemens, who currently ranks second in all-time strikeouts, began his remarkable career in the 1980’s. He went 24-4 in 1986, when the Boston Red Sox won their first division title in more than a decade. Two years later, he struck out 291 batters; only once has Clemens recorded more strikeouts in a single season. In fact, the top eight pitchers in all-time strikeouts played during some or all of the 1980’s. One of baseball’s most historic records fell on September 11, 1985, when Cincinnati’s Pete Rose hit a single to left field. It was career hit 4,192 for Rose, eclipsing Ty Cobb’s record. Rose ended his playing career one year later, finishing with 4,256 hits, but he remained the Reds’ manager through August, 1989. His Hall of Fame credentials seemed secure: He had won three World Series championships and an MVP award, and he was a seventeen-time All-Star. However, Rose’s legacy was soon clouded by controversy: Did he bet on baseball games, including some involving the team he was managing? A damaging report, written by investigator John Dowd, left little doubt in the minds of the sport’s executives that Rose was gambling (including on his Reds games) and therefore had to be banned from the game. Rose maintained that he did nothing wrong, but in late 1989 he agreed to be placed on baseball’s ineligible list and resigned as manager.
Basketball Basketball during the 1980’s was dominated by two men, although by the end of the decade a third man had come along to earn his spot among National Basketball Association (NBA) superstars. Earvin “Magic” Johnson proved that he was going to be something very special during his rookie year of 1979-1980, when he averaged more than eighteen points and seven rebounds per game for the Los Angeles Lakers. In the critical sixth game of the NBA finals, Johnson moved from guard to center, starting for the injured Kareem Abdul-Jabbar, and scored forty-two points and grabbed fifteen rebounds, as the Lakers held off the Philadelphia 76ers and won their first NBA championship in almost a decade. Johnson also won the first of his three NBA finals MVP awards that year. Larry Bird entered the league in the same year Johnson did. Bird won the league’s Rookie of the Year award, averaging more than twenty-one points
and ten rebounds per game for the Boston Celtics. Bird’s impact was felt much more in the standings: During the 1978-1979 season, the Celtics won twenty-nine games, but with Bird they won sixty-one games the following year. The team also advanced to the NBA’s Eastern Conference finals, losing to Philadelphia. Johnson’s Lakers and Bird’s Celtics took home many championships during the 1980’s. Los Angeles won five NBA championships and played for three others. Boston won three titles and played for two others. The most intense period of this rivalry took place between the 1983-1984 and 1986-1987 seasons. Over that four-year stretch, the teams met three times in the NBA finals; the Lakers won twice. Toward the end of the decade, basketball’s man of the 1990’s took center stage. Michael Jordan joined the NBA in 1984, a few months after helping the U.S. men’s basketball team win an Olympic gold medal. He was an instant superstar, averaging more than 28 points per game for the Chicago Bulls. Jordan won multiple scoring titles over the remainder of the decade, but he and his teammates could not get past Boston in the middle of the decade, nor Detroit at the end of it, to advance to the NBA finals. Jordan and the Bulls would dominate the league in the 1990’s.
Hockey The Lakers and Celtics were basketball’s dynasties in the 1980’s. In hockey, the dynasty tag was owned by the New York Islanders at the beginning of the decade and by the Edmonton Oilers at the end of it. The Islanders won the Stanley Cup from 1980 through 1983. The Oilers lost the 1983 finals, but they quickly rebounded and were hockey’s best team in 1984 (defeating the Islanders and denying New York a fifth consecutive title), 1985, 1987, 1988, and 1990. The Oilers had all the makings of celebrity status: They were young, cocky, and brash, and they had perhaps the greatest player in the history of the sport—Wayne Gretzky. During the 1980’s, he scored more than two hundred points in a single season four times, scored more than fifty goals in a season nine times, and won the league MVP award eight straight times (nine times overall). The Oilers began to be dismantled in 1988, when owner Peter Pocklington agreed to trade Gretzky to the Los Angeles Kings. Although Gretzky never won another championship in his career, his superstar
status ensured that hockey would survive on the West Coast. Sold-out crowds became the norm in Los Angeles. Tennis and Golf Women’s tennis was the purview of Martina Navratilova during the 1980’s. She won fifteen of her eighteen grand slam singles titles, including the most prestigious, Wimbledon, on six different occasions. She also won the singles title at the U.S. Open four times, in 1983, 1984, 1986, and 1987. Two men repeatedly etched their names into the golf record books in the 1980’s. Tom Watson won three of his five Open Championships, his only U.S. Open title, and the second of his two Masters championships during the decade. He also was named the PGA Golfer of the Year three times. The “Golden Bear,” Jack Nicklaus, also proved he still had some bite left in him. The dominant golfer of the 1970’s won three more majors in the 1980’s, including the 1986 Masters. He rallied from a four-shot deficit on the final day to win the crown. Impact The 1980’s saw professional and college football replace baseball as America’s most popular spectator sport, while the superstars of the NBA helped to elevate the league to equal status with the NFL and MLB. The Olympics also experienced a significant change, as the 1988 Summer Games marked the first time that professional athletes were allowed to take part in the Games.
Further Reading Craig, Roger, and Matt Maiocco. Tales from the San Francisco 49ers Sideline. New York: Sports Publishing, 2004. This is a somewhat typical but entertaining sports account. Elway, John, Marc Serota, and Elise Glading. Elway. Media, Pa.: Benchmark Press, 1998. An easy-to-read book. Not the best choice for a scholarly project, but it provides a glimpse at Elway as a person and an athlete. Feinstein, John. A Season on the Brink: A Year with Bobby Knight and the Indiana Hoosiers. New York: Simon & Schuster, 1987. This remains one of the most popular sports books of all time, chronicling the legendary basketball coach and his team. Ironically, this book covers the 1985-1986 Indiana season. One year later, the Hoosiers won the national championship. Guttmann, Allen. A History of the Olympic Games. 2d ed. Champaign: University of Illinois Press, 2002.
One of the best books to help explain the growth of the Olympics, as well as the positive and negative elements that are now associated with this premier international event. Podnieks, Andrew. The Great One: The Life and Times of Wayne Gretzky. New York: Triumph Books, 1999. A complete look at Gretzky and his impact on his sport. Rose, Pete, and Rick Hill. My Prison Without Bars. New York: Rodale Books, 2004. Rose uses this book to make one case: that, despite his legal problems, he belongs in the Baseball Hall of Fame. Senn, Alfred Erich. Power, Politics, and the Olympic Games. Champaign, Ill.: Human Kinetics Books, 1999. The writing is sometimes difficult to follow, but Senn nevertheless presents a well-researched book about the outside forces that weigh on the Olympic Games. Anthony Moretti
See also Arena Football League; Baseball; Baseball strike of 1981; Basketball; Bird, Larry; Boitano, Brian; Boxing; Brett, George; Decker, Mary; Elway, John; Football; Gibson, Kirk; Golf; Goodwill Games of 1986; Gretzky, Wayne; Griffith-Joyner, Florence; Hershiser, Orel; Hockey; Holmes, Larry; Jackson, Bo; Johnson, Magic; Lemieux, Mario; LeMond, Greg; Leonard, Sugar Ray; Lewis, Carl; Louganis, Greg; McEnroe, John; Miracle on Ice; Montana, Joe; Navratilova, Martina; Olympic boycotts; Olympic Games of 1980; Olympic Games of 1984; Olympic Games of 1988; Play, the; Retton, Mary Lou; Rice, Jerry; Rose, Pete; Ryan, Nolan; SkyDome; Soccer; Taylor, Lawrence; Tennis; Thomas, Isiah; Tyson, Mike; Valenzuela, Fernando; Watson, Tom; Wave, the.
■ Spotted owl controversy
Definition Debate over logging in the habitat of an arguably endangered species
The rarity of the spotted owl, a relative of the barred owl adapted to old-growth forests of the Pacific Northwest, led to prohibitions on logging in federal forests lying within its range. Oregon and Washington are both the largest timber producers in the United States and home to some of the nation’s most tenacious environmental groups.
The northern spotted owl. (U.S. Fish and Wildlife Services)
As a result, environmentalists fought timber and mining companies for decades in the Pacific Northwest. Logging is a traditional livelihood, generating thousands of jobs and hundreds of millions of dollars in the area every year. By the same token, the relatively unspoiled beauty of the region fuels a particularly strong environmental movement. The National Forest Service favored the timber industry in this battle. Thus, in the 1980’s, logging increased until it reached a rate at which old-growth forests would disappear in ten to fifteen years if nothing was changed. James G. Watt became secretary of the interior in 1981 and unapologetically promoted mining and lumbering on federal lands, with little thought of conservation. His tenure sparked a showdown between environmentalists and loggers. The forest service announced a record cut of trees in nineteen national forests in Washington and Oregon, with one
in four of the resulting logs to be shipped overseas. The University of Oregon’s Western Natural Resources Law Clinic (WNRLC), which specialized in environmental law, worked with nature organizations to develop a strategy to combat the government. The clinic’s lawyers decided to argue that the northern spotted owl should be placed on the endangered species list. Because the owl lived in the threatened forests, its placement on the list would prevent the government from logging there under the Endangered Species Act (ESA). Environmentalists had used the ESA in the late 1970’s to stop the Tellico Dam in Tennessee from being built, because it would have encroached on the habitat of the snail darter. Using the same strategy, the WNRLC and activist groups argued that the northern spotted owl nested and reared its young only in old-growth forests, which were set to be cut by logging companies. Each pair of owls needed one thousand acres or more of uncut forest to rear their young. Without this unspoiled acreage, the spotted owl would disappear. Of the nineteen forests targeted for cutting, thirteen held spotted owls. If the spotted owl were placed on the endangered species list, cutting in national forests would be reduced by 50 percent. The owl had been studied by the federal government for inclusion on the list, but the government had demurred. When the WNRLC argued its case in federal district court in Seattle, the judge ruled that the government had acted illegally by not placing the owl on the list. Lawsuits against the government by activist groups tied the issue up in court, and no lumber was harvested. Impact Eventually, Congress brokered a compromise that favored conservationists. Logging was reduced by 50 percent, and the federal government would not sell timber from areas identified as spotted owl habitats. At the same time, the timber industry was spared drawn-out lawsuits and court challenges. The controversy became important both for residents of the Pacific Northwest and for the Reagan administration, which was ideologically opposed to precisely the sort of federal interference with business interests represented by the ESA.
Further Reading Chase, Alston. In a Dark Wood: The Fight Over Forests and the Myths of Nature. New Brunswick, N.J.: Transaction, 2001.
Yaffee, Steven Lewis. The Wisdom of the Spotted Owl: Policy Lessons for a New Century. Washington, D.C.: Island Press, 1994. James Pauff
See also Environmental movement; Exxon Valdez oil spill; Reagan, Ronald; Watt, James G.
■ Springsteen, Bruce Identification American singer-songwriter Born September 23, 1949; Freehold, New Jersey
A critical and popular success as the decade began, Springsteen became in the 1980’s a genuine cultural phenomenon, someone whose multiplatinum albums and extremely popular tours made him by 1985 one of the world’s most talked-about entertainers. Bruce Springsteen released five albums during the 1980’s: The River (1980), Nebraska (1982), Born in the U.S.A. (1984), Bruce Springsteen and the E Street Band Live, 1975-85 (1986), and Tunnel of Love (1987). Each album sold well: Four reached number one on the Billboard 200, and the other reached number three. Born in the U.S.A., however, sold in such numbers (15 million copies in the United States alone) and for so long (eighty-five weeks in the top ten) that it warrants consideration alongside Michael Jackson’s Thriller (1982) and Prince’s Purple Rain (1984) as the most important album of the decade. To understand the extraordinary popularity of Born in the U.S.A., one must look at the cultural milieu into which the album was released. The year 1984 was both an Olympic year and an election year. Staged in Los Angeles, the 1984 Summer Olympic Games kindled in Americans a new and unabashed patriotism, as the nation celebrated not only its Olympic athletes but also the people (notably Peter Ueberroth) responsible for staging a globally admired Olympic Games. Sparked by the Olympics, American patriotism was stoked into flame by Ronald Reagan’s reelection campaign, a campaign whose message, “It’s morning in America,” resonated powerfully with Americans feeling good—but wanting to feel better—about their country. With patriotism in the air, Born in the U.S.A. (whose cover featured an American flag) seemed to represent another reason for Americans to feel good
about themselves. The title song of the album told the story of an impoverished American forced to fight in the Vietnam War and beaten at every turn by an uncaring nation. However, the refrain, “I was born in the U.S.A.” (meant to represent the character’s protest against his treatment), was misunderstood by some to be a patriotic anthem. Other songs, also complaints about the economic woes of working-class Americans, similarly resonated with both those working-class individuals for whom they spoke and those others for whom the titles (such as “No Surrender” and “My Hometown”) struck a patriotic chord. The album was a worldwide hit. Springsteen looked heroic, and pundits and politicians—including President Ronald Reagan—soon seized on Springsteen as proof that America was the land of opportunity, much to the singer’s chagrin. By October, 1984, Springsteen was more than just a
chart-topping musician: He was a symbol of America itself. For politicians and patriots, the album’s title seemed to support their agenda. For the working Americans left behind by the nation’s economic growth, Springsteen was a powerful voice of protest and musical activism. In a sense, Springsteen spent the second half of the 1980’s living down the image he had acquired in 1984 and 1985. He struggled to correct misinterpretations of Born in the U.S.A., advising listeners to pay closer attention to the lyrics. He connected audiences to charitable groups, arguing that too many Americans were neglected by government agencies. During this time, he also married model Julianne Phillips in an extremely high-profile wedding. Media scrutiny only increased when Springsteen, unhappy with the marriage, began an affair with his backup singer, Patti Scialfa. In the midst of these personal travails, Springsteen scaled down his music, releasing a low-key album (1987’s Tunnel of Love) that was dominated by confessional love songs. In 1989, as if to draw a curtain on the turbulent 1980’s, Springsteen dissolved the E Street Band and filed for divorce from Phillips. Impact Bruce Springsteen’s vision of America was enormously influential. Distancing himself from the flag-waving patriotism characteristic of the Reagan era, Springsteen in the 1980’s espoused an alternate patriotism, one celebrating America not for its military prowess or economic might but for its compassion. He sought with his music to reactivate that compassion and put it to work, striving to counteract the tendencies that led his compatriots to be labeled the “Me generation.”
Further Reading Alterman, Eric. It Ain’t No Sin to Be Glad You’re Alive: The Promise of Bruce Springsteen. Boston: Little Brown, 1999. George-Warren, Holly, ed. Bruce Springsteen: The “Rolling Stone” Files. New York: Hyperion, 1996. Marsh, Dave. Glory Days: Bruce Springsteen in the 1980’s. New York: Pantheon, 1987. Matt Brillinger
Bruce Springsteen performs in Dallas, Texas, on September 14, 1985. (AP/Wide World Photos)
See also Elections in the United States, 1984; Jackson, Michael; MTV; Music; Music videos; Olympic Games of 1984; Pop music; Prince; Reagan, Ronald; USA for Africa.
■ Standards and accountability in education
Definition Systems designed objectively to measure success and failure in student instruction and to hold teachers responsible for their performance
The standards and accountability movement sought to import management practices from the business world to improve the nation’s schools. It began as a grassroots movement early in the 1980’s in response to the disappointment felt by many parents at their communities’ educational systems. By the end of the decade, the movement had reached the national level, largely because of the publication of A Nation at Risk in 1983. Americans had begun to question the quality of their educational system in the 1960’s, when studies of segregated schools revealed that students were more likely to succeed when their schools had better resources. At that time, the reform movement sought more money for all public schools. In the 1970’s, states began to be concerned about their budgets, so theorists and politicians began looking into applying business principles to school management. Rather than simply adding more money to school budgets, state administrators looked for ways to make education more cost-effective. Respondents to a 1980 Gallup poll identified “low standards” as one of the top four problems in schools, and 79 percent of respondents favored teaching morality in public schools. The U.S. Department of Education had been founded by the Carter administration in 1979. President Ronald Reagan’s initial intention had been to dissolve the department, until he realized he could use it to promote his agenda of a “new federalism.” Reagan sought to use the department to strengthen state education departments and promote the idea of efficiency. In 1982, Mortimer J. Adler, a classical philosopher and leader of the Great Books movement (which called for an education based upon certain texts considered to have universal importance and quality), published the Paideia Proposal, an argument for more rigorous standards in elementary education. Adler called on public schools to teach children the skills that would prepare them for a lifetime of self-motivated learning. Allan Bloom, another classical philosopher and Great Books theorist, published The Closing of the American Mind in 1987, arguing that
public schools were more interested in teaching moral and cultural relativism than in teaching basic cultural concepts and that students who grew up thinking of all truth as relative were incapable of learning anything. In 1983, the National Commission on Excellence in Education issued A Nation at Risk, warning that American students were falling behind other nations’ youth in math, science, and engineering. The main “front” of the Cold War with the Soviet Union was a competition over productivity and technological advancement. Both nations sought to be the world’s leader in scientific advancement and economic growth. With the publication of A Nation at Risk, Americans feared that they were not only losing the Cold War but also falling behind other countries such as China and Japan. Reagan modified his attitude toward the Department of Education and tasked it with spearheading efforts to establish national standards and accountability mechanisms. In 1985, Reagan appointed William Bennett to be his new secretary of education, and Bennett became one of the main spokesmen for the standards movement. The movement was criticized from all ends of the political spectrum. Advocates of classical educational theories, while supporting the establishment of higher educational standards, opposed using standardized tests to measure success. Adler, for example, argued in Six Great Ideas that a few minutes of conversation is a far better gauge of a student’s learning than is any standardized test. Libertarians worried that the accountability movement would become an excuse to expand the federal bureaucracy, while many religious conservatives feared that educational standards would further marginalize religion in American culture. Social progressives, meanwhile, feared that standardization would stifle independent thought and student creativity and that accountability would compromise the integrity of the teaching profession. They argued that standardized tests unfairly penalized students who were academically gifted and successful but performed poorly on tests because of anxieties, learning disabilities, or confusion about procedures. Nevertheless, the first federally funded testing program, the National Assessment of Educational Progress (NAEP), began in 1988. The decade ended with President George H. W. Bush and the National Governors’ Association meeting at the National Education Summit in Charlottesville, Virginia. There,
they issued the first national standards for English, mathematics, science, history, and geography, as well as rules for standardized testing and enforcing teacher accountability. In the same year, the National Council of Teachers of Mathematics issued the Curriculum and Evaluation Standards for School Mathematics. Impact The standards and accountability movement was relatively new in the 1980’s but quickly gained national popularity and bipartisan political support. While Republicans Ronald Reagan and George H. W. Bush put school accountability on the national agenda, Democrat Bill Clinton was one of the pioneers of state-level accountability programs as governor of Arkansas, and he would continue to support the movement as president.
Further Reading Adler, Mortimer. Paideia Proposal. New York: Touchstone, 1998. Argues that the flaw of the American educational system is that it does not teach children to think, but merely to memorize facts. _______. Six Great Ideas. New York: Touchstone, 1997. Argues that specific core Western ideas should be part of every education. Critiques the use of standardized tests to evaluate student progress. Gordon, David T. A Nation Reformed: American Education Twenty Years After “A Nation at Risk.” Cambridge, Mass.: Harvard Educational, 2003. History of the standards and accountability movement, starting with A Nation at Risk. Hayes, William. Are We Still a Nation at Risk Two Decades Later? Lanham, Md.: Scarecrow Education, 2004. Analyzes the impact of the reform movement that arose in response to the 1983 document. McGuinn, Patrick J. No Child Left Behind and the Transformation of Federal Education Policy, 1965-2005. Lawrence: University Press of Kansas, 2006. History of federal education policies for the forty years prior to passage of George W. Bush’s No Child Left Behind Act. John C. Hathaway
See also Bennett, William; Bush, George H. W.; Closing of the American Mind, The; Cold War; Education in the United States; Magnet schools; Mainstreaming in education; Multiculturalism in education; Nation at Risk, A; National Education Summit of 1989; Reagan, Ronald; School vouchers debate.
■ Star Search Identification Television talent and variety show Date Aired from 1983 to 1995
Star Search was a nightly prime-time television talent competition, in which contestants competed against one another in six different categories for money and a chance at fame. On September 17, 1983, Star Search premiered on the Columbia Broadcasting System (CBS) network and became a staple of 1980’s television. The program was originally broadcast from the Earl Carroll Theater on Sunset Boulevard in Hollywood, California, and later from the Hollywood Center Studios on Las Palmas Boulevard in Hollywood. The show was hosted by the longtime cohost of The Tonight Show, Ed McMahon, and its announcer was Sam Riddle. Star Search was based on traditional talent competitions and variety shows: Contestants competed in various categories for a chance to remain on the program and ultimately to compete in the semifinal and final rounds. Before they could compete on the show, would-be contestants had to audition off camera, and the competition was fierce. Once on the show, contestants sought to win nightly competitions against newcomers. The winner of each competition returned, and the loser was eliminated. Star Search competition was divided into six categories: Male Vocalist, Female Vocalist, Young Performer, Group, Fashion Model/Spokesperson, and Comedy. Each episode would pit a challenger against a returning champion, with the challenger having the advantage of performing first. After the brief performances, a panel of five celebrity judges voted on each contestant. Each judge could award a performance one to four stars. The scores were revealed after both performances, and the contestant with the highest average of votes would appear on the following program. In the event of a tie, the audience would cast the deciding vote, which would be revealed at the end of the show. Originally, the contestant who remained on the show the longest in each category would win the grand prize in that category. Throughout the run of the series, however, the rules changed: Any contestant who managed to win three consecutive matches would be retired and invited to return the following week. For the semifinal and final rounds, the panel of judges was
removed and replaced by audience voting. Winners of the Vocalist, Fashion Model/Spokesperson, and Comedy categories were awarded $100,000, and the winning youth performer was awarded $25,000. Impact Building on traditional variety- and talent-show platforms, Star Search created the foundation for reality talent shows such as American Idol, So You Think You Can Dance, Dancing with the Stars, and many more. Star Search also gave several notable celebrities their breaks in show business, including Drew Carey, Ray Romano, Dennis Miller, Rosie O’Donnell, and Sinbad.
Further Reading Craig, Michael-Dante. The Totally Awesome 80’s TV Trivia Book. Lincoln, Nebr.: Writers Club Press, 2001. Mansour, David. From Abba to Zoom: A Pop Culture Encyclopedia of the Late Twentieth Century. Riverside, N.J.: Andrews McMeel, 2005. Rettenmund, Matthew. Totally Awesome 80’s: A Lexicon of the Music, Videos, Movies, TV Shows, Stars, and Trends of That Decadent Decade. New York: St. Martin’s Griffin, 1996. Sara Vidar
See also America’s Most Wanted; Cable television; Comedians; Infomercials; People’s Court, The; Sitcoms; Soap operas; Talk shows; Television.
■ Star Trek: The Next Generation Identification Science-fiction television series Creator Gene Roddenberry (1921-1991) Date Aired from September 26, 1987, to May 21, 1994
A sequel series to the original Star Trek, Star Trek: The Next Generation revitalized the franchise, creating a demand among American audiences for additional television series, movies, merchandise, and tie-in fiction set in Gene Roddenberry’s idealized future. Set approximately eighty years after the events of the original Star Trek television series, Star Trek: The Next Generation (known to fans as ST:TNG) introduced viewers to a new crew and a new USS Enterprise. In an unusual step, Paramount produced ST:TNG in first-run syndication, selling the series to local television stations on an individual basis. This way, even if the
show failed after the initial thirteen episodes, the studio could recoup some expenses by bundling the episodes with the original series, which still did well in rerun syndication. ST:TNG did not fail, however. In spite of lukewarm early critical reception and initial resistance from some fans, it became highly successful, airing in over two hundred markets simultaneously. Perhaps the biggest adjustment for Star Trek fans was the introduction of an entirely different style of captain. Instead of James T. Kirk, an action-oriented ladies’ man, Captain Jean-Luc Picard was an older, balding Frenchman played by classically trained British actor Patrick Stewart. Other significant changes included a female security chief, a Klingon bridge officer, a blind crew member, and an android who aspired to become more human. Overall, ST:TNG encompassed a more sophisticated blend of optimism and realism than the original show had. It introduced the Borg, one of the Star Trek franchise’s most interesting alien villains, as well as the Holodeck, a representation of a virtual reality system that was featured in many episodes and helped influence the public imagination of the possibilities of virtual reality during the decade. The show appealed not only to science-fiction fans but also to viewers who did not typically watch science fiction, in part because it placed increased emphasis on female and minority characters. Impact In 1994, stunning the show’s fans, Paramount ended ST:TNG at the conclusion of the seventh season, citing several reasons: The studio had always planned on only seven seasons, a significant number for television programs hoping to be sold into perpetual syndication. Paramount also wanted the cast to begin making feature films based on the series. The studio did not want to price the series out of the rerun syndication market by having too many episodes, and it believed the show had reached its maximum profitability. In spite of its cancellation, the overwhelming success of Star Trek: The Next Generation led to an unprecedented revitalization of an old franchise into something relevant for both old and new generations of viewers. It showed that there was room in the Star Trek universe for countless new characters, alien species, and situations, and it paved the way for additional television series—Star Trek: Deep Space Nine; Star Trek: Voyager; and Star Trek: Enterprise—as well as several successful movies starring the Next Generation cast members.
Patrick Stewart, left, star of Star Trek: The Next Generation, poses with William Shatner, star of the original series, at an event in 1991. (AP/Wide World Photos)
It also led to a revival of Star Trek merchandise, such as action figures and other toys, and to the creation of a themed attraction in Las Vegas called Star Trek: The Experience.
Further Reading Coit, Dawn G. “Star Trek: The Continuing Saga of a Sixties Sensation.” USA Today 117 (January, 1989): 88-90. Nemecek, Larry. The “Star Trek: The Next Generation” Companion. Rev. ed. New York: Pocket Books, 2003. Reeves-Stevens, Judith, and Garfield Reeves-Stevens. “Star Trek: The Next Generation”: The Continuing Mission—A Tenth Anniversary Tribute. New York: Pocket Books, 1997. Amy Sisson
See also Science-fiction films; Sequels; Special effects; Television.
■ Starbucks Identification Coffee franchise Date Founded in 1971; incorporated in 1987
The Starbucks concept changed the way consumers viewed the coffee experience. Starbucks rapidly became a part of popular culture. When Starbucks first opened in Seattle, Washington, in 1971, it was during a period of decline in coffee consumption. Large American coffee brands were adding cheaper beans to their blends, thus sacrificing flavor. In addition, consumers were concerned about the long-term health effects of caffeine. While a market study would therefore have indicated that the 1970’s was a bad time to expand a coffee business, the situation did not deter the coffee store’s owners. At the time, Starbucks sold only coffee beans; it did not brew and sell coffee by the cup. The company’s owners wanted their brand to stand for high-quality, dark-roasted coffee.
In 1982, Starbucks hired Howard Schultz as its director of retail operations and marketing. In the spring of 1983, the company sent Schultz to Milan, Italy, to attend an international housewares show. He was impressed with the popularity of espresso bars in Milan and saw the potential of building a similar coffee bar culture in the United States. He noted that Americans had increasing incomes, increased air travel, changing work patterns, and more time constraints. He believed that these were all factors that had the potential to change consumer priorities. His vision was to build a national brand around coffee and create a retail store that would become a third destination between home and work. In 1987, with the backing of local investors, Schultz purchased the company, which became known as the Starbucks Corporation. By the late 1980’s, Americans’ desire for affordable luxuries had grown. More disposable income and European travel gave Americans a new interest in the cultures and products they sampled abroad. This interest fed the specialty coffee market, and sales soared. Starbucks was credited with changing people’s expectations of how coffee should taste; changing the language of coffee drinks by introducing Italian terms such as grande (large) and venti (twenty ounces); changing the way coffee was ordered by offering customized options (for example, using skim milk instead of whole milk for a “nonfat” option or mixing regular and decaffeinated coffee to make a “half-caf”); changing how and where people met, as Starbucks became known as a safe place to socialize and conduct business; changing urban streetscapes, as a Starbucks located in a neighborhood became an indication that the area would be a desirable place to live and work; and raising social consciousness, as one aspect of the Starbucks mission statement encouraged employees to make a contribution to the community and the environment. Impact From its humble beginnings, Starbucks grew to become the world’s largest multinational chain of coffee shops. The initiatives and innovations that Starbucks developed to market specialty coffee became the standard for the industry. In addition, the company’s popularity helped drive a broader explosion of coffee culture in the United States. Several other national and regional chains arose, such as Seattle’s Best Coffee and Caribou Coffee, as did a great many individual coffee stores.
Far from driving one another out of business, the sheer number of the stores seemed to encourage Americans to patronize coffee stores in general, as in some cities, multiple stores seemed able to thrive on the same block.
Further Reading Koehn, Nancy. “Howard Schultz and Starbucks Coffee Company.” In How Entrepreneurs Earned Consumers’ Trust: From Wedgwood to Dell. Boston: Harvard Business School Press, 2001. Michelli, Joseph. The Starbucks Experience: Five Principles for Turning Ordinary into Extraordinary. New York: McGraw-Hill, 2007. Schultz, Howard, and Dori Jones Yang. Pour Your Heart Into It: How Starbucks Built a Company One Cup at a Time. New York: Hyperion, 1997. Sharon M. LeMaster
See also Business and the economy in the United States; Fads; Food trends.
■ Statue of Liberty restoration and centennial
The Event Full-scale restoration effort and centennial celebration for the Statue of Liberty Date Restoration took place from 1984 to 1986; centennial celebration held July 4-6, 1986 Place Statue of Liberty National Monument, Ellis Island, New York City
916
■
Statue of Liberty restoration and centennial
sector effort to raise funds to restore the Statue of Liberty. A partnership formed between the government, represented by the National Park Service, and the newly formed, private Statue of Liberty-Ellis Island Foundation. The foundation also sought resources to beautify the surroundings of the statue and to rehabilitate the crumbling ruins of Ellis Island. The Statue A combination of weather, pollution, time, and the high volume of sightseers visiting the island had left the Statue of Liberty in serious need of attention. In 1984, therefore, the statue was closed for renovation, and scaffolding was erected around it, obscuring it from view until its rededication. At this point, the United Nations designated the Statue of Liberty National Monument as a World Heritage Site.
The Eighties in America
Various procedures were performed on the monument’s interior and exterior. Liquid nitrogen was used to strip away seven layers of paint from the interior, and other techniques were employed to remove the layers of tar that had originally been applied to plug leaks and prevent corrosion. Large holes in the copper skin were smoothed out and patched with new copper. Each of the over one thousand supporting iron ribs of the statue had to be removed and replaced, because the iron had corroded to such an extent that it had lost a great deal of its original density. Teflon film was inserted between the new bars and the skin to provide insulation and reduce friction. Chemicals were applied to sections of the copper skin to ensure that the statue was strengthened. The support structure of the right arm was updated and reinforced to make the arm structurally sound. The crown’s seven rays were also reinforced.
During the Statue of Liberty centennial celebration, traditional sailing vessels are welcomed into New York Harbor, under the watchful eyes of the statue (background) and the aircraft carrier USS John F. Kennedy (foreground). (U.S. Department of Defense)
Thousands of rivets were replaced, and any seams or open holes were sealed. A significant feature of the restoration was the replacement of the torch. The original torch, which had been modified extensively in 1916, was considered beyond repair. The new torch included features designed to enhance its visibility, such as gold plating on the exterior of the flame and external lamps on the surrounding balcony platform. Other renovations to the statue included upgraded climate-control systems and the addition of two elevators, one to the top of the pedestal and a second emergency elevator reaching the crown. Improvements were also made to the administration and concession buildings on Liberty Island. New walkways were added, along with landscaping to prepare the island for the centennial celebration. Restoration work on Ellis Island was limited to the main building and the power station. The Centennial On July 5, 1986, the Statue of Liberty reopened to the public during a centennial celebration known as Liberty Weekend. Acclaimed producer David Wolper was selected to orchestrate the $32-million gala. The event began on July 4 with over three thousand restoration sponsors and members of the media from more than forty nations joining President Reagan for the grand unveiling. Numerous celebrities such as Frank Sinatra and Elizabeth Taylor joined in the salute to the statue. The president kicked off the ceremonies by pressing a button that activated the floodlights on the statue. The event included the presentation of a special medal, created solely for the centennial, called the Medal of Liberty. The medal was given to twelve naturalized American citizens, including Irving Berlin, Bob Hope, Henry Kissinger, Albert Sabin, and Itzhak Perlman. Thirty-three naval vessels from fourteen nations passed the statue and fired twenty-one-gun salutes. From Ellis Island, Chief Justice Warren Burger administered the citizenship oath to thirteen thousand people via satellite television broadcast. The evening was concluded with a dramatic fireworks display. The following day, First Lady Nancy Reagan led French and American schoolchildren on the first tour through the renovated statue, and a conference on the meaning of liberty began in New York City. On July 6, an event was held featuring sports legends Muhammad Ali, Billie Jean King, and Hank Aaron.
It included a skating exhibition by Dorothy Hamill, Peggy Fleming, and others. The closing ceremonies of the centennial included a cast of over twelve thousand, including notables Charlton Heston, Willie Nelson, Gene Kelly, the Four Tops, and the Pointer Sisters, as well as two hundred Elvis impersonators, gospel choirs, drill teams, dancers, and the Statue of Liberty All-American Marching Band. Impact The restoration of the Statue of Liberty National Monument and its subsequent centennial celebration provided Americans with a chance to celebrate their diverse heritage by recalling the immigrants who passed through Ellis Island and whose first image of America was the welcoming sight of the Statue of Liberty. The festivities portrayed the renewal of the statue as a renewal of the United States itself. Some believed that the restoration of the Statue of Liberty served as a metaphor for the restoration of the American Dream that, according to Reagan and his supporters, occurred during the 1980’s.
Further Reading Bell, James B., and Richard I. Abrams. In Search of Liberty. New York: Doubleday, 1984. Comprehensive treatment of the story of the Statue of Liberty and Ellis Island. Printed for the Centennial Commission as a souvenir. Contains excellent images. Moreno, Barry. The Statue of Liberty Encyclopedia. New York: Simon & Schuster, 2000. Thorough history of the Statue of Liberty and the restoration. Quick reference source. Smith, V. Elaine. “Engineering Miss Liberty’s Rescue.” Popular Science 228, no. 6 (June, 1986): 68-73. Cover story on the restoration process that focuses on the techniques and products used to renovate the structure. United States General Accounting Office. National Parks: Restoration of the Statue of Liberty Monument—Report to the Chairman, Subcommittee on National Parks and Recreation, Committee on Interior and Insular Affairs, House of Representatives. Washington, D.C.: Author, 1986. Complete history of the project, including detailed budget, organizational chart, and numerous engineering blueprints and reports. Amanda Bahr-Evola
See also Iacocca, Lee; Immigration to the United States; Reagan, Ronald; Reagan Revolution.
■ Stealth fighter
Definition Innovative military aircraft
Manufacturer Lockheed Advanced Development Projects Unit of Lockheed Martin
Date First flight in 1982; revealed in 1988
The F-117 Nighthawk was a classified aircraft of the United States Air Force that was developed during the 1980’s. Initially kept secret from the public, it was declassified during the latter part of the decade. Throughout the 1980’s, the Lockheed Advanced Development Projects Unit, nicknamed the “Skunk Works,” developed and constructed a fleet of stealth aircraft. The director of this project was Ben Rich. Rich took over the Skunk Works from Kelly Johnson, who had founded it. The F-117 Nighthawk was the first aircraft designed around stealth technology. Lockheed Martin was originally awarded the contract in 1973, but the first operational aircraft was not completed until 1982. By the end of the decade,
Lockheed Martin had fulfilled its contractual obligation to build a fleet of Nighthawks. The Skunk Works was a well-kept secret. Even the invoices for construction materials were designed to keep the project secret, as all such materials were listed as spare parts for other Lockheed Martin aircraft, such as the F-16 Fighting Falcon. Lockheed Martin also began working on a stealth fighter known as the F-22 Raptor, while Northrop developed a stealth bomber known as the B-2 Spirit. The F-117, although a fighter in design, was capable only of delivering bombs. The stealth technology employed by the F-117 makes the aircraft nearly invisible to radar. It is not completely invisible, however. The paint of the aircraft is an important part of its stealth capabilities, because it is made from radar-absorbent material (RAM). As a result, it absorbs rather than reflects radar signals, making the vehicle less detectable. Other aspects of the plane’s stealth technology include its engines and overall shape, both of which are modified to make it less visible to enemies in the air and on the ground.
The F-117 Nighthawk. (Catalan/cc-by-sa-2.0)
Impact The F-117 Nighthawk was kept secret from the public until 1988. When it and the B-2 were revealed that year, such demonstrably next-generation technology seemed like science fiction come to life. Had it been developed earlier, such an invisible aircraft would have represented a major strategic imbalance in the Cold War between the United States and the Soviet Union. Because the Cold War was winding down, however, the new strategic capabilities made possible by stealth technology were less significant to the nuclear arms race than they would have been just a few years earlier. The aircraft flew its first military mission in 1989, during the U.S. invasion of Panama. Two Nighthawks dropped two bombs on Rio Hato airfield. Later, the Nighthawks would gain fame during the first Persian Gulf War of the early 1990’s.
Further Reading Jenkins, Dennis R. Lockheed Secret Projects: Inside the Skunk Works. St. Paul, Minn.: MBI, 2001. Useful monograph on the Skunk Works unit of Lockheed Martin. Pace, Steve. Lockheed Skunk Works. Osceola, Wis.: Motorbooks International, 1992. Comprehensive history of the program that developed the F-117 Nighthawk and other U.S. military aircraft. Rich, Ben R., and Leo Janos. Skunk Works: A Personal Memoir of My Years at Lockheed. Boston: Little, Brown, 1994. Ben Rich was the director of the Skunk Works, and in this memoir of the project, he discusses the development of the F-117. Timothy C. Hemmis
See also Cold War; Foreign policy of the United States; Military spending; Panama invasion; Science and technology.
■ Steel, Danielle
Identification Best-selling author of romantic fiction
Born August 14, 1947; New York, New York
In the 1980’s, Steel’s consistently staggering popularity made her an unrivaled benchmark of trends in the rapidly changing romance genre.
Danielle Steel was one of the first authors to move romantic fiction into a new phase in which the heroines were strong, independent women determined to find themselves and solve their own problems. The lives of Steel’s heroines mirrored many of her own personal experiences, such as divorce, battling cancer, and difficulties with her children. In an interview, Steel once stated that she liked to create worlds in which her heroines’ struggles were rewarded with something she herself had missed out on—a happy ending. Steel’s fifth book, The Ring (1980), was her first hardcover publication. It boasted a more sophisticated look than had her paperbacks, featuring a glamorous black-and-white photograph of the author on the back cover. This was a great accomplishment for Steel, who had to push hard to achieve it. Her publisher, Delacorte (an imprint of Dell), had felt that the kind of readers who bought Steel’s books would not pay hardcover prices. Dell finally agreed to publish The Ring in hardcover, however, and Steel’s readers purchased it eagerly. With millions of books in print, a fan club, and numerous interviews to her credit, Steel moved into the next phase of her career. In 1983, she fired her longtime agent and hired Mort Janklow, one of the most powerful literary “superagents” in the country. Janklow was adept at negotiating television and movie deals for his clients. He also helped Steel make the transition from romance fiction to contemporary fiction. In addition to adult fiction, Steel also wrote the Max and Martha series of children’s books, which sought to help young readers face problems such as attending a new school or losing a grandparent. She was also one of seven women authors who contributed to Having a Baby (1984), in which she described her experience of suffering through a miscarriage. Extremely organized and focused, Steel created an elaborate filing system for keeping her writing on track. She prioritized her time, often writing well into the night so she could be available for her children during the day. Impact One of the three best-selling authors of the decade, Steel left her mark on the publishing industry. Her novels appealed to millions of readers and opened up the field of romance fiction for many other women writers. In 1989, Steel earned a place in the Guinness Book of World Records for the longest run on the New York Times best seller list: 381 consecutive weeks.
Further Reading
The Eighties in America
Stewart Copeland and lead guitarist Henry Padovani (who was soon replaced by Andy Summers). Embracing eclecticism, the Police merged reggae and ska with punk and even jazz, produced several successful albums, and won six Grammy Awards in the early 1980’s. Their final album, Synchronicity (1983), included their most successful song, “Every Breath You Take.” Even before the Police disbanded, Sting began to make solo appearances. In 1982, he released a solo single, “Spread a Little Happiness,” from the sound track to the television play Brimstone and Treacle, in which he also appeared. The single became a hit in the United Kingdom. His first solo album, The Dream of the Blue Turtles (1985), blended rock, reggae, jazz, and pop and featured many important jazz and fusion musicians, including Kenny Kirkland, Darryl Jones, Omar Hakim, and Branford Marsalis. Later albums continued to blend various musical styles and always featured intelligent, literate lyrics. In 1987, Sting again worked with jazz artists, such as Marsalis and veteran jazz arranger Gil Evans, releasing . . . Nothing Like the Sun, which included the hit songs “We’ll Be Together,” “Fragile,” “Englishman in New York,” and “Be Still My Beating Heart.” In February, 1988, he released Nada Como el Sol, a selection of five songs from . . . Nothing Like the Sun sung in Spanish and Portuguese. Later that year, he performed an arrangement of “Murder by Numbers” (a song from Synchronicity) on Frank Zappa’s album Broadway the Hard Way. Active in support of various humanitarian and environmental causes, Sting performed on all four nights of the fourth Amnesty International benefit, The Secret Policeman’s Other Ball, in 1981, and he led an all-star band that included Eric Clapton, Phil Collins, and Bob Geldof in the Live Aid concert in 1985. In 1988, Sting joined a group of major artists, such as Peter Gabriel and Bruce Springsteen, for Amnesty International’s Human Rights Now! world tour and also released a single, “They Dance Alone,” which chronicled the plight of women under the Augusto Pinochet Ugarte regime in Chile. Sting also founded the Rainforest Foundation.
Sting, center, with fellow Police members Andy Summers, right, and Stewart Copeland. (PA Photos/Landov)
Sting pursued a minor acting career during the decade, appearing in such films as Dune (1984), Plenty (1985), The Bride (1985), Bring on the Night (1985), The Adventures of Baron Munchausen (1988), and Stormy Monday (1988). In 1989, he starred as Macheath in a failed Broadway revival of Die Dreigroschenoper (pr. 1928, pb. 1929; The Threepenny Opera, 1949).

Impact
During the 1980’s, Sting was involved in numerous musical projects that testified to his eclectic tastes. His intelligent lyrics and jazz-pop-world music fusion expanded the boundaries of popular music and led to a richness and expressiveness that were rare in rock. His work on behalf of charitable causes became legendary.

Further Reading
Berryman, James. Sting and I. London: John Blake, 2005.
Sandford, Christopher. Sting: Demolition Man. New York: Little, Brown, 1998.
Sumner, Gordon. Broken Music. New York: Simon & Schuster, 2003.
Mary A. Wischusen
See also Film in the United States; Jazz; Live Aid; Music; New Wave music; Pop music; World music.
■ Stockton massacre
The Event Patrick Edward Purdy kills five children and wounds thirty others when he opens fire on his former elementary school
Date January 17, 1989
Place Cleveland Elementary School in Stockton, California

The Stockton massacre sparked significant controversy and debate over possible restrictions on the manufacture and sale of assault weapons.

On January 17, 1989, a twenty-six-year-old drifter, Patrick Edward Purdy, drove to his former elementary school in Stockton, California. After getting out of his car, Purdy poured gasoline on it and set it ablaze. Afterward, he walked toward the schoolyard of Cleveland Elementary School, where the children were enjoying lunch recess. There, Purdy opened
fire on the schoolchildren, mostly children enrolled in kindergarten through third grade. Within minutes, he had fired 106 rounds from his personally engraved AK-47 assault rifle, killing five Asian immigrants ranging in age from six to nine years old. Another thirty individuals were injured, including one teacher. Once the assault was over, Purdy used a nine-millimeter handgun to kill himself.

This tragic event, known as the Stockton massacre, was actually the second shooting to take place at the same school in a ten-year span. Almost immediately, there was a public uproar. People questioned how an individual with Purdy’s criminal history, which included convictions for attempted robbery and unlawful weapons sales, could readily purchase an automatic rifle such as an AK-47. There was a call for tightened legislation to restrict the availability of all weapons with large ammunition capacities, including domestic and foreign assault weapons. Finally, a growing number of school systems banned all weapons on school grounds. Some of the controversy also stemmed from the apparent racially motivated nature of the crime.

Impact
The Stockton massacre led to statewide and national debates over proposed legislation to restrict assault weapons. As a result of the murders, California became the first state to ban certain types of assault weapons that same year.

Subsequent Events
On September 13, 1994, the Crime Control Act of 1994 was enacted. It banned the production, distribution, and possession of certain types of firearms, including assault weapons.

Further Reading
“Death on the Playground.” Newsweek 113, no. 5 (January 30, 1989): 35.
Holmes, Ronald, and Stephen Holmes, eds. Murder in America. 2d ed. Thousand Oaks, Calif.: Sage, 2001.
“Slaughter in a School Yard.” Time, January 30, 1989, 29.
Jocelyn M. Brineman and Richard D. McAnulty
See also Asian Americans; Education in the United States; Post office shootings; San Ysidro McDonald’s massacre.
■ Stone, Oliver
Identification American film director and screenwriter
Born September 15, 1946; New York, New York

The 1980’s marked for Stone the end of his apprenticeship as a writer of scripts for other directors and the beginning of his own career as a filmmaker.
Oliver Stone began the 1980’s writing and directing his first feature, a horror film called The Hand (1981). Afterward, he spent the next five years writing screenplays for other filmmakers, including Conan the Barbarian (1982) for John Milius, Scarface (1983) for Brian de Palma, Year of the Dragon (1985) for Michael Cimino, and Eight Million Ways to Die (1986) for Hal Ashby. Writing for such talented directors prepared him to craft better screenplays for his own films.

Stone had served in the U.S. Army for fifteen months along the Cambodian border in the Vietnam War. He was wounded twice and was awarded a Purple Heart with an oak leaf cluster, as well as a Bronze Star for valor. He returned home a changed man. It was not surprising, then, that as a film director he eventually turned to the war for material. In 1986, Stone released his third feature-length directorial effort, Platoon, the first of what would become a trilogy dealing with the Vietnam War and its effects on those who fought in Southeast Asia. Platoon focused on the day-to-day combat experience of infantry soldiers, and Born on the Fourth of July (1989) dealt with the experiences of returning vets as they worked to reintegrate themselves into American society. Heaven and Earth (1993) would complete the trilogy. The three movies provided perhaps one of the most devastating critiques of the war on film. The two films released in the 1980’s helped fuel a larger reassessment of the Vietnam experience and its aftermath that became one of the hallmarks of American cinema in the 1980’s. This reassessment led to an increasing number of films critical of U.S. overseas engagements generally, especially when they interfered with the domestic social and political environment of another country.

Stone’s social criticism did not stop with the Vietnam War, however. In Salvador (1986), released before Platoon, he explored the involvement of the United States in Central America and provided a vivid portrayal of a foreign policy both devastating and dangerous in its execution. Wall Street (1987), dedicated to his stockbroker father, exposed the financial excesses of the stock market during a period of widespread corruption and insider trading, practices his father deplored. The line “Greed is good,” delivered by Michael Douglas portraying Gordon Gekko, the principal offender in the film, could have become a mantra for the period.

Impact
Oliver Stone became the most famous American director of politically focused films of the 1980’s. His films were often brash, angry, violent, and confrontational, and they usually dealt with controversial subject matter. As a result, Stone simultaneously became one of the most admired and the most reviled filmmakers in international cinema. He won two Academy Awards for Best Director during the decade, honoring his work on Platoon and Born on the Fourth of July.

Oliver Stone, left, celebrates with Ron Kovic, the subject of the director’s Born on the Fourth of July, after the film swept most of the top drama awards at the 1990 Golden Globe Awards ceremony. (AP/Wide World Photos)
Further Reading
Beaver, Frank. Oliver Stone: Wakeup Cinema. New York: Twayne, 1994.
Kagan, Norman. The Cinema of Oliver Stone. New York: Continuum, 2000.
Salewicz, Chris. Oliver Stone. London: Orion Media, 1997.
Silet, Charles L. P., ed. Oliver Stone: Interviews. Jackson: University Press of Mississippi, 2001.
Charles L. P. Silet
See also Academy Awards; Action films; Cruise, Tom; Douglas, Michael; Film in the United States; Platoon; Wall Street.
■ Strategic Defense Initiative (SDI)
Identification Plan to establish an antiballistic missile defense

In an effort to protect the United States from a possible nuclear attack by the Soviet Union, President Ronald Reagan proposed a high-tech defense shield capable of shooting down incoming Soviet missiles. The plan generated much criticism, both of its technical infeasibility and of its political ramifications.

In the 1970’s, the United States followed a nuclear deterrence strategy known as mutually assured destruction (MAD), which depended on a situation in which the United States and the Soviet Union each possessed enough nuclear weapons to survive an attack by the other and still launch a devastating counterstrike. The certainty of utter annihilation in a nuclear war thus prevented one from happening. When Ronald Reagan became president, however, he considered MAD a risky strategy, especially as the number of Soviet nuclear warheads increased. After consultations with scientific advisers, Reagan gave a nationally televised speech on March 23, 1983, in which he announced plans to establish the Strategic Defense Initiative (SDI), which was tasked with creating a defensive shield to protect the United States from nuclear attack. Reagan’s televised proposal envisioned a space-based front line of satellite defenses that could destroy Soviet missiles at the early-launch stage, a space-based second line to destroy individual warheads released by Soviet missiles that got through the front line, and a ground-based third line
to destroy any warheads in their terminal phase that avoided the other defensive lines.

Pushing the Technical Boundaries
Shooting down Soviet missiles represented a huge technical challenge. The project started with ground-based missile technology, such as the Extended Range Interceptor (ERINT), originally developed for the Safeguard Anti-Ballistic Missile system devised in the 1970’s. The Strategic Defense Initiative Organization (SDIO), established at the Pentagon in 1984, funded a number of innovative approaches based on the ERINT technology that sought to detect, target, and destroy Soviet missiles.

On the relatively low-tech end of development were projects like the Homing Overlay Experiment, a missile-launched projectile with four-meter-diameter fans to increase the size of the projectile and ensure a hit, and Brilliant Pebbles, watermelon-sized satellites that would destroy Soviet missiles by purposely colliding with them. On the high-tech end of the research spectrum were a number of directed-energy weapons programs that used energy to destroy missiles, rather than physical collisions between missiles and targeting projectiles. These high-tech projects, centered on beam-projecting weaponry, earned SDI its skeptical nickname, “Star Wars.” The first research centered on an X-ray laser powered by a nuclear explosion, with first tests carried out in 1983. In 1985, the SDIO began tests with a deuterium fluoride laser, which successfully destroyed a Titan missile booster and several low-flying target drones. Another promising project was the Hypervelocity Rail Gun, a space-based platform that destroyed satellites with “bullets” fired at fourteen hundred miles per hour. The main drawback of the fluoride laser and the Rail Gun was the massive electricity requirement of the systems.

Experiments on sensors designed to detect and target incoming Soviet warheads, such as the Boost Surveillance and Tracking System, proved much more successful, as they proved capable of tracking Soviet missiles from their initial launches through their entire flight path. The total of SDI funding amounted to approximately $30 billion between 1983 and 1989.

Criticism of SDI
While many Americans supported SDI, the plan attracted a considerable amount of criticism. Some critics believed the system to be so far beyond the technical capability of current science that it would remain unfeasible for the
A Vision of the Future

Excerpts from Ronald Reagan’s televised speech regarding the Strategic Defense Initiative, delivered March 23, 1983:

Let me share with you a vision of the future which offers hope. It is that we embark on a program to counter the awesome Soviet missile threat with measures that are defensive. Let us turn to the very strengths in technology that spawned our great industrial base and that have given us the quality of life we enjoy today.

What if free people could live secure in the knowledge that their security did not rest upon the threat of instant U.S. retaliation to deter a Soviet attack, that we could intercept and destroy strategic ballistic missiles before they reached our own soil or that of our allies?

I know this is a formidable, technical task, one that may not be accomplished before the end of the century. Yet, current technology has attained a level of sophistication where it’s reasonable for us to begin this effort. It will take years, probably decades of efforts on many fronts. There will be failures and setbacks, just as there will be successes and breakthroughs. And as we proceed, we must remain constant in preserving the nuclear deterrent and maintaining a solid capability for flexible response. But isn’t it worth every investment necessary to free the world from the threat of nuclear war? We know it is. . . .

I call upon the scientific community in our country, those who gave us nuclear weapons, to turn their great talents now to the cause of mankind and world peace, to give us the means of rendering these nuclear weapons impotent and obsolete.

Tonight, . . . I am directing a comprehensive and intensive effort to define a long-term research and development program to begin to achieve our ultimate goal of eliminating the threat posed by strategic nuclear missiles. This could pave the way for arms control measures to eliminate the weapons themselves. We seek neither military superiority nor political advantage. Our only purpose—one all people share—is to search for ways to reduce the danger of nuclear war.

My fellow Americans, tonight we’re launching an effort which holds the promise of changing the course of human history. There will be risks, and results take time. But I believe we can do it.
foreseeable future. They therefore argued that it represented a massive waste of resources. Others saw SDI as a waste of resources even if it worked, because it diverted funds from other government programs
that should receive a higher priority. Some foreign-policy critics saw SDI as an open provocation to the Soviets that might trigger a new arms race. Antiwar advocates postulated that SDI actually increased the chance of a nuclear war. They feared that the United States, safe behind its SDI shield, might be more inclined to launch a nuclear attack on the Soviet Union, because the Soviets would not be able to inflict similar damage in a counterstrike. It also seemed possible that the Soviet Union might consider launching a preemptive attack against the United States before SDI became operational and Soviet nuclear weapons became useless.

Another major criticism of SDI was its potential violation of the 1972 Anti-Ballistic Missile (ABM) Treaty, which placed limitations on American and Soviet missile defense systems. Supporters of SDI claimed that the ABM Treaty applied only to ground-based systems and not to the space-based systems envisioned by President Reagan.

Impact
Research continued on SDI throughout the 1980’s. With the fall of the Soviet Union, however, the project lost much of its purpose. Many pro-Reagan analysts claim that SDI helped bankrupt the communist system, as the Soviets were forced to spend money they could not afford on technology designed to match or overcome the initiative.

Subsequent Events
Later presidents continued antiballistic missile research, but for regional defense against missiles launched by terrorist groups or rogue nations. In 1993, President Bill Clinton renamed SDIO the Ballistic Missile Defense Organization (BMDO), reflecting its reduced mission from global to regional defense. Even this more modest program remained controversial, however, because it continued to violate the ABM Treaty and was accused by critics of potentially provoking a new Cold War-style arms race.

Further Reading
Guertner, Gary L., and Donald M. Snow. The Last Frontier: An Analysis of the Strategic Defense Initiative. Lexington, Mass.: Lexington Books, 1986. Examination of the scientific and political ramifications of the project.
Linenthal, Edward. Symbolic Defense: The Cultural Significance of the Strategic Defense Initiative. Urbana: University of Illinois Press, 1989. Discussion of how America and the rest of the world perceived SDI and reacted to the project’s expectations and implications.
Reiss, Edward. The Strategic Defense Initiative. Cambridge, England: Cambridge University Press, 2003. Well-documented history of SDI and its impact, with analysis of subsequent projects beyond the 1980’s.
Steven J. Ramold
See also Cold War; Foreign policy of the United States; Intermediate-Range Nuclear Forces (INF) Treaty; Military spending; Reagan, Ronald; Reagan Doctrine; Soviet Union and North America; Weinberger, Caspar.
At the April 14, 1980, Academy Awards ceremony, Meryl Streep holds her Oscar for Best Supporting Actress, awarded for her work in Kramer vs. Kramer. (AP/Wide World Photos)
■ Streep, Meryl
Identification American actor
Born June 22, 1949; Summit, New Jersey

By the mid-1980’s, Streep had become the most admired film actor of her generation.

Building on a broad general education; expert training in theater at Vassar, Dartmouth, and the Yale Drama School; and commercial work that ranged from Off-Broadway to Broadway to made-for-television movies, Meryl Streep entered film acting in the late 1970’s. She immediately made her presence known, most memorably as a young working-class woman loved by two friends in The Deer Hunter (1978). Soon after, Streep advanced to leading roles and became recognized for her meticulous preparation and her ability to handle a wide range of accents and behaviors. Although adept at musical performance and comedy, Streep was celebrated in the 1980’s for her success in a string of powerful dramatic roles,
usually playing strong-willed women. Off screen, Streep was an active proponent of women’s rights and equity within the Screen Actors Guild.

In 1981, Streep appeared in dual roles in the aggressively self-reflexive The French Lieutenant’s Woman, portraying the nineteenth century title character as well as the contemporary actor who plays the part in a film version of the novel. For both the film’s characters, independence becomes a central issue. This theme of independence would resonate across Streep’s film roles and in many of her interviews and published statements. She played a factory worker who finds the strength to confront corrupt plutonium-plant owners in the biopic Silkwood (1983), adventuresome writer Isak Dinesen in Out of Africa (1985), and a former French Resistance patriot unwilling to be satisfied with a domesticated British life in Plenty (1985). In Heartburn (1986), Streep portrayed an embittered betrayed wife and writer. In A Cry in the Dark (1988; also known as Evil Angels), she depicted an Australian mother unjustly accused of killing her child. In these roles, Streep brought to the screen women of courage, intelligence, and determination. Even when playing a destitute drunk in Ironweed (1987) or the tragic Holocaust survivor in Sophie’s Choice (1982), Streep conveyed a core of personal integrity in the most humiliating of circumstances.

Streep won an Academy Award for her performance in Sophie’s Choice and was nominated for another five during the decade, an unprecedented achievement. Her acting was honored at Cannes, by the New York Film Critics Circle, and at the Golden Globe Awards. Additionally, she won the People’s Choice award for Favorite Dramatic Motion Picture Actress five years out of six between 1984 and 1989.

Impact
Meryl Streep raised the bar of expectation for American film performance with her technical skill, careful preparation, and creativity. Despite winning only one of the six Academy Awards for which she was nominated in the 1980’s, Streep became synonymous with Oscar-caliber dramatic performances during the decade.

Further Reading
Cardullo, Bert, et al., eds. Playing to the Camera: Film Actors Discuss Their Craft. New York: Yale University Press, 1998.
Maychick, Diana. Meryl Streep: The Reluctant Superstar. New York: St. Martin’s Press, 1984.
Carolyn Anderson
See also Academy Awards; Feminism; Film in the United States; Theater; Women in the workforce.
■ Sununu, John H.
Identification Governor of New Hampshire from 1983 to 1989 and White House chief of staff from 1989 to 1991
Born July 2, 1939; Havana, Cuba

Sununu’s political success in New Hampshire and his work for George H. W. Bush in the 1988 presidential campaign resulted in his appointment as White House chief of staff, a position he used to advance the causes of the Republican right wing.
Although John H. Sununu was a successful businessman and the president of JHS Engineering Company and Thermal Research (1963-1983), it was his political rather than financial career that was most important during the 1980’s. In 1980, he ran unsuccessfully for a U.S. Senate seat from New Hampshire, losing the primary election to Warren Rudman. After his defeat, he became Rudman’s campaign manager in the general election. Two years later, Sununu bested Hugh Gallen in the gubernatorial election and went on to serve three terms as governor of New Hampshire.

While governor, Sununu opposed raising taxes, brought new businesses to New Hampshire, and supported the controversial Seabrook nuclear power plant. He gained national prominence by serving as the chair of the Coalition of Northeastern Governors, chair of the Republican Governors Association, and chair of the National Governors Association. He was a member of the Council for National Policy from 1984 to 1985 and again in 1988. During the 1988 presidential campaign, he was a key player, helping George H. W. Bush win the key early New Hampshire primary to become the Republican nominee. Sununu also led attacks on Michael Dukakis, the Democratic presidential nominee.

President George H. W. Bush rewarded Sununu for his help by naming him White House chief of staff in 1989, a post that he held until 1991. When Robert Teeter was considered for an appointment as counselor to the president, a position that would have rivaled Sununu’s in importance, Sununu squelched the appointment and became the president’s closest political adviser. He spent about 40 percent of his working day with the president and also served as the president’s legislative liaison with Congress. A staunch ally of the more conservative wing of the Republican Party, Sununu used his influence with the moderate Republican Bush to downplay the importance of the environment, deny access to the disabled, oppose the Clean Air Act, and stop funding for abortions.

Impact
Sununu was described by detractors as the president’s lightning rod; others described him as the president’s “pit bull” and the “bad cop” to the president’s “good cop.” Possessed of a sizable ego, an extremely high IQ, and an abrasive personality, he enjoyed taking on the media. That behavior did not make him popular with more moderate Republicans, however, nor did it endear him to the press. In 1991, Sununu was accused of misusing government aircraft for personal use. The resulting scandal ended with his resignation on December 3, 1991.

Further Reading
Burke, John P. The Institutional Presidency: Organizing and Managing the White House from FDR to Clinton. 2d ed. Baltimore: Johns Hopkins University Press, 2000.
Kessel, John H. Presidents, the Presidency, and the Political Environment. Washington, D.C.: Congressional Quarterly, 2001.
O’Neil, John. The Paradox of Success. New York: Tarcher, 1994.
Thomas L. Erskine
See also Abortion; Bush, George H. W.; Conservatism in U.S. politics; Dukakis, Michael; Elections in the United States, 1988; Environmental movement.
■ Superconductors
Definition Elements, alloys, and ceramic compounds through which electric current flows without resistance
The discovery of high-temperature superconductors in the late 1980’s by European and American researchers was quickly recognized as extremely important by both scientists and journalists, who emphasized such potential applications as magnetically levitated trains.

The discovery and development of superconductors, as well as their theoretical explanation and practical applications, were due to the collective efforts of many Europeans and Americans throughout the twentieth century. In 1911, the Dutch physicist Heike Kamerlingh Onnes made the surprising discovery that mercury, when cooled in liquid helium to near absolute zero (about 4 Kelvins), permitted electricity to flow through it without resistance. A few decades later, German researchers found that superconductors repelled magnetic fields, a phenomenon later called the Meissner effect, after one of its discoverers. In the late 1950’s, three American physicists, John Bardeen, Leon Cooper, and John Schrieffer, explained superconductivity in a mathematical theory based on the movement of electron pairs. This theory became known as the BCS theory, from the first letter of each physicist’s last name. Impressive as these early developments were, the 1980’s became a nonpareil period of momentous discoveries in superconductivity.

High-Temperature Superconductivity
Before the 1980’s, all superconductivity research took place at temperatures close to absolute zero, but in 1986, the Swiss physicist Karl Alexander Müller and his younger German colleague, Johannes Georg Bednorz, working at the International Business Machines (IBM) research laboratory in Switzerland, made a ceramic compound composed of the elements lanthanum, barium, copper, and oxygen. To their surprise, this compound superconducted at the highest temperature then known for any substance, 35 Kelvins. When the researchers published this discovery, it stimulated the search for substances that superconducted at even higher temperatures, and it also led to their winning the 1987 Nobel Prize in Physics.

Karl Alexander Müller, left, and Johannes Georg Bednorz won the 1987 Nobel Prize in Physics for synthesizing a new ceramic substance capable of superconductivity at 35 degrees Kelvin. (IBM Corporation, AIP Emilio Segrè Visual Archives)

Two researchers who built on the work of Bednorz and Müller were Maw-Kuen Wu at the University of Alabama in Huntsville and Paul (Ching-Wu) Chu at the University of Houston. Early in 1987, Wu and his students made a new ceramic material composed of yttrium, barium, copper, and oxygen (YBCO) that appeared to superconduct at a temperature much higher than any previous material. Paul Chu, Wu’s doctoral dissertation adviser, used his sophisticated equipment to observe a transition in the magnetic susceptibility of the YBCO ceramic material at the astonishing temperature of 92 Kelvins. The relatively high temperature of YBCO and other materials’ superconductivity made it possible to conduct research at temperatures above 77 Kelvins, using the safer and more economical liquid nitrogen rather than liquid helium.

Many scientists called high-temperature superconductors the “discovery of the decade,” and research in the field exploded throughout the rest of the 1980’s. Within six months of the initial discoveries, more than eight hundred papers were published. These papers reported on the physical and chemical properties of the new materials, as well as on their detailed atomic arrangements. A variety of new ceramic superconductors were made that challenged the tenets of the BCS theory. This theory had successfully explained superconductivity in the range of 4 to 40 Kelvins, but it had predicted a limit of about 40 Kelvins for superconductive materials. Many scientists quickly saw that the BCS theory was inadequate to make sense of the new ceramic superconductors.

Impact
The race to commercialize the epochal 1986 and 1987 discoveries of high-temperature superconductors accelerated through the remaining years of the 1980’s and beyond. The first company to take advantage of these discoveries was the firm later known as ISCO International, which introduced a sensor for medical equipment. Researchers around the world wrestled with the formidable problems of fabricating wires from brittle ceramic substances. If such wires could be made, energy efficiencies in various electrical devices would be dramatically improved. Major corporations around the world invested heavily in the research and development of superconductors because of their potential to render more efficient computers, magnetically levitated (maglev) trains, and many other machines. Physicists did make progress in creating materials that superconducted at 125 Kelvins and 138 Kelvins. However, because basic questions about the structure and behavior of these new materials needed to be answered and because many manufacturing and marketing problems needed to be solved, it turned out that the road from discovery through research and development to successful application was more tortuous than early enthusiasts had initially envisioned.

Further Reading
Hazen, Robert M. The Breakthrough: The Race for the Superconductor. New York: Summit Books, 1988. Hazen, who was involved in Wu and Chu’s discovery of the yttrium superconductor, provides a vivid, behind-the-scenes account of the scientists and the research of this great breakthrough.
Mayo, Jonathan L. Superconductivity: The Threshold of a New Technology. Blue Ridge Summit, Pa.: TAB Books, 1988. After introducing readers to the basics of superconductivity, Mayo emphasizes the possible applications of high-temperature superconductors for computers, medicine, and transportation. Helpful glossary.
Schechter, Bruce. The Path of No Resistance: The Story of the Revolution in Superconductivity. New York: Simon & Schuster, 1989. After a survey of early work on superconductivity, Schechter concentrates on the pivotal discoveries of Bednorz and Müller, as well as Wu and Chu. Emphasis is on superconductor theory, applications, and possible commercialization.
Simon, Randy, and Andrew Smith. Superconductors: Conquering Technology’s New Frontier. New York: Plenum Press, 1988. Intended for readers with no background in physics. Analyzes the nature, history, and theories of superconductivity, as well as the possible influence of discoveries in high-temperature superconductivity on future technologies.
Tinkham, Michael. Introduction to Superconductivity. 2d ed. New York: Dover Books, 2004. Accessible to science students with some knowledge of physics and mathematics, this book contains a historical overview and an analysis of the principal experiments and theories in the field, including high-temperature superconductors.
Robert J. Paradowski
See also Computers; Inventions; Nobel Prizes; Science and technology.
■ Superfund program
Identification Program setting up a general fund to pay for cleanup of hazardous waste sites
The Superfund was a joint public and private program designed to facilitate cleanup of the worst American toxic waste sites. However, it was underfunded and litigation intensive, and it did not represent a permanent total solution to the nation’s hazardous waste problem.

The Superfund program, started in 1980, was the United States’ primary answer to the growing problem of toxic waste dumps in the 1980’s. That problem burst into the national consciousness in the 1970’s, when an entire neighborhood in upstate New York was declared unlivable as a result of toxic waste: At Love Canal, near Niagara Falls, a subdivision had been built over a former toxic waste dump. After many protests, eight hundred families living there were relocated. Even after the national scope of the toxic waste problem was acknowledged, however, President Jimmy Carter was unable to convince Congress to pay for a long-term nationwide cleanup program. The compromise reached was the Superfund, created by the Comprehensive Environmental Response, Compensation, and Liability Act (1980).

The Superfund was promoted as a way to clean up the various toxic waste dumps around the country without massive federal aid. The Superfund’s moneys come from taxes on petroleum products and chemicals. To be eligible for cleanup using Superfund money, a contaminated site must be placed on an official list of Superfund sites. It must first be nominated for inclusion and then inspected and certified. Sites on the Superfund list are ranked by priority, which is important, because the Superfund has never collected nearly enough revenue to clean up most of the sites on the list. Thus, only those near the top have a chance of receiving government-funded attention. When it was recognized that the Superfund’s federal revenues were inadequate, the government sought to force polluting companies to clean up the contamination they had produced. Such companies could be fined tens of thousands of dollars per day if they refused to comply. These companies often then sued other responsible parties, either to recoup the costs of cleanup or to recover the money spent to pay fines.

While the Love Canal disaster was the main incident that helped lead to the Superfund legislation, it was not the only environmental disaster of the 1970’s and 1980’s. During the mid-1980’s, a wide variety of environmental issues surfaced. In Missouri, the entire town of Times Beach was closed and relocated because of dioxin contamination. The town had hired a contractor to pour oil on its dirt roads to contain dust, and the contractor used oil that had two thousand times the level of dioxin present in many herbicides. In New Jersey, huge quantities of medical waste washed up on the state’s shores in 1987 and 1988, forcing the closing of many beaches. The medical waste came from a New York City landfill.

Impact
The Superfund attempted to create a government-private partnership to clean up hazardous waste dumps in the United States. However, the government’s funding was inadequate, and the private companies failed to contribute funds of their own until they were forced to do so. As a result, much litigation ensued and only a small percentage of the sites were cleaned up.

Superfund workers conduct drilling operations for soil sampling at Bruin Lagoon in Bruin, Pennsylvania. (U.S. Army Corps of Engineers)

Further Reading
Colten, Craig E., and Peter N. Skinner. The Road to Love Canal: Managing Industrial Waste Before EPA. Austin: University of Texas Press, 1996. Examines how industrial waste was handled before 1970, when the Environmental Protection Agency was established.
Dixon, Lloyd S. The Financial Implications of Releasing Small Firms and Small-Volume Contributors from Superfund Liability. Santa Monica, Calif.: RAND Institute for Civil Justice, 2000. This short work examines the implications of a proposal to exempt small businesses from Superfund regulations.
Mazur, Allan. A Hazardous Inquiry: The Rashomon Effect at Love Canal. Cambridge, Mass.: Harvard University Press, 1998. Studies the Love Canal environmental disaster from a wide variety of perspectives, including those of the school board, the industry, and the public.
Stephenson, John B. Superfund Program: Updated Appropriation and Expenditure Data. Washington, D.C.: U.S. General Accounting Office, 2004. Updates information about how much Superfund money has been gathered and spent since passage of the law creating the fund.
Scott A. Merriman
See also Environmental movement; Ocean Ranger oil rig disaster; Times Beach dioxin scare; Water pollution.
■ Supreme Court decisions
Definition Rulings made by the highest court in the United States

A number of U.S. Supreme Court decisions had an impact during the 1980’s and future decades.

Chief Justice Warren E. Burger headed the Supreme Court from 1969 until his retirement in 1986, when President Ronald Reagan appointed conservative federalist William H. Rehnquist as chief justice. The Supreme Court handed down a wide range of decisions during the 1980’s that affected abortion, affirmative action, women’s and gay rights, education, and freedom of speech and religion.

Abortion
After the Supreme Court’s landmark Roe v. Wade (1973) decision legalizing abortion, state and local governments immediately started passing complex laws aimed at weakening the impact of Roe. As a result, the Court spent much of the 1980’s hearing challenges to these state laws, and antiabortion groups hoped that the Court’s rulings in these cases would ultimately lead to the overturning of Roe.

After Roe, the federal government’s Medicaid program began covering the costs of abortions for low-income women. In 1976, Congress passed the Hyde Amendment, which barred the use of Medicaid funds for abortions except when the mother’s life was in danger and in cases of rape or incest. A group of indigent women sued the federal government, challenging the constitutionality of the amendment. In the first of many abortion decisions during this decade, Harris v. McRae (1980) upheld the constitutionality of the Hyde Amendment. The Court ruled that a woman’s right to terminate a pregnancy did not entitle her to receive government funding for that choice.

In three 1983 decisions, City of Akron v. Akron Center for Reproductive Health, Planned Parenthood Association of Kansas City v. Ashcroft, and Simopoulos v. Virginia, the Court struck down state and local laws that, among other things, imposed a twenty-four-hour waiting period between the signing of a consent form and having an abortion and required minors to receive parental consent before having an abortion. Citing its 1983 abortion decisions, the Court in Thornburgh v. American College of Obstetricians and Gynecologists (1986) overturned portions of a Pennsylvania antiabortion law because it infringed on a
woman’s right to an abortion. The Court said that states could not require doctors to inform women seeking abortions about potential risks and about available benefits for prenatal care and childbirth.

Webster v. Reproductive Health Services (1989) was the Court’s last abortion decision that pro-choice advocates believed weakened Roe v. Wade. The 5-4 decision upheld a Missouri law that barred state employees and facilities from performing abortions. The Webster ruling was narrow in that it did not affect private doctors’ offices or clinics where most abortions were performed. However, Webster did give state legislatures new authority to limit a woman’s right to an abortion without reversing Roe v. Wade.

Affirmative Action and Discrimination
During the 1980’s, the federal government had various laws and affirmative action programs in place in an effort to encourage more minorities and women to earn college degrees or to own their own businesses.
Chief Justice Warren E. Burger presided over the Court during the first half of the 1980’s. (Library of Congress)
During the decade, the Internal Revenue Service (IRS) stopped providing tax-exempt status to private schools that discriminated against African Americans, and two religious institutions that had race-based admission policies sued the government to regain their tax-exempt status. In Bob Jones University v. United States (1983), the Supreme Court upheld the IRS’s authority to deny tax-exempt status for private religious schools that practiced racial discrimination. The Court ruled that the government’s interest in the eradication of racial discrimination outweighed a school’s need for tax-exempt status when that school discriminated based on race.

In Fullilove v. Klutznick (1980), nonminority contractors challenged Congress’s decision to set aside 10 percent of federal public works funding for minority contractors. A deeply divided Court held that the minority set-aside program was a legitimate exercise of congressional power to remedy past discrimination. This decision reaffirmed Congress’s right to set racial quotas to combat discrimination.

The Court handed down various decisions during the decade that centered on racial discrimination in the criminal justice system. In Batson v. Kentucky (1986), the Court ruled that an African American man had not received a fair trial because the prosecuting attorney had deliberately disqualified all the potential African American jurors during the selection process, resulting in him being convicted by an all-white jury. The Court ruled that attorneys who rejected qualified prospective jurors solely on the basis of their race violated the Constitution. In its Vasquez v. Hillery (1986) decision, the Court ruled that the conviction of any defendant indicted by a grand jury from which members of his or her race had been illegally excluded must be reversed. In Turner v. Murray (1986), the Court held that an African American defendant facing a possible death penalty for the murder of a white victim was entitled to have prospective jurors questioned about racial bias.

Women’s and Gay Rights
Title IX of the Education Amendments of 1972 banned sex discrimination at colleges and universities that received federal funding. The federal government would cut off government grants and student loans to schools that discriminated against women. In Grove City College v. Bell (1984), the Supreme Court upheld the federal requirement and ruled that in order for colleges and
universities to continue to receive federal funding, they must comply with Title IX.

In Rostker v. Goldberg (1981), the Court refined the limits of sexual equality by ruling that women may be excluded from the military draft. Unlike other areas in which the judicial body had struck down male-female distinctions, the Court ruled that Congress may discriminate between men and women when it came to the draft because it was based on the need for combat troops and not equity. While the Court refused to allow women to become part of the military draft, it did give women access to private men’s clubs. In Roberts v. United States Jaycees (1984) and Rotary International v. Rotary Club of Duarte (1987), the Court ruled that private, men-only clubs could not exclude women from their membership.

The Court handed down two decisions during the 1980’s that dealt with gay rights issues. In a case involving a homosexual man arrested in his own bedroom, the Court decided for the first time whether states could be allowed to regulate private sexual activities between consenting adults. In Bowers v. Hardwick (1986), the Court upheld a Georgia antisodomy law that made it a crime to engage in homosexual acts even in the privacy of one’s home. In Webster v. Doe (1988), the Court allowed a former Central Intelligence Agency (CIA) employee to sue the government agency for firing him because the agency considered him to be a threat to national security because he was a homosexual.

Education
The Supreme Court was asked to decide if states could allow taxpayers to deduct from their state income taxes tuition and other expenses for their children’s religious elementary or secondary school education. Minnesota law permitted such deductions, and some taxpayers sued the state, arguing that the law violated the establishment clause separating church and state. In its Mueller v. Allen (1983) decision, the Court ruled that a state tax deduction for education expenses was constitutional because the law had the secular purpose of ensuring that children were well educated and did not “excessively entangle” the state in religion.

The clash over the teaching of creationism and evolution in the public schools reached the Supreme Court in 1987. In Edwards v. Aguillard, the Court ruled that Louisiana public schools that taught evolution could not be required to teach creationism as “creation science” if such a requirement
was intended to promote a religious belief. The Court said that the state law had no secular purpose and endorsed religion in violation of the Constitution’s establishment clause of the First Amendment. In Wallace v. Jaffree (1985), the Supreme Court struck down an Alabama law that allowed public school teachers to hold a one-minute period of silence for “meditation or voluntary prayer” each day. The Court did not rule that the moment of silence was itself unconstitutional. Rather, it held that Alabama lawmakers had passed the law solely to advance religion, thereby violating the First Amendment.

How much freedom of speech minors should have at school was the issue before the Court in two First Amendment cases during the 1980’s. In Bethel School District v. Fraser (1986), the Court upheld a school district’s suspension of a high school student for delivering a speech that contained “elaborate, graphic, and explicit sexual” metaphors. The Court ruled that the First Amendment did not prevent school officials from prohibiting vulgar and lewd speech that would undermine the school’s basic educational mission. Two years later, the Court handed down a similar decision that First Amendment advocates considered to be a major setback in protecting students’ rights to freedom of expression. The Court held in Hazelwood School District v. Kuhlmeier (1988) that a school principal could censor the content of a student newspaper if that newspaper was part of a class assignment and not a forum for public discussion.

However, students did win an important Supreme Court victory during the 1980’s in the area of school censorship. Steven Pico was one of five students who challenged their school board’s decision to remove books from their high school library because they were “anti-American, anti-Christian, anti-Semitic, and just plain filthy.” The Court ruled in Island Trees School District v. Pico (1982) that school officials could not remove books from school library shelves simply because they disliked the ideas contained in those books.

Freedom of Speech
During the 1980’s, the Supreme Court heard a variety of First Amendment cases ranging from copyright issues to the public’s access to court trials. By the 1980’s, many Americans were using videocassette recorders (VCRs) to record
their favorite television programs while they were away from home. Movie producers believed that this use of the VCR was a violation of copyright law and sued the VCR manufacturer. However, the Court ruled in Sony Corp. of America v. Universal City Studios, Inc. (1984) that the home use of VCRs to tape television programs for later viewing (“time shifting”) did not violate federal copyright law.

Evangelist Jerry Falwell believed that the First Amendment did not give pornography publisher Larry Flynt the right to publish a fake ad poking fun at him and his deceased mother. However, in its 1988 Hustler Magazine v. Falwell decision, the Court decided in favor of Flynt and Hustler, ruling that satire and parody were protected forms of free speech.

The American flag is a symbol associated with freedom, nationalism, patriotism, and sometimes militarism. Protesters sometimes burn the flag to demonstrate their opposition to a government policy. Outside the 1984 Republican National Convention in Dallas, Texas, Gregory Lee Johnson burned a flag in protest against President Ronald Reagan’s policies. Johnson was arrested under the state’s flag desecration statute. In its 5-4 ruling in Texas v. Johnson (1989), the Court struck down the Texas flag desecration law as well as similar laws in forty-eight states by ruling that flag burning was a constitutionally protected form of symbolic speech.

The Court reaffirmed that the public’s right to have access to the courts could outweigh a defendant’s desire to keep the public out of the courtroom. In Richmond Newspapers v. Virginia (1980), the Court ruled that a trial judge’s order to close the courtroom to the public and media during a murder trial was unconstitutional. The Court ruled that the arbitrary closing of a courtroom to avoid unwanted publicity violated the First Amendment and that the closure of court hearings was permissible only under unusual circumstances.

Public broadcasting, unlike commercial broadcasting with its advertisers, is dependent on the government for much of its funding. In the 1980’s, the government tried to prevent public radio and television stations from voicing opinion through editorials. In its 1984 FCC v. League of Women Voters of California decision, the Court struck down the federal regulation that prohibited any noncommercial educational station receiving government funding from engaging in editorializing. The Court ruled that this
U.S. Supreme Court Justices During the 1980’s

Justices are nominated to the Supreme Court by the president and approved by the U.S. Senate. The table below lists the justices who served during the 1980’s. The names are placed in the order in which they took the judicial oath of office and thereby started their tenure on the court. Asterisks (*) indicate the terms of chief justices.

Justice                      Term
William J. Brennan           1956-1990
Potter Stewart               1958-1981
Byron White                  1962-1993
Thurgood Marshall            1967-1991
Warren E. Burger             1969-1986*
Harry A. Blackmun            1970-1994
Lewis F. Powell, Jr.         1972-1987
William H. Rehnquist         1972-1986, 1986-2005*
John Paul Stevens            1975-
Sandra Day O’Connor          1981-2006
Antonin Scalia               1986-
Anthony Kennedy              1988-
regulation violated the free speech rights of public broadcasters because it curtailed the expression of editorial opinion that was at “the heart of First Amendment protection.”

Freedom of Religion
The establishment clause of the First Amendment bars the government from preferring one religion over another. During the 1980’s, the Supreme Court handed down a number of decisions that challenged the separation of church and state. The constitutionality of including religious symbols in public holiday displays came before the Court in Lynch v. Donnelly (1984) and again in Allegheny County v. Greater Pittsburgh ACLU (1989). In Lynch, the Court ruled that an annual city park Christmas display that included a nativity scene was constitutional because the scene was displayed with
other Christmas symbols and was used to promote retail sales and goodwill, not to endorse a particular religion. In Allegheny County, however, the Court ruled that a nativity scene placed inside the Allegheny County Courthouse with the words “Gloria in Excelsis Deo,” referring to the words sung by the angels at the Nativity (Luke 2:14), did endorse religion and violated the Constitution. At the same time, the Court upheld the display of a nearby menorah, which appeared along with a Christmas tree and a sign saluting liberty, reasoning that the combined display of the tree, the sign, and the menorah did not endorse one particular religion but instead recognized that both Christmas and Hanukkah were part of the same winter-holiday season, which had a secular status in society.

Members of the religious movement International Society for Krishna Consciousness wanted to walk among the crowd, distribute flyers, and solicit donations during the Minnesota State Fair. However, state fair organizers required the group to distribute its literature in a fixed location along with the other fair vendors. The Court ruled in Heffron v. International Society for Krishna Consciousness (1981) that the Krishna members had not been discriminated against, because fair organizers had treated all groups the same, regardless of their religious or political affiliations. The Court also ruled that the fair organizers had a legitimate interest in confining vendors to a designated space because of the need to avoid congestion with the large amounts of pedestrian traffic at the fair.

During the 1980’s, some states passed “blue” laws requiring businesses and sporting events to be closed on Sundays and “Sabbath” laws that required employers to give employees the day off on their chosen day of worship. However, in Thornton v. Caldor (1985), the Court ruled that state laws that endorsed a specific religious practice, like observing a Sabbath, were unconstitutional.

In Goldman v. Weinberger (1986), S. Simcha Goldman, an Orthodox Jew and ordained rabbi serving as an officer in the U.S. Air Force, sued the military after being punished for wearing his yarmulke (skullcap) indoors while in uniform, in violation of military regulations. The Court upheld the Air Force penalties against Goldman, ruling that the military’s interest in enforcing its dress code outweighed the officer’s religious obligation to keep his head covered.
Pornography
During the 1980’s, antipornography groups and some feminists convinced some state and local governments that pornography should be banned because it violated women’s civil rights by portraying them as sex objects and could be linked to violence against women. Cities and states began passing antismut laws that allowed women to sue porn producers and distributors if the women could prove that they had been harmed by the pornographic material. In American Booksellers Association v. Hudnut (1985), the Court struck down an Indianapolis antipornography ordinance, ruling that the law and others like it were unconstitutional.

The selling of pornography moved to the telephone lines when pornographers began providing sexually oriented telephone services known as “dial-a-porn.” Congress immediately passed a law making dial-a-porn illegal. In Sable Communications of California, Inc. v. FCC (1989), a unanimous Court overturned the federal law, ruling that it violated the free speech rights of pornographers. The Court said that the dial-a-porn industry should be regulated to protect minors, but it could not be outlawed altogether because banning dial-a-porn would deny adults access to this sexually oriented telephone service.

Throughout the 1980’s, the Court reaffirmed that pornography deserved some First Amendment protection, but not at the same level as political speech. Despite the legal victories pornographers won during the 1980’s, the Court reminded them that there were still limits to how much free speech protection pornographers possessed. Some cities regulated pornography by moving X-rated movie theaters away from churches, schools, homes, or parks. Despite arguments that such zoning laws were a form of censorship and violated free speech, the Court in Renton v. Playtime Theatres (1986) ruled that state and local governments could use zoning laws to restrict the location of theaters that showed sexually explicit films without violating pornographers’ First Amendment rights.

Impact
Through the rulings handed down by the Supreme Court during the 1980’s, women no longer could be forced to sign a consent form or wait twenty-four hours before having an abortion. However, the Court did give state governments more authority in restricting a woman’s ability to have an abortion. Women were excluded from the military draft, but African Americans could not be excluded
from jury duty solely because of their race. Colleges and universities could still lose federal funding or their tax-exempt status for gender or race discrimination. Cities could not ban pornography on the basis that it discriminated against women, but they could zone X-rated movie theaters away from neighborhoods and children. State and local governments had to make sure that any public displays of religious objects were in a secular context and did not promote a particular religion. School officials had more authority in controlling the inappropriate speech of their students but less authority in imposing religious-based regulations such as school prayer disguised as a “moment of silence” and the teaching of creationism. Parents could deduct from their state income taxes expenses related to sending their children to religious schools, but they could not force school officials to censor books from the school libraries. Protesters could burn the American flag as a form of protected symbolic speech. The government could not refuse federal funding to public radio and television stations that aired editorials. Consumers could use their VCRs to record their favorite programs without fear of violating federal copyright law. The impact of these Supreme Court decisions continued to resonate throughout U.S. politics and culture long after the decade was over.

Further Reading
Harrison, Maureen, and Steve Gilbert, eds. Abortion Decisions of the United States Supreme Court: The 1980’s. Beverly Hills, Calif.: Excellent Books, 1993. For the general reader, a discussion of all the abortion cases decided by the Supreme Court from Harris v. McRae through Webster v. Reproductive Health Services. Irons, Peter. A People’s History of the Supreme Court. New York: Viking Press, 1999. A general history of the Supreme Court and its most significant decisions placed in their cultural and political context. McCloskey, Robert G. The American Supreme Court. 4th ed. Chicago: University of Chicago Press, 2005. A classic work offering a concise introduction into the workings of the Supreme Court and its role in constructing the U.S. Constitution and its role in U.S. politics. Eddith A. Dashiell See also Abortion; Affirmative action; Bowers v. Hardwick; Education in the United States; Feminism;
Flag burning; Homosexuality and gay rights; Hustler Magazine v. Falwell; Meritor Savings Bank v. Vinson; O’Connor, Sandra Day; Pornography; Racial discrimination; Rehnquist, William H.; Roberts v. United States Jaycees; Thompson v. Oklahoma; Webster v. Reproductive Health Services; Women’s rights.
■ Swaggart, Jimmy Identification
Pentecostal minister and televangelist
Born March 15, 1935; Ferriday, Louisiana
A popular televangelist known for his dramatic, "spirit-filled" preaching style, Swaggart left public ministry during the late 1980's in the wake of news of his reported involvement with a prostitute. Swaggart tearfully confessed his sins to his followers in a memorable videotaped sermon.
Jimmy Swaggart began building his Assemblies of God ministry in the 1960's after refusing a gospel music recording deal from Sun Records, the home label of Swaggart's cousin Jerry Lee Lewis. Instead of pursuing a recording career, Swaggart chose to use his musical talents to inspire conversions to Christianity. Over the next several years, he built an empire that eventually included television and radio ministries and the Jimmy Swaggart Bible College near his church in Baton Rouge. At the height of Swaggart's career in the mid-1980's, Jimmy Swaggart Ministries posted income of $140 million per year. However, in 1988—just a year after Jim Bakker, another Assemblies of God minister, confessed to his own financial and sexual indiscretions—it was revealed that Swaggart had solicited the services of prostitutes and that the trysts featured sexual practices that many of his followers thought depraved. Rumors persisted that it was Swaggart's preaching rival, Marvin Gorman of New Orleans, who had tracked and photographed his liaisons and then used the surveillance to undo the more popular Swaggart. After his sexual practices were made public, Swaggart on February 21, 1988, preached a tearful sermon to his loyal followers—a group that was dwindling by the day. The sermon was broadcast, excerpted, and analyzed nationwide. Swaggart retained his standing as an ordained minister under the General Presbytery of the Assemblies of God until May of 1988, when he was expelled for violating that group's injunction against public preaching during a term of suspension. Though he lost his media empire, Swaggart continued to preach at what became the Family Worship Center in Baton Rouge. Impact More than any other figure, Jimmy Swaggart became associated with the ups and downs of televangelism in the 1980's. At the height of his career, he attracted an extremely large following—and equally sizable donation revenues—with his emotional and passionate preaching style. His downfall, with its related tales of ministry rivalry and sexual escapades, was a public blow to an industry always in search of greater credibility and respect.
A scandal-ridden Jimmy Swaggart addresses the press in April, 1988. (AP/Wide World Photos)
Further Reading
Balmer, Randall. "Still Wrestling with the Devil: A Visit with Jimmy Swaggart Ten Years After His Fall." Christianity Today, March 2, 1998, 31.
Seaman, Ann Rowe. Swaggart: The Unauthorized Biography of an American Evangelist. New York: Continuum, 1999.
Jennifer Heller
See also Bakker, Jim and Tammy Faye; Falwell, Jerry; Heritage USA; Religion and spirituality in the United States; Robertson, Pat; Scandals; Televangelism; Television.
■ Synthesizers Definition
Electronic devices for producing and manipulating sound, especially music
Synthesizers became common in popular music during the 1980’s in genres ranging from rock to country, as well as classical, stage, and sound track music. Analog synthesizers had existed since the early 1960’s. While their sound was not realistic, it created unique musical effects that opened up new avenues of experimentation for composers and performers. Nevertheless, analog synthesizers were able to produce only one note at a time. Technologies developed in the 1970’s made synthesizers capable of polyphony and of more realistic sounds, but the initial versions were expensive. The 1980’s saw these technologies become practical and affordable. Companies such as Casio, Yamaha, Roland, Kurzweil, and Korg pioneered the decade’s new synthesizer technologies, dominating the industry. Casio and Yamaha broke into the retail-priced keyboard market in 1980. Korg introduced the first mid-level polyphonic synthesizer in 1981. Yamaha took its place in the music industry by licensing the technology for frequency-modulated (FM) synthesis from Stanford University, achieving more realistic sounds by combining waveforms. Another breakthrough of the decade was digital music sampling. The same digital technology that recorded compact discs was used to record “samples” of real instrument sounds. Kurzweil introduced the first sampling keyboard in 1983, but Roland’s version, released in 1985, was both more affordable and more nuanced, closely mimicking the sound of a real piano. Casio released a low-end sampling keyboard in 1986.
As digital music evolved, interconnectivity became a concern. In 1983, the industry-wide Musical Instrument Digital Interface (MIDI) standardized interfaces and data formats, so users could connect synthesizers, keyboards, computers, and other devices to one another. MIDI would be updated to deal with newer problems and technologies.
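The data format that MIDI standardized is concrete enough to show directly. Below is an illustrative sketch, assuming Python; the helper function is hypothetical, but the three-byte layout is the actual MIDI channel-message format.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a three-byte MIDI Note On message.

    The status byte 0x90 marks "Note On"; its low four bits carry
    the channel (0-15). The note number and velocity follow as data
    bytes, each limited to seven bits (0-127).
    """
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])


# Middle C (note number 60) at moderate loudness on channel 0:
message = note_on(channel=0, note=60, velocity=64)  # bytes 0x90 0x3C 0x40
```

Because every manufacturer agreed on messages like this one, a keyboard from one company could drive a sound module or computer sequencer from another, which is what made the interconnected home and studio rigs of the mid-1980's possible.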
Impact Gradually, these new technologies removed the stigma once attached to electronic instruments. New genres, such as synthpop and New Age music, specialized in the use of synthesizers. In 1981, composer Vangelis produced the sound track to the film Chariots of Fire almost entirely on synthesizers. After his success with Cats (pr. 1982), theatrical composer Andrew Lloyd Webber continued to use synthesizers in his shows and adopted a digital piano as his instrument of choice for composing. Sampling also became a crucial technology, as artists learned to manipulate sampled sounds, especially voices and snippets of other musicians' work, to create new musical effects. During the 1980's, synthesizers became commonly accessible and were used by average people. Churches began purchasing them in place of organs. Where musical instruments had previously been relegated to specialty stores, department stores and electronics stores began dedicating shelf space to synthesizers. More affordable, portable, and space-saving than pianos, the instruments were frequently found in homes. Many products were marketed that encouraged people to teach themselves how to play keyboard instruments with computerized assistance.
Further Reading
Friedman, Dean. Complete Guide to Synthesizers, Sequencers, and Drum Machines. London: Music Sales, 1985.
Jenkins, Mark. Analog Synthesizers: Understanding, Performing, Buying—From the Legacy of Moog to Software Synthesis. St. Louis: Focal Press, 2007.
Russ, Martin. Sound Synthesis and Sampling. 2d ed. St. Louis: Focal Press, 2004.
John C. Hathaway
See also Compact discs (CDs); Computers; Consumerism; Inventions; MTV; Music; New Wave music; Pop music; Science and technology; Vangelis.
T
■ Tabloid television Definition
Sensationalistic television newsmagazine programs modeled after print tabloids
Meant to be the televised equivalent of tabloid newspapers, tabloid television shows established a foothold in both daytime and prime time during the 1980's. Their proliferation blurred the line between entertainment and mainstream traditional news; the public accepted this shift, and network news programs modified their formats in response to their tabloid competition.
Tabloid television in the 1980's encompassed a range of programming that used provocative titles, an exaggerated style, and content related to crime, sex, celebrity gossip, and other outlandish or sensational subjects. The television landscape was changing during the decade as a result of the proliferation of cable channels, as well as an increase in the number of independent broadcast television stations and the establishment of FOX as a viable fourth network. The explosion in the number of channels entailed a demand for content and created a thriving marketplace for syndicated programming—inexpensive alternative programming that was sold to individual stations or groups of stations, rather than entire networks. Unscripted, nonfiction programming, moreover, was among the least expensive such programming to produce. Rupert Murdoch had already built an international tabloid newspaper empire when he bought FOX. He programmed the fourth network with such "reality" and tabloid shows as COPS, A Current Affair, and America's Most Wanted. The latter series was the first FOX show to break into the Nielsen ratings' top fifty. As a man who had been the subject of tabloid attention because of his son's kidnapping and murder, John Walsh was a perfect host for America's Most Wanted.
Tabloid talk show host Geraldo Rivera's nose was broken in a brawl with white supremacists that broke out during a taping of his show on November 3, 1988. (AP/Wide World Photos)
Tabloid Television Genres Three formats or genres of nonfiction programming became the basis for tabloid television. The first format, a type of "reality" television, used minicams to capture documentary footage of law enforcement or rescue personnel performing their duties. Extraordinary amounts of footage were shot and carefully edited to heighten the drama, and staged reenactments sometimes substituted for actual footage. FOX built early prime-time success on COPS, a show in which viewers were offered the experience of riding along in police patrol cars in different cities around the country. The second format, the tabloid newscast or documentary, copied the appearance of a nighttime
newscast or documentary but defied accepted journalistic standards. Examples of the format included A Current Affair (also a FOX creation), Hard Copy, America's Most Wanted, and Unsolved Mysteries, which relied heavily on reenactment. America's Most Wanted and Unsolved Mysteries added audience participation to the tabloid format. The third format or genre of tabloid television was the tabloid talk show. Usually, such a show's host posed a question in each episode or segment, and guests represented various sides of the featured issue. Phil Donahue and Oprah Winfrey both hosted their shows from Chicago as they rose to national prominence. Winfrey was credited with legitimizing the daytime talk show by bringing a level of sincerity to controversial subjects. Morton Downey, Jr., was far more provocative. He did not interview guests so much as scream at and belittle them. He treated members of his audience in the same fashion. Downey's show aired only in late-night time slots. Few people personified tabloid television more than did Geraldo Rivera, an Emmy and Peabody award-winning reporter. Rivera's career turned to the sensational when he hosted a special titled The Mystery of Al Capone's Vaults in 1986. It was revealed on live television that there was in fact nothing of significance inside the vault, but the program was highly rated for a syndicated special. Within a year, Rivera had his own talk show and covered everything from cross-dressers to neo-Nazis. In one show featuring neo-Nazis, a brawl broke out, and a skinhead broke Rivera's nose with a thrown chair.
Impact Tabloid television brought changes in local and national news, forcing journalists to compete for stories that they once would have ignored. To the distress of many longtime journalists, by the end of the 1980's, network news departments were examining successful tabloid programs to see what aspects they could incorporate in their own newscasts and news specials.
Further Reading
Glynn, Kevin. Tabloid Culture: Trash Taste, Popular Power, and the Transformation of American Television. Durham, N.C.: Duke University Press, 2000. Analyzes tabloid television's effect on television in general.
Kearns, Burt. Tabloid Baby. New York: Celebrity Books, 1999. Kearns—the producer of A Current Affair who had worked in print media for Rupert Murdoch—discusses his career in both print and television tabloids.
Kimmel, Daniel M. The Fourth Network: How Fox Broke the Rules and Reinvented Television. Chicago: Ivan R. Dee, 2004. History of the FOX Network through 2000, with accounts by insiders.
Krajicek, David J. Scooped! Media Miss Real Story on Crime While Chasing Sex, Sleaze, and Celebrities. New York: Columbia University Press, 1998. A print journalist examines tabloidization and its negative effects on legitimate news coverage.
Povich, Maury, and Ken Gross. Current Affairs: A Life on the Edge. New York: Putnam's, 1998. Talk show host Maury Povich's account of A Current Affair.
Nancy Meyer
See also America's Most Wanted; Cable television; Crime; FOX Network; Journalism; Network anchors; Rivera, Geraldo; Talk shows; Television; Winfrey, Oprah.
■ Talk shows Definition
Television programs in which the host, guests, and audience engage in topical conversations
Talk shows of the 1980's reflected changing demographics in society and offered television viewers opportunities for self-improvement through relevant topics and "infotainment."
Since the beginning of television, talk shows have been a popular programming choice of both producers and viewers. With appealing hosts, interesting guests, and timely topics, talk shows have consistently attracted large and loyal audiences. During the 1980's, however, talk shows not only maintained their popularity but also saw an increase in viewership, resulting in an explosion of talk show formats, increased competition, diverse hosts, and groundbreaking subject matter.
Sociological Changes
Prior to the 1980's, talk shows and news programs generally were hosted by white males such as Johnny Carson, Phil Donahue, and Dick Cavett. On the heels of the civil rights movements of the 1960's and 1970's, the 1980's witnessed great shifts in traditional demographics. Minorities became recognized and sought-after
consumers, and television producers had to adjust the face of talk shows and their personalities to attract and accommodate this growing and diverse new demographic. Women and minorities such as Oprah Winfrey, Sally Jessy Raphael, Arsenio Hall, and Geraldo Rivera began dominating the talk show circuit, ushering in changes in topics, guests, and audience members. The 1980's redefined what was acceptable to discuss on television, expanding talk show topics to include traditionally taboo subjects such as teen pregnancies, alternative lifestyles, and eating disorders. With the 1980's obsession with self-improvement, talk shows adopted the term "infotainment" and offered information about health, news events, and trends affecting Americans. Talk shows began to include ordinary people as guests, not just celebrities or experts, to discuss issues that were relevant to viewers' lives. In doing so, they elevated the average person to celebrity status, and viewers received televised therapy in the security of their own homes. Hosts acted as surrogates for the audience at home, asking personal questions and supporting their guests as they responded. The queen of therapeutic and informative talk was Winfrey, whose debut show aired in September, 1986. Winfrey comforted guests as they opened up, and she shared her own stories of abuse and neglect, relationships, and weight problems, frequently crying with her audience. Although Donahue had discussed controversial topics since the 1970's, his approach was more intellectual than emotional.
Industry and Technological Changes Technological advances in hardware and syndication distribution methods, along with changes within the broadcast and cable industries, also contributed to the talk show revolution of the 1980's. Cable television and its new networks offered viewers more entertainment choices, and videocassette recorders (VCRs) and remote controls offered viewers more ways to pick and choose their entertainment. These advancements created new challenges for producers to overcome in order to retain their sponsors. Cable and satellite networks, along with the emerging television networks, found themselves with numerous hours to program. Talk shows provided needed "filler" programming at a low cost, since they required no writers or actors and minimal sets. Also, the prevalence of syndication and the ease of distributing programming over satellites made it affordable for small stations to receive new talk shows, free from network constraints. During the 1980's, Americans had more consumer choices because of deregulation policies, more disposable income as a result of a soaring stock market and a sense of prosperity, and more entertainment choices because of technological advances and broadcasting distribution methods. Viewers' ability to "channel surf" (and their diminishing attention spans) meant that producers had seconds to capture and retain viewers. In order to do this, talk shows transformed into quick, over-the-top programming based on attention-grabbing sound bites.
Impact During the fast-paced decade, Americans turned to talk shows for both emotional security and entertaining escape. Shows became ethnically diverse to meet societal expectations and to compete in the changing media landscape, and they became more prevalent through syndication advancements and an abundance of cable and network channels.
Further Reading
Day, Nancy. Sensational TV: Trash or Journalism? Springfield, N.J.: Enslow, 1996. Explores the motives and practices of television talk and news shows.
Grindstaff, Laura. The Money Shot: Trash, Class, and the Making of Talk Shows. Chicago: University of Chicago Press, 2002. Examines ambush and emotional tactics employed by talk shows.
Kurtz, Howard. Hot Air: All Talk All the Time. New York: Basic Books, 1997. An in-depth look at the biggest names in talk shows.
Manga, Julie Engel. Talking Trash: The Cultural Politics of Daytime TV Talk Shows. New York: New York University Press, 2003. Investigates sociological factors contributing to talk shows' popularity.
Parish, James Robert. Let's Talk: America's Favorite Talk Show Hosts. Las Vegas: Pioneer Books, 1993. Includes biographies of television talk show hosts.
Scott, Gini Graham. Can We Talk? The Power and Influence of Talk Shows. New York: Insight Books, 1996. A history of the rise in popularity of talk shows.
Shattuc, Jane M. The Talking Cure: TV Talk Shows and Women. New York: Routledge, 1997. Analysis of the interaction between women's issues and talk shows.
Sara Vidar
See also Cable television; Demographics of Canada; Demographics of the United States; FOX network; Journalism; Pauley, Jane; Rivera, Geraldo; Soap operas; Tabloid television; Television; Winfrey, Oprah.
■ Talking Heads
Identification American New Wave rock band
Date 1974-1991
A rock band that combined an art-school sensibility with pop and punk influences, Talking Heads profoundly influenced the sound of popular music in the 1980's.
Talking Heads was formed in the mid-1970's at the Rhode Island School of Design (RISD) in Providence by classmates Tina Weymouth (bass and vocals), David Byrne (guitar and vocals), and Chris Frantz (drums). The band released its debut album, Talking Heads: 77, in 1977, a pivotal year for both punk and New Wave music. By then, it had added Jerry Harrison as a fourth member, playing guitar and keyboards. By the time Talking Heads' fourth album, Remain in Light (1980), came out, the band had become a fixture on the New York City punk scene, playing frequent gigs at the important music clubs CBGB and the Mudd Club. The group had its first top ten hit with the single "Burning Down the House," from 1983's album Speaking in Tongues. Talking Heads took advantage of the new music video format to impress their unique style on the new MTV generation, producing striking videos for both "Burning Down the House" and an earlier single, "Once in a Lifetime." Both became popular standards in the MTV rotation. Talking Heads followed up their first top ten single with a major U.S. tour. The film director Jonathan Demme documented the tour in what became the 1984 concert film Stop Making Sense. A live album of the same name followed. Little Creatures came out in 1985, and Byrne directed a musical comedy in 1986 called True Stories. The accompanying album and a follow-up in 1988, Naked, were the last original studio albums recorded by the band. Issues of control and increasing interest in pursuing separate projects made it difficult for the band members to work together. After a long hiatus, they announced in 1991 that Talking Heads had officially broken up.
Talking Heads in 1983. From left: David Byrne, Jerry Harrison, Tina Weymouth, and Chris Frantz. (Deborah Feingold/Archive Photos)
Impact Talking Heads drew from a vast pool of sources that grew and altered from one album to the next. During the 1980's, they incorporated into their music African, Caribbean, and South American rhythms, as well as funk and abstract sounds more often associated with avant-garde composers such as Philip Glass. In the mid-1980's, they introduced both electronic and hip-hop elements into their songs, confirming and transforming the influence of those genres on popular music. Although the group was closely associated with other New York City bands of the time, such as the Ramones and Blondie, it brought a distinctive and revolutionary type of music to the popular music scene, becoming one of the most influential bands of the 1980's.
Further Reading
Bowman, David. This Must Be the Place: The Adventures of Talking Heads in the Twentieth Century. New York: HarperEntertainment, 2001.
Gittins, Ian. Talking Heads: Once in a Lifetime—The Stories Behind Every Song. Milwaukee: Hal Leonard, 2004.
McNeil, Legs, and Gillian McCain, eds. Please Kill Me: The Uncensored Oral History of Punk. New York: Penguin Books, 1997.
Lacy Schutz
See also
Blondie; Boy George and Culture Club; Glass, Philip; MTV; Music; Music videos; New Wave music; Performance art; Pop music; Synthesizers; World music.
■ Tamper-proof packaging Definition
Product packaging designed to frustrate attempts to alter its contents
After the Tylenol murders, consumer fears about the safety of over-the-counter medications were partly allayed by the development of protective packaging.
Prior to the 1980's, medicines such as aspirin were routinely placed on open shelves with nothing more than screw-on caps and a wad of cotton inside. Some medicines might be sold in boxes, but their flaps generally slid in and out with ease. The first kind of protective packaging used by pharmaceutical companies, the familiar "push down and turn" lid, came only after wide publicity focused on the accidental poisonings of small children. However, child-proof caps were primarily intended to protect the innocent from their own curiosity, and they often frustrated adults seeking to take their medications. As a result, packages without child-proof lids were made available for the elderly and other people living in households without small children.
Tylenol Murders This casual approach to pharmaceutical safety changed after the 1982 Tylenol murders in Chicago. The murders made plain in the most shocking way possible the vulnerability of over-the-counter medicines to being made vehicles for poisoners' malice. Not only were all Tylenol products swept from store shelves while investigators tried to determine how extensive the contamination was, but also many other non-prescription drugs were withdrawn because of fears of copycat criminals seeking further media attention at the expense of human life. Immediately after the Tylenol killings, there was real concern that the murders might end the sale of over-the-counter medications, making it necessary to reinstitute policies requiring pharmacists to
keep even non-prescription medications behind their counters. However, consumer advocates called upon pharmaceutical corporations to cooperate with the Food and Drug Administration to develop alternative means of protecting the public from future tainting of common drugs. The solution they developed was to create barriers that would make it difficult or impossible for future poisoners to imitate the Tylenol murders. The most obvious security solution was to seal the flaps of boxes containing retail drugs. In order to open the sealed boxes, it became necessary to tear the flap loose, leaving visible damage. However, there was concern that a sufficiently skilled poisoner might develop a means to loosen such flaps without leaving tell-tale damage. As a result, it was decided to add additional layers of protection, such as a plastic sleeve around the neck of a bottle or a paper seal under the lid that closed off the bottle's neck altogether. In addition, it was noted that the Extra-Strength Tylenol capsules that had been the primary vehicle for the Tylenol murders had been adulterated by carefully pulling apart the two halves of each capsule, dripping in a droplet of cyanide, and reassembling the capsule. As a result, it was decided that rigid two-piece capsules would henceforth be restricted to prescription medicines, which were considered less vulnerable to tampering because access to them was restricted. All over-the-counter medicines that could not be administered in tablet form would henceforth use a softer gelatin-based capsule that would crush or disintegrate if disassembled.
Impact The development of tamper-proof packaging enabled the public to regain confidence in the safety of over-the-counter medicines. Within a matter of months, the sale of Tylenol products returned to levels comparable to those prior to the Tylenol murders. However, security came at a price. Protective packaging was often difficult and frustrating for legitimate users to remove, particularly if they were elderly or disabled. Furthermore, all the additional packaging had to be discarded, adding to the burden on the nation's landfills. Still, it was generally agreed that these downsides were an acceptable price to pay for safety. More subtly, the growing ubiquity of tamper-proof packaging marked a loss of innocence by the American people. Every protective sleeve and inner lid that had to be peeled off a bottle of
over-the-counter medicine was a reminder that there were people out there who could not be trusted—who, if given a chance, would be happy to do others harm through devious means. Although the immediate fear of product tampering soon receded, the sense that the world was a fundamentally dangerous place became a fixture of American culture. In time, the use of tamper-proof and tamper-evident packaging moved beyond the pharmaceutical industry to include many other products that were susceptible to malicious tampering or even inadvertent contamination by careless shoppers. Many prepared foods—particularly baby foods, which were seen as especially vulnerable because of the helplessness and innocence of their intended consumers—soon received various forms of protective packaging. Cosmetics formed another group of consumer products that began to be enclosed in various forms of shrink-wrap or protective bands, rather than simply being marketed in loose-lidded boxes, as it became obvious that careless "sampling" of such products could pass infections. By the end of the 1980's, it had become unthinkable to purchase many of these items from open shelves if their protective packaging was not firmly in place.
Further Reading
Dean, D. A. Pharmaceutical Packaging Technology. Oxford, England: Taylor and Francis, 2000. Detailed explanation of how tamper-proof and tamper-evident packaging is produced. Somewhat technical, but a good source of in-depth information.
Jenkins, Philip. Decade of Nightmares: The End of the Sixties and the Making of Eighties America. New York: Oxford University Press, 2006. Helps place the development of tamper-proof packaging within a larger cultural context that goes beyond the obvious impetus of product tampering.
Useem, Michael. The Leadership Moment: Nine True Stories of Triumph and Disaster and Their Lessons for Us All. New York: Three Rivers Press, 1998. Includes among its nine case studies the Tylenol murders and how Johnson & Johnson restored the reputation of its brand by aggressively promoting tamper-proof packaging to prevent similar incidents.
Leigh Husband Kimmel
See also
Business and the economy in the United States; Medicine; Tylenol murders.
■ Tanner ’88 Identification
Cable television political satire series
Date Aired from February 15 to August 22, 1988
Tanner '88 accelerated the blurring of presidential campaigns with television entertainment, as it highlighted the performance aspects of political life.
Although the practice of fake documentary is as old as filmmaking itself, with Tanner '88, two of America's best satirists, Garry Trudeau, the creator of the comic strip Doonesbury, and Robert Altman, the director of M*A*S*H (1970) and Nashville (1975), joined forces with Home Box Office (HBO) to produce a truly original project. In the eleven-part series, an imaginary candidate, Jack Tanner (played by Michael Murphy), ran for the Democratic nomination for president of the United States. Tanner and his fictional staff and family appeared in narratives scripted by Trudeau set in real political environments, from New Hampshire to the floor of the Democratic National Convention in Atlanta. Along the primary route, actual candidates—Pat Robertson, Bob Dole, and Gary Hart—interacted briefly with Tanner, whom they may or may not have recognized. In one lengthy conversation, Governor Bruce Babbitt, who had dropped out of the running, counseled Tanner to oppose the "silver screen of unreality" and "take a risk," advice both ludicrous and heartfelt. Ironies multiplied as the show progressed, as Tanner—whose campaign slogan was "for real"—struggled with his pragmatic staff and idealistic daughter to find his voice in the artificial world of campaign politics. This world was populated by pretentious ad makers, confused pollsters, vacuous speech coaches, and gossip-hungry journalists. The most jarring sequence in the series occurred when Tanner visited an actual inner-city meeting of Detroit parents whose children had been murdered; their expressions of authentic grief and frustration momentarily cut through the satire. Mid-campaign, HBO reran the six previously aired episodes of Tanner '88 in one block, introduced by the real television journalist Linda Ellerbee. Viewers were urged to "choose from a group of presidential candidates, one or more of whom is not a real person." Tanner won the straw poll, receiving 38 percent of the approximately forty-one thousand votes cast, followed by George H. W. Bush (with 22 percent), Jesse
Jackson (with 21 percent), and Michael Dukakis (with 19 percent).
Impact Tanner '88 influenced the growth of the "mockumentary" style, solidified HBO's reputation for innovative productions, and extended the reputations of Robert Altman (who won an Emmy for his direction) and Garry Trudeau as brilliant critics of American culture.
Further Reading
Goff, Michael J. The Money Primary: The New Politics of the Early Presidential Nomination Process. Lanham, Md.: Rowman & Littlefield, 2004.
Juhasz, Alexandra, and Jesse Lerner, eds. F Is for Phony: Fake Documentary and Truth's Undoing. Minneapolis: University of Minnesota Press, 2006.
Keyssar, Helene. Robert Altman's America. New York: Oxford University Press, 1991.
Trudeau, G. B. Flashbacks: Twenty-Five Years of "Doonesbury." Kansas City: Andrews and McMeel, 1995.
Carolyn Anderson
See also Atwater, Lee; Bush, George H. W.; Cable television; Comic strips; Dukakis, Michael; Elections in the United States, 1988; Hart, Gary; Jackson, Jesse; Liberalism in U.S. politics; Television.
■ Tax Reform Act of 1986 Identification U.S. federal legislation Date Became law on October 22, 1986
The Tax Reform Act of 1986 made major changes in how income was taxed in the United States by simplifying the tax code, reducing the top marginal income tax rate, and eliminating many tax shelters and other preferences. Though it was officially deemed revenue neutral because it did not increase overall tax levels, the Tax Reform Act of 1986 significantly altered the distribution of federal taxes. The top tax rate was lowered from 50 percent to 28 percent, and the bottom rate was raised from 11 percent to 15 percent, the only time in history that the top rate was reduced and the bottom rate was simultaneously increased. Other reforms of the act included reducing the capital gains tax to the same tax rate as that for ordinary income and increasing incentives favoring investment in owner-occupied housing relative to rental housing
by increasing the home mortgage interest deduction. Because the measure was seen as revenue neutral, the act passed by a large bipartisan majority in Congress. The bill originated in a Democratic tax reform proposal first advanced in August, 1982, by Senator Bill Bradley and Representative Dick Gephardt, as well as in President Ronald Reagan’s call for tax reform in his January, 1984, state of the union address. As enacted, the legislation cut individual tax rates more than had originally been anticipated, but it cut corporate taxes less than originally proposed. The law shifted tax liability from individuals to corporations, reversing a long trend of corporate taxes supplying a decreasing share of federal revenues. Enactment of the measure was accomplished through the perseverance of its chief backers in Congress over the objections of many special interests that would lose the favored status they enjoyed under the current tax code. Although the law was originally envisioned as a way of eliminating all tax loopholes, the tax reform debate almost immediately focused on which tax loopholes would be preserved or added under the new law. Despite the nation being mired in a period of large budget deficits, Reagan refused to support any tax increases, and as a result, the bill neither raised nor reduced total federal tax collections over a five-year period after its enactment. Ultimately, that principle allowed the bill’s adherents to turn back costly amendments to restore tax breaks, because the sponsors of those amendments were not able to produce offsetting revenues. Impact The Tax Reform Act of 1986 was a considerable change from the previous tax code. The fact that Congress passed serious tax reform at all was remarkable, considering all the obstacles it faced. Of all post-World War II-era domestic goals, tax reform was among the most politically difficult to bring about. After the 1986 tax reforms were enacted, however, Congress would go on to make at least fourteen thousand further changes to the tax code, very few of which could be considered reform. Many of the loopholes and exceptions that were excised by the Tax Reform Act of 1986 were later essentially restored. Further Reading
Further Reading
Birnbaum, Jeffrey, and Alan Murray. Showdown at Gucci Gulch. New York: Random House, 1987.
Fisher, Patrick. Congressional Budgeting: A Representational Perspective. Lanham, Md.: University Press of America, 2005.
Peters, B. Guy. The Politics of Taxation. Cambridge, England: Blackwell, 1991.
Patrick Fisher
See also Congress, U.S.; Economic Recovery Tax Act of 1981; Reagan, Ronald.
■ Taylor, Lawrence Identification NFL Hall of Fame linebacker Born February 4, 1959; Williamsburg, Virginia
Taylor, better known as LT, was perhaps the NFL’s best defensive player during the 1980’s. Lawrence Taylor played for only one team, the New York Giants, during his career in the National Football League (NFL). He was an outside linebacker
and was best known for sacking quarterbacks. Taylor recorded a total of 132.5 sacks, and at the time of his retirement, only one other player had more. One of Taylor's sacks is best remembered because it ended the career of an opposing player. During a Monday Night Football game in 1985, Taylor raced past a Washington Redskins offensive lineman, turned, and jumped in an effort to tackle quarterback Joe Theismann. Once he made the tackle, he immediately got up and frantically waved to the Redskins' bench, indicating that Theismann was hurt. The quarterback's right leg was badly broken, and he never played again. The image of Taylor—his intensity known to everyone—desperate to get help for an injured player was a lasting one for those who remembered the play. Taylor's successes on the field were sadly matched by terrible mistakes off it. He later admitted that by his second year in the NFL he was already addicted to cocaine.
New York Giants Lawrence Taylor, right, and George Martin smash into Tampa Bay Buccaneer Steve DeBerg, forcing him to fumble, during a Giants home game in September, 1984. (AP/Wide World Photos)
In 1988, he was suspended for thirty days by the league after his second positive drug test. Many more problems off the field would dog him well into the 1990's. Ironically, it was also in 1988 that Taylor played what many observers called his most memorable game. Taylor took the field against the New Orleans Saints even though he had a torn pectoral muscle. The injury was severe enough that he needed a harness to keep his shoulder in place. He nevertheless recorded multiple sacks and tackles, leading the Giants to an important win. Impact Experts have argued that it was Taylor who made all NFL teams realize how critical it was to have a powerful outside linebacker capable of disrupting the opposition's offense. Taylor was named the NFL Defensive Player of the Year three times (including following his rookie season), he was an All-Pro nine times, and he earned ten Pro Bowl berths. He was also selected as the league's most valuable player following the 1986 season, when he led the Giants to their first Super Bowl championship, which was also the first Super Bowl loss for Denver Broncos quarterback John Elway.
Taylor, Lawrence, and David Falkner. LT: Living on the Edge. New York: Random House, 1987. Taylor, Lawrence, and Steve Serby. LT: Over the Edge—Tackling Quarterbacks, Drugs, and a World Beyond Football. New York: HarperCollins, 2004. Anthony Moretti See also
Elway, John; Football; Sports.
■ Teen films Definition
Popular films portraying the lives and struggles of suburban American teenagers
Teen films ruled the box office in the 1980's, portraying characters and situations to which American teenagers could easily relate and launching the careers of several stars.
The teen film genre can be traced to the James Dean films (notably Rebel Without a Cause, 1955) and the beach films of the 1950's and 1960's. The 1980's proved to be the golden age of the teen film. Movies like Sixteen Candles (1984) and The Breakfast Club (1985) demonstrated that teen films could be fun, meaningful, and profitable, dealing with issues that teenagers and
young adults found most important. Most teen films are set in high schools or deal with characters who are of high school age. Films such as Fast Times at Ridgemont High (1982) combined humor with serious coming-of-age topics such as popularity, sex, dating, and abortion. Often, teen film plots focus on the so-called nerd lusting after a dream girl, as in Weird Science (1985), in which two teen boys successfully create their "perfect woman" through a freak computer accident. Although Weird Science was purely comedic, movies such as Pretty in Pink (1986) dealt with drama and struck a chord with teen girls. In this film, the lead character, played by Molly Ringwald, goes from being an overlooked, average girl to winning the heart of the most popular boy in school.
The Brat Pack and John Hughes In the 1980's, writer and director John Hughes quickly became known as the king of teen films. He wrote and/or directed some of the most popular films of the time, including Sixteen Candles, Pretty in Pink, The Breakfast Club, Ferris Bueller's Day Off (1986), and Weird Science. Each of his movies centers on middle-class teenagers from the midwestern United States who are trying to find their place in the world. As in other teen films, Hughes's characters often represent teen stereotypes: the nerd, the jock, the popular cheerleader, the troublemaker, and the girl struggling to fit in. Hughes often cast the same group of actors to play these parts, and they became known as the Brat Pack. Molly Ringwald, Anthony Michael Hall, Judd Nelson, Ally Sheedy, Emilio Estevez, Demi Moore, Andrew McCarthy, and Rob Lowe all built great careers in the 1980's through these and similar films.
Impact Teen films were designed to tap into the psyche of young people by using a mix of comedy and drama that dealt with issues such as sex, drugs, high school, relationships, and the pressure to live up to society's standards. Although the demand for the types of teen films made in the 1980's tapered off in favor of more lighthearted fantasy films of the 1990's and 2000's (for example, The Princess Diaries, 2001; What a Girl Wants, 2003), the teen film genre left a lasting impression on Hollywood. While some stars of the 1980's had a hard time shaking their teen film pasts and struggled to find roles later in their careers, films like Risky Business (1983) launched the career of superstar Tom Cruise, About Last Night . . . (1986) catapulted Demi Moore to stardom, Ferris Bueller's Day Off cemented Matthew Broderick's career, and former teen geek Anthony Michael Hall found great success on The Dead Zone television series, which began airing in 2002. The fact that 1980's teen films such as The Breakfast Club remained popular in the early twenty-first century shows that the issues presented in these films are perennially pertinent.
Selected 1980's Teen Films
Year | Title | Director | Young Actors
1980 | The Hollywood Knights | Floyd Mutrux | Robert Wuhl, Tony Danza, Fran Drescher, Michelle Pfeiffer
1982 | Fast Times at Ridgemont High | Amy Heckerling | Sean Penn, Jennifer Jason Leigh, Judge Reinhold, Phoebe Cates
1982 | Zapped! | Robert J. Rosenthal | Scott Baio, Willie Aames
1983 | Risky Business | Paul Brickman | Tom Cruise
1983 | Class | Lewis John Carlino | Rob Lowe, Andrew McCarthy, John Cusack, Alan Ruck
1983 | The Outsiders | Francis Ford Coppola | Matt Dillon, Ralph Macchio, C. Thomas Howell, Patrick Swayze, Rob Lowe, Emilio Estevez, Tom Cruise
1984 | Sixteen Candles | John Hughes | Molly Ringwald, Justin Henry, Anthony Michael Hall, John Cusack
1984 | Making the Grade | Dorian Walker | Judd Nelson, Dana Olson
1985 | The Breakfast Club | John Hughes | Emilio Estevez, Anthony Michael Hall, Judd Nelson, Molly Ringwald, Ally Sheedy
1985 | Weird Science | John Hughes | Anthony Michael Hall, Ilan Mitchell-Smith, Robert Downey, Jr.
1985 | Girls Just Want to Have Fun | Alan Metter | Sarah Jessica Parker, Helen Hunt, Jonathan Silverman, Shannen Doherty
1985 | St. Elmo's Fire | Joel Schumacher | Emilio Estevez, Rob Lowe, Andrew McCarthy, Demi Moore, Judd Nelson, Ally Sheedy, Mare Winningham
1985 | Private Resort | George Bowers | Rob Morrow, Johnny Depp
1985 | Teen Wolf | Rod Daniel | Michael J. Fox
1985 | Real Genius | Martha Coolidge | Val Kilmer, Gabriel Jarret
1986 | Pretty in Pink | Howard Deutch | Molly Ringwald, Jon Cryer, James Spader, Andrew McCarthy
1986 | Ferris Bueller's Day Off | John Hughes | Matthew Broderick, Alan Ruck, Jennifer Grey, Charlie Sheen
1986 | Lucas | David Seltzer | Corey Haim, Charlie Sheen, Winona Ryder, Courtney Thorne-Smith
1987 | Some Kind of Wonderful | Howard Deutch | Eric Stoltz, Mary Stuart Masterson, Lea Thompson
1987 | Can't Buy Me Love | Steve Rash | Patrick Dempsey, Amanda Peterson, Seth Green
1987 | Adventures in Babysitting | Chris Columbus | Elisabeth Shue, Anthony Rapp
1987 | Square Dance | Daniel Petrie | Winona Ryder, Rob Lowe
1987 | The Lost Boys | Joel Schumacher | Jason Patric, Corey Haim, Kiefer Sutherland, Jami Gertz, Corey Feldman
1988 | A Night in the Life of Jimmy Reardon | William Richert | River Phoenix, Ione Skye, Matthew Perry
1989 | Say Anything | Cameron Crowe | John Cusack, Ione Skye, Lili Taylor
Further Reading
Bernstein, Jonathan. Pretty in Pink: The Golden Age of Teenage Movies. New York: St. Martin's Griffin, 1997. A filmography for fans of the 1980's teen films, the book provides funny and lighthearted facts about some of the decade's most popular films.
Clark, Jaime, ed. Don't You Forget About Me: Contemporary Writers on the Films of John Hughes. New York: Simon Spotlight Entertainment, 2007. Offers a variety of perspectives on Hughes's teen films. Foreword by Ally Sheedy.
Deziel, Shanda. "The Man Who Understood Teenagers." Maclean's 119, no. 45 (November, 2006): 7. Discusses a documentary about the impact of teen films in the 1980's. Addresses the importance of teen films, especially those directed by Hughes.
Neale, Steve. "Major Genres." In Genre and Hollywood. New York: Routledge, 2000. The text is helpful to the study of Hollywood films and genre theory. The chapter titled "Major Genres" addresses teen films and their impact.
Prince, Stephen. History of the American Cinema: A New Pot of Gold—Hollywood Under the Electronic Rainbow, 1980-1989. New York: Charles Scribner's Sons, 2000. The tenth volume in a set of books dedicated to laying out the history of American film one decade at a time. This volume is a great resource for the teen films of the 1980's.
Jennifer L. Titanski
See also
Brat Pack in acting; Breakfast Club, The; Fads; Fashions and clothing; Fast Times at Ridgemont High; Film in the United States; Hughes, John; MTV; New Wave music; PG-13 rating; Pop music; Preppies; Slang and slogans.
■ Teen singers Definition
Young singers whose music is marketed to teen or preteen audiences
While Music Television provided exposure for teen artists such as New Kids on the Block and Debbie Gibson, another method of teen music marketing—mall tours—helped propel singers such as Tiffany to the top of the charts. The success of these teen artists renewed the record industry’s interest in teen acts and led to the boy band craze of the 1990’s. The 1970’s had seen television give teen singers such as David and Shaun Cassidy a ready-made audience. With the advent of MTV in 1981, the market expanded to include teenage singers and groups. Additionally, the singer Tiffany cultivated her success through reaching out to teens where they congregated—the mall. New Edition and New Kids on the Block Bostonbased songwriter and producer Maurice Starr formed the boy band New Edition in 1980. Made of up five black teenage boys, the group had top ten hits with “Cool It Now” and “Candy Girl.” Starr noted that he modeled the group after the Jackson 5. Eventually, New Edition fired Starr, and in 1984 he created a new band, New Kids on the Block, which he decided to model after the family pop group the Osmonds, but with “soul and good material—good black material.” New Kids on the Block dominated the teenybopper market in the late 1980’s, with songs written by Starr and music videos that made use of synchronized dance moves, which became part of the standard template for boy bands of the 1990’s. Teen Queens In 1987, two young teenage girls, each with a different hook, conquered the pop market. Long Island native Debbie Gibson began writing songs as a child and began producing them shortly thereafter. At the age of sixteen, she was signed to Atlantic Records and released her first single, “Only in My Dreams,” which made the top ten. She followed this up with two number one singles, “Foolish Beat” and “Lost in Your Eyes” (the latter from her second album). Unlike the other teen artists mentioned, Gibson wrote all her songs, and her record label made the effort not to promote her through teen magazines. Gibson’s manager, Doug Breitbart, told the Los Angeles Times, “It’s a major stigma, and I hadn’t
The Eighties in America
spent four years working with Debbie . . . to get discarded as a teen act." Tiffany (full name Tiffany Renee Darwish) had been singing in public since she was nine years old. In 1987, she recorded her first album, which initially was not successful. Tiffany's manager, George Tobin, recounted that he was told by industry executives that "teen stars went out with Donny and Marie [Osmond]." Tobin and the record label eventually hit on the idea of a mall tour to sell her to her targeted audience. At each mall stop, she would play several shows. The tour was successful, and by 1988 Tiffany had two number one hits. New Kids on the Block would also do the mall tour circuit early in their career.
Teen singing group New Kids on the Block at the American Music Awards on January 22, 1990. (AP/Wide World Photos)
Following the success of Gibson and Tiffany, record labels began releasing songs by more teen singers, including Glenn Medeiros, Shanice Wilson, and Tracie Spencer. As with Tiffany and New Kids on the Block, these artists were promoted through MTV and especially teen magazines. A number of these singers became "one-hit wonders," known for only one successful hit. Another one-hit wonder teen singer who did not fit either the pop singer or Maurice Starr boy band mold was Charlie Sexton, the rock guitarist who had a top twenty hit at age seventeen with "Beat's So Lonely" in 1985. While he remained a successful musician, the 1985 single was his only hit on the pop charts.
Impact Teen singers in the 1980's had a new medium that teen idols of the past did not—music videos. The success of bands such as New Edition and New Kids on the Block was a precursor to the boy band phenomenon of the 1990's, and the success of Tiffany's mall tours is a reminder of the importance of marketing in music for teenagers.
Further Reading
Cooper, Kim, and David Smay, eds. Bubblegum Music Is the Naked Truth. Los Angeles: Feral House, 2001. Though focused on the "tween" music of the late 1960's and 1970's, the book does include a helpful section on Starr's two bands of the 1980's, New Edition and New Kids on the Block, as well as many photos of great 1980's merchandise.
Grein, Paul. "Teen-Agers Making Their Voices Heard: Tiffany, 16, Is Not 'Alone Now' on Pop Scene as Recording Industry Capitalizes on Young Artists." Los Angeles Times, December 1, 1987, p. 1. Excellent article that concisely summarizes the paths Tiffany and Gibson took to their careers and includes the prediction that, because many baby boomers were having kids later, the teen artist explosion would actually occur in the 1990's.
Hunt, Dennis. "Stardom's Not Only in Her Dreams." Los Angeles Times, August 23, 1987, p. 88. Provides background on Gibson's early songwriting and signing to Atlantic Records.
_______. "Young, Gifted, and Sounding Black: New Kids on the Block Are the Osmonds with Soul, Sings Their Creator." Los Angeles Times, June 4, 1989, p. 8. Discusses the creation of New Kids on the Block and how Starr modeled the boy band on his earlier group, New Edition.
Julie Elliott
See also MTV; Music videos; Pop music; Women in rock music.
■ Televangelism Definition
The use of television as a medium to communicate Christianity
Scandals caused the rapid downfall of some of the most famous televangelists in the 1980's. Many Americans saw these events as confirmation of their suspicions that televangelists were corrupt and out of touch with mainstream religious views.
The technology of television, which became a popular form of news and entertainment in American homes in the 1950's, had long been ignored by religious groups. While some saw television as a symbol of modernism and secularism, most simply did not develop the tools and technology necessary to transfer their message to television until the late 1970's and early 1980's. Partly because of a new responsiveness by religious groups to the potential power of television and partly because of a broader engagement by many Christians in the realm of American popular culture, many enthusiastic preachers moved their ministries to the television airwaves during the 1980's. Still others launched their careers with local, regional, and even national programming plans that grew into vast media communications networks and broadcast empires.
The Rise of Televangelism American Christianity has a long history of utilizing the most efficient and effective communications strategies of each generation in order to carry out the "Great Commission," Jesus Christ's instruction to his disciples to spread the gospel teachings throughout the world. Evangelist Aimee Semple McPherson, founder of the International Church of the Foursquare Gospel, was criticized in the early twentieth century for her use of the relatively new medium of radio as a tool for "winning souls for Christ." In subsequent years, however, ministers embraced not only radio but also television as a viable method of communicating their message. One of the earliest to do this was Pat Robertson, who founded the Christian Broadcasting Network (CBN) in 1961. Its flagship show, The 700 Club, began broadcasts in the mid-1960's. Part news reporting and part talk show, The 700 Club continued to broadcast into the early twenty-first century, featuring popular hosts and guests from a variety of backgrounds. Though Robertson became known for his controversial and outspoken views, he remained a
strong voice for followers of Judeo-Christian traditional values. Another controversial televangelist was Oral Roberts, the son of a midwestern preacher and an evangelist and faith healer in the charismatic tradition. Roberts began his ministry in the 1950's and launched a series of television specials during the 1970's. His Oral Roberts Ministries continued to broadcast into the early twenty-first century. In the 1980's, however, Roberts became the target of jokes and criticism when he claimed to have had visions of God about raising money for his ministries, including the City of Faith Medical and Research Center, which was open from 1981 to 1989. Like Pat Robertson, Jerry Falwell became well known for his television ministries in the 1980's. Though he had established the Thomas Road Baptist Church in Lynchburg, Virginia, and the Old Time Gospel Hour radio program in 1956, he became a mobilizing force when his Moral Majority coalition (founded in 1979) endorsed Ronald Reagan in his 1980 election bid.
The Decline of Televangelism
Roberts and Falwell spent the 1980's building ministries that continued to play a role in national religious and political life in later decades. However, in many ways, their careers—at least during the 1980's—were eclipsed by two other figures. In 1987, Assemblies of God minister Jim Bakker resigned from his position as head of the PTL television network and as host of its popular television show of the same name. Bakker's resignation came in the wake of scandals involving a sexual encounter with a church secretary named Jessica Hahn and allegations of massive fraud. During their heyday, Jim and his wife, Tammy Faye, lived an excessive lifestyle, even while asking viewers to maximize their financial contributions to their ministry. After serving five years in prison on fraud and conspiracy charges, Bakker continued to minister, though on a much smaller scale. The year after Bakker's resignation, Jimmy Swaggart, another Assemblies of God preacher, was forced to resign as head of Jimmy Swaggart Ministries (then worth more than $100 million) when allegations surfaced that he had met with prostitutes. Swaggart became another target of critics and skeptics, in part because of his tearful videotaped apology to his followers.
The Eighties in America Impact The rising stars of American televangelism during the late 1970’s and 1980’s helped lay the groundwork for the development of the Christian media empires that continued to thrive in the early twenty-first century. The Trinity Broadcasting Network, which calls itself the “world’s largest Christian television network,” began in 1973 but expanded its reach during the 1980’s, thanks in part to early work by Jim and Tammy Faye Bakker, whose PTL Club actually debuted there. Pioneers in the field contributed to both the legitimization of television as a potential medium for spreading Christianity and the increase in skepticism from those who questioned televangelists and their motives as well as from fellow evangelists and other Christian leaders who believed that the industry lent itself easily to corruption and greed, even in the hands of otherwise good people. Further Reading
Jorstad, Erling. The New Christian Right, 1981-1988: Prospects for the Post-Reagan Decade. Studies in American Religion 25. Lewiston, N.Y.: Edwin Mellen Press, 1987. Investigates the social and cultural influences behind the rise and fall of the Religious Right. Schmidt, Rosemarie, and Joseph F. Kess. Television Advertising and Televangelism: Discourse Analysis of Persuasive Language. Philadelphia: J. Benjamins, 1986. As the title suggests, this book analyzes televangelism from a sociolinguistic standpoint. Schultze, Quentin J. Televangelism and American Culture: The Business of Popular Religion. Grand Rapids, Mich.: Baker Book House, 1991. Analyzes how televangelist ministries have been corrupted by power and wealth. Jennifer Heller See also
Bakker, Jim and Tammy Faye; Falwell, Jerry; Heritage USA; Moral Majority; Religion and spirituality in the United States; Robertson, Pat; Swaggart, Jimmy; Television.
■ Television Definition
Programs and series, both fictional and nonfictional, produced for or broadcast on U.S. television
A transformation in television took place during the 1980’s as a result of the advent of home video recording devices and
the deregulation of the industry during the Reagan administration. Cable and satellite systems soon represented the demise of free television. Three major networks—the American Broadcasting Company (ABC), the Columbia Broadcasting System (CBS), and the National Broadcasting Company (NBC)—controlled the programming offered on television from 1940 to 1980. With the introduction of cable, satellite delivery systems, and home video, executives began to target shows that captured smaller niche audiences, a practice often referred to as narrowcasting. The television industry became a global business headed by mass conglomerates, and by the end of the decade pressures to streamline expenses led to fewer programming options. Programmers depended on the stability of the economy, and because cable was based on subscriber services, they had to offer shows that pleased the majority of the viewing public. The problem was that such systems were highly dependent on syndication, old movies, and programs that usually lacked originality and creativity. Cable systems could deliver “superstations” such as WTBS (Atlanta), WGN-TV (Chicago), and WWOR-TV (New York City) that became widely available throughout the United States. The three mainstream networks could no longer maintain a monopoly on what the American public viewed in their living rooms. By 1986, 82 percent of American adults watched an average of seven hours of television per day, while 88 percent of all households had subscribed to pay cable television networks such as Home Box Office (HBO), MTV (Music Television), Nickelodeon, and the Disney Channel. These new channels were not local broadcast stations but satellite-distributed services that local cable companies offered in programming packages. This resulted in pricing arrangements in which subscribers paid more for “premium channels” with no commercial interruptions, such as HBO, while Nickelodeon was supported through advertisers; thus, viewers could get this channel free or for a nominal fee. Deregulation and Mergers The decade saw not only unprecedented technological growth but also changes that were taking place as conglomerates began seeking wider diversification. The Federal Communications Commission (FCC) abided by the antimonopolistic charter clauses that limited the
number of stations that could be owned by a single person. However, when President Ronald Reagan appointed Mark S. Fowler and, later, Dennis R. Patrick as FCC chairmen, the situation drastically changed for the television industry. Both men rejected the idea that broadcasters were trustees using the medium as a way to serve the public good. They believed in Reagan's conservative economic policies that reinforced consumerism and market forces. In their view, the general public should be the ultimate consumer, unfettered by government rules for viewing choices. Thus, deregulation of the communications industry began to take place in the early 1980's. Shareholders now considered television a commodity, a profitable endeavor that could be bought and sold at will. Major corporations began to buy the floundering networks—Capital Cities took over ABC, General Electric took over NBC, and Laurence Tisch's Loews Corporation gained control of CBS. Entrepreneurs flocked to the airwaves as profits soared, and some emerged as media moguls. In 1980, Ted Turner unveiled the Cable News Network (CNN), and he spearheaded the movement with several satellite services (WTBS, CNN, and TNT). Eventually, Turner bought the television rights to the MGM library of motion pictures for rebroadcasting on his channels. Australian-born Rupert Murdoch first bought the Metromedia television stations, which served markets including New York and Washington. Murdoch then acquired Twentieth Century-Fox and formed the FOX network to compete with the three mainstream networks.
New Technologies and Piracy Issues
New technologies—including the videocassette recorder (VCR), video games, and remote control devices—caused a significant revolution in the home recording of programs. Warner-Amex in Columbus, Ohio, introduced interactive two-way television technology with Qube in 1980. Subscribers could respond with a handheld device to answer multiple-choice questions or order merchandise, and the system was even used by local universities to offer classes. However, by 1984, with mounting costs and rising subscriber concerns over privacy issues about what information was being stored in the company's databases, Qube was discontinued. The camcorder, a combined video camera and recorder, enabled people to watch home movies on their television sets. The remote control changed the way viewers watched television, allowing them to flip channels and see snippets of several shows at one time. Videocassette tape replaced bulky reel-to-reel machines; the tape could be reused and recorded over, and soon companies developed handheld video cameras that enabled faster production in the field in television journalism. Consumers could now record their favorite television programs, avoiding network schedules to accommodate their own lifestyles. In 1982, only 4 percent of households owned a VCR, but by 1988, with the costs of production going down, the number rose to 60 percent. Viewers became their own independent programmers, but the downside was that the VCR caused a boom in film piracy by some foreign countries and in violations of copyright. In 1984, in a lawsuit brought by Disney and Universal against Sony, the manufacturer of the Betamax recorder, the U.S. Supreme Court ruled that taping a copyrighted program off the air for one's own personal viewing was not an illegal act. The decision caused the movie and television industry to move into marketing home video releases once a program was in syndication, so that viewers would buy or rent the video rather than taping it directly off the air. By the mid-1980's, HBO began to scramble transmissions to stop nonpaying viewers who had satellite dishes but did not receive regular cable service. In order to receive clear reception, viewers had to pay for the service and get decoders for their television sets. The Cable Communications Policy Act of 1984 also banned the illegal sharing or wiring of cable television and telephone systems.
Programs and Reaganism Mergers, deregulation, and new technologies would have a dramatic effect on the type of shows the American public would watch on their television screens. During this decade, the decline of musical variety shows and Westerns occurred because of a lack of viewer interest. Innovation did not appear as hoped; instead, by the mid-1980's, the cable networks relied on reruns of classic programs such as Bewitched, The Rifleman, Father Knows Best, Lassie, and Dragnet. Prime-time programmers increasingly used segmented scheduling to target specific demographic groups. For example, the first hour of prime time was devoted to family shows such as Head of the Class, The Cosby Show, The Facts of Life, and Growing Pains. There was an attempt to appeal to older Americans in such series as The Golden Girls, in which three self-confident, mature women deal with retirement in Florida, while
Murder, She Wrote starred Angela Lansbury as an amateur sleuth in a small New England town. Many programs featured movie stars from the 1940's and 1950's in guest appearances to generate nostalgic reactions from viewers. Feminist positions also began to have a greater influence on television series such as Cagney and Lacey, since there were more female producers and writers with liberal political positions during the 1980's. Also, women traditionally watched more television than men. On the other hand, programs such as thirtysomething and Moonlighting appealed to the yuppie crowd, the young urban professionals who mixed professional careers with raising families. The yuppies were part of the baby-boom generation that stressed work, especially with women rising into the managerial class. Career choices were meant to be meaningful to one's personal life. Programming was also influenced by the Reagan administration's renewed Cold War rhetoric against the Soviet Union as the “Evil Empire,” and network executives took note. In 1983, ABC broadcast a two-hour movie, The Day After, showing the United States in the aftermath of a nuclear war with the Soviet Union. In 1987, the same network broadcast a fourteen-hour miniseries, Amerika, depicting the United States after an extended period of Soviet occupation. Reagan's conservatism shaped the political landscape of television journalism as panelists with right-wing viewpoints appeared with more frequency on news reports, commentary programs, and talk shows. Television journalism addressing controversial topics in documentaries dwindled, and ratings for these programs declined, so much so that reporters often lamented that television news was becoming an entertainment medium. By the end of the decade, the evening news broadcasts on the three major networks went from an hour to thirty minutes in length. Fundamentalist religious leaders invested in the new video and satellite technologies, and televangelists such as Jimmy Swaggart, Pat Robertson, Jerry Falwell, and Jim and Tammy Faye Bakker emerged. They used the new medium to speak out against various issues that they found too prevalent—abortion, homosexuality, the Equal Rights Amendment, pornography, and sex education in schools. The decade certainly represented the golden age for prime-time soap operas—Dallas, Dynasty, Falcon Crest,
and Knots Landing—that reflected Reagan-era federalism and an emphasis on glamour. Such shows epitomized the Reagan era with its concern over image and the fiscal conservatism found in supply-side economics. Materialism was expressed in lavish home design, decor, and gourmet food culture. The American public could not get enough of the ultrarich Carringtons on Dynasty and the Ewings, the larger-than-life Texans on Dallas, as these families grew in their obsessive need for wealth. Shot on location, Miami Vice was a television show focused on high style, lush cinematography, and couture fashions that revolutionized the look of detective dramas. The show's portrayal of drugs, smuggling activities, and prostitution was set to music with hypnotic synthesizer beats that appealed to the MTV generation. However, the baby-boom generation, despite its materialistic tendencies, remained liberal on social issues. Dramas often displayed a gritty realism in shows such as Hill Street Blues and St. Elsewhere, in which “good” and “bad” were not always easily defined. One of the major themes in many of the crime dramas was that good cops were halted from doing their jobs by inept courts, bureaucratic red tape, and department policies that allowed criminals to be released because of legal technicalities. Justice was not always enforced or easily found; the police had to make deals with informants and bargain with the assistant district attorney. Officers often had to confront their own personal demons with alcoholism, sexual harassment, and racism. Another successful series was L.A. Law, which focused on attorneys and their relationships in a law firm. The action took place outside the courtroom and centered on their personal or professional lives. The series was true to form in depicting the blurred line between justice and the law; for example, plea bargaining may reveal the truth or obstruct justice. Vigilantes or freelance enforcers were found on such popular series as The A-Team, Knight Rider, and The Equalizer, in which modern avengers went after evildoers in covert operations. These individuals defended the weak and innocent against violent criminals. The situation comedy, or sitcom, began a rebirth with The Cosby Show, Cheers, and Family Ties. Television network executives began to look at demographic information that indicated that African American households watched more television than other groups. In 1988, it was found that black viewers
watched television 10.6 hours daily, while other groups watched an average of 7.3 hours per day. The Cosby Show was the first series in which the majority of the creative team and cast were black. The series premiered in 1984 and was a smash success from the very first episode. Both parents were professionals, a doctor and a lawyer, who offered an upbeat portrait of a black family in which the parents respected one another and nurtured their five children. The show often characterized the joys and tribulations of raising a large family and was subtle in its approach to improving race relations. By the late 1980's, irreverent comedies such as Roseanne, Married . . . with Children, and the animated series The Simpsons began to rule the airwaves. These sitcoms maintained a satirical edge toward current issues and social problems in environments where the families were decidedly dysfunctional. Another genre that started to emerge was reality programming, which ran the gamut in regard to content. Some shows genuinely and informatively addressed adult topics and sensitive cultural issues, while other programs reverted to the salacious Hollywood gossip found in the tabloids. Ratings for these programs proved extremely high and profitable to the cable networks. The daytime television talk show, with hosts such as Phil Donahue and Oprah Winfrey, began to replace documentaries. Some hosts, such as Geraldo Rivera and Morton Downey, Jr., used sensationalized scenarios, a style called trash TV, to boost ratings. The producers of these shows wanted to provoke their guests by goading them into argumentative confrontations that sometimes escalated into brawls or fistfights. The negativity that pervaded tabloid talk shows began to affect advertising in commercials, in which companies would malign a rival product using a cleverly disguised name. In presidential political campaigns, ads showed opponents veering away from discussing the issues in favor of personal attacks. Impact In 1979, the average television viewing time was 29 hours per week. As the number of local cable systems began to grow in the early 1980's, new outlets for the television industry took center stage. Pay television altered the viewing habits of the American public. These technological innovations allowed network and cable executives to export American programs abroad as television truly became international in scope throughout the decade. The “Big
Three” networks that had supervised shows for more than forty years were finding it very difficult to compete with the interests of major corporations. The deregulation that occurred during the Reagan years caused critics to take sides. Some argued that television had been democratized, that the public was able to take control and choose its own programming, while others countered that the quality of shows had declined because companies were concerned only with ratings and profit margins. These critics contended that the packaging of shows to audiences led to a rise in syndication rather than the development of creative shows. As a result, quality programming, such as programs geared toward children, news commentaries, documentaries, and fine arts, declined during the decade. Further Reading
Abramson, Albert. The History of Television, 1942 to 2000. Jefferson, N.C.: McFarland, 2003. A retired network television engineer traces the technological innovations in the industry over sixty years. Focuses on the rise of the camcorder and digital prototypes for audio/video during the 1980’s. Barnouw, Erik. Tube of Plenty: The Evolution of American Television. 2d rev. ed. New York: Oxford University Press, 1990. Barnouw, professor emeritus of dramatic arts at Columbia University, addresses the development and impact of the communications revolution in radio and television from 1920 to 1990. Comstock, George. Television in America. 2d ed. Newbury Park, Calif.: Sage Publications, 1991. Analyzes the social, political, and behavioral forces that shaped the programming habits of the American public. Discusses emerging technologies such as cable, satellites, and VCRs. Meant to be a text for courses in communication, journalism, and popular culture. Doyle, Marc. The Future of Television: A Global Overview of Programming, Advertising, Technology, and Growth. Lincolnwood, Ill.: NTC Business Books, 1993. Focuses on the future of television programs, marketing, and policy in an era of mass globalization. Chapter 1 is an overview of the 1980’s. Feuer, Jane. Seeing Through the Eighties: Television and Reaganism. Durham, N.C.: Duke University Press, 1995. Delves into the relationship between politics, television programming, and viewing behav-
ior during the Reagan administration. Lichter, S. Robert, Linda S. Lichter, and Stanley Rothman. Prime Time: How TV Portrays American Culture. Washington, D.C.: Regnery, 1994. A comprehensive study of prime-time entertainment from 1950 to 1990, focusing on how Hollywood depicts changes in American society, with chapters on private lives, crime and punishment, the working class, and controversial issues. MacDonald, J. Fred. One Nation Under Television: The Rise and Decline of Network TV. New York: Pantheon Books, 1990. Historical study of how the major networks rose to power and shaped viewer ratings until the introduction of cable television and home video substantially weakened their power in the 1980's. Montgomery, Kathryn C. Target, Prime Time: Advocacy Groups and the Struggle over Entertainment Television. New York: Oxford University Press, 1989. Montgomery demonstrates how various advocacy groups shaped the messages and values found in prime-time television from the early 1970's to the mid-1980's. Moorfoot, Rex. Television in the Eighties: The Total Equation. London: BBC, 1982. Examines the technological advancements occurring in the television industry in the early 1980's. Gayla Koerting
See also
Brokaw, Tom; Cable television; Cagney and Lacey; Cheers; Children's television; CNN; Cosby Show, The; Craft, Christine; Dallas; Day After, The; Designing Women; Dynasty; Facts of Life, The; Family Ties; Fox, Michael J.; FOX network; General Hospital; Golden Girls, The; Hill Street Blues; Home shopping channels; Infomercials; Jennings, Peter; Journalism; L.A. Law; Letterman, David; Magnum, P.I.; Married . . . with Children; M*A*S*H series finale; Miami Vice; Miniseries; Moonlighting; Murphy, Eddie; Murray, Bill; Network anchors; Pauley, Jane; Rather, Dan; Rivera, Geraldo; St. Elsewhere; Sitcoms; Soap operas; Star Search; Star Trek: The Next Generation; Tabloid television; Talk shows; thirtysomething; Turner, Ted; Williams, Robin; Winfrey, Oprah; Wonder Years, The.
■ Tennis
Definition
Racket sport played in singles or pairs
Tennis experienced tremendous growth in the 1980's because of the influence of television coverage and the most successful players, including Martina Navratilova and John McEnroe. The professionals fueled the amateur popularity of the sport, and there was a significant rise in sales of tennis products.
In 1981, tennis greats (from left) John McEnroe, Björn Borg, and Vitas Gerulaitis participate in Gerulaitis's annual youth tennis clinic in New York City. (AP/Wide World Photos)
The popularity of viewing top professional sports players on television gave a significant boost to the tennis industry in the 1980's. Television networks such as CBS, ESPN, HBO, NBC, and ABC provided year-round coverage of the major professional tournaments, adding to the fame and wealth of champion players. Amateur tennis players were influenced to purchase the tennis rackets, shoes, and clothing that the champions used. In the early 1980's, the tennis racket itself underwent a transformation, as graphite, boron, aluminum, and titanium were introduced to make rackets lighter and stronger than their wood and steel predecessors. The two most popular rackets were the Dunlop Max 200G, introduced in 1980, and the wide-body Wilson Profile, introduced in 1987. During the decade, advances in nutrition and notions about physical conditioning influenced the
sport. Amateur and professional tennis players trained according to these research concepts to maximize their performance. One notable player, Martina Navratilova, literally transformed herself with diet and physical conditioning into the greatest female tennis player of the 1980’s. Players
The four major professional tennis tournaments of the world (known as the grand slam tournaments) include the French, Australian, and U.S. Opens and Wimbledon. The best of the men's players of the early 1980's included Jimmy Connors, Björn Borg, and John McEnroe. Two of the greatest matches in the history of Wimbledon occurred in 1980 and 1981. In 1980, Borg defeated McEnroe in five sets to win Wimbledon, which included a thrilling fourth-set tiebreaker won by McEnroe eighteen points to sixteen. Two months later, McEnroe defeated Borg, again in five sets, to win the U.S. Open. In 1981, the two men met again in the finals at Wimbledon and the U.S. Open, with McEnroe winning both tournaments. The physical skills and psychological will exhibited by both players made these matches memorable, but McEnroe's anger and disrespectful behavior in those tournaments made headlines all over the world as well. McEnroe won Wimbledon in 1981, 1983, and 1984, and the U.S. Open in 1980, 1981, and 1984. Connors won Wimbledon in 1982 and the U.S. Open in 1982 and 1983. Later in the decade, the best of the male professionals included Ivan Lendl, Mats Wilander, Stefan Edberg, and Boris Becker. During the 1980's, Lendl won seven grand slam singles tournaments and finished second ten times. Wilander won seven grand slam singles tournaments and finished second four times. Becker won four grand slam singles titles during the decade, and Edberg won three. Becker was only seventeen years old when he won Wimbledon in 1985. In the women's competition, the 1980's is best known for the number of tournaments won by, and the rivalry between, Navratilova and Chris Evert (Evert-Lloyd). Evert won nine grand slam singles tournaments, was second ten times, and clearly was the public favorite. However, in terms of grand slam victories in singles, doubles, and mixed doubles, Navratilova was the greatest player, male or female, of the 1980's. She won fifteen grand slam singles titles, came in second ten times, and was ranked number one longer than Evert. Pam Shriver and Navratilova won twenty-one grand slam doubles tournaments
during the 1980's, and Navratilova also won four mixed doubles grand slam tournaments. From 1985 to 1987, Navratilova was in the singles final in all eleven of the grand slam tournaments that she entered. A very rare achievement occurred in 1987, when she won the singles, doubles, and mixed doubles, all the available events, at the U.S. Open. Away from the tennis court, she received a great deal of media attention for her public declaration of her lesbian identity. As a world-famous athlete and public figure, she handled the media attention with pride and poise and raised public awareness of and sensitivity to issues concerning equal rights and women's rights. Another great female professional of the late 1980's was Steffi Graf of Germany, who won eight grand slam singles titles between 1987 and 1989. In 1988, she had one of the greatest years in the history of tennis, winning all four grand slam singles tournaments as well as the gold medal at the 1988 Olympic Games. Historians refer to this accomplishment as the “golden slam.” Hall of Famers Many great players were inducted into the International Tennis Hall of Fame during the 1980's, including Ken Rosewall and Lew Hoad in 1980, Rod Laver in 1981, Charlotte Dod in 1983, Arthur Ashe in 1985, John Newcombe in 1986, and Billie Jean King and Björn Borg in 1987. Ashe, a three-time grand slam singles champion, is a special figure in sports history, as he is the greatest male African American tennis player of all time and was a leader of the world civil rights movement. He was especially active in the antiapartheid movement in South Africa. He learned in 1988 that he had contracted human immunodeficiency virus (HIV) through blood transfusions during two major heart surgeries, a condition he announced publicly in 1992. Ashe's personal battle with acquired immunodeficiency syndrome (AIDS) increased world awareness about the disease and brought attention to the fact that it did not affect only homosexuals. Ashe also emphasized the importance of developing valid methods of testing donated blood. Impact Tennis in the 1980's saw great performances by a number of stars, from the fiery John McEnroe to the greatest player of the decade, Martina Navratilova. Increased sports coverage by television networks, as well as notable rivalries such as those between Navratilova and Evert and between Borg and McEnroe, added to the popularity of the sport.
Further Reading
Collins, Bud. Total Tennis, Revised: The Ultimate Tennis Encyclopedia. Toronto: Sport Classic Books, 2003. Provides year-by-year (1919-2002) comprehensive statistics of the top professional players and the grand slam tournaments. Includes a biographical section of the best players. McEnroe, John, and James Kaplan. You Cannot Be Serious. New York: Penguin Books, 2003. The tennis great reflects on his colorful professional career and personal life. Parsons, John. The Ultimate Encyclopedia of Tennis: The Definitive Illustrated Guide to World Tennis. London: Carlton Books, 2007. Provides a history of the sport, its players, tournaments, and controversies. Alan Prescott Peterson See also
McEnroe, John; Navratilova, Martina;
Sports.
■ Terminator, The Identification American science-fiction film Director James Cameron (1954-    ) Date Released October 26, 1984
Representing a pessimistic view of technology, The Terminator contrasted with the positive view of technology in the films of the original Star Wars trilogy and the Star Trek: The Next Generation television series. Coupling nuclear war and computerized defense systems with the emergence of intelligent computers, the film reflected the technological and political situation of the mid-1980’s—the increasing presence of computers and President Ronald Reagan’s Strategic Defense Initiative at the height of the Cold War. The story line of The Terminator (1984) focuses on a cyborg assassin (Arnold Schwarzenegger) created by Skynet, an artificially intelligent computer defense network that developed self-awareness and attempted to annihilate humanity by starting World War III. Skynet built cyborgs to infiltrate the few surviving camps of humans. The film’s title character is sent back in time from the postapocalyptic world of 2029 to kill Sarah Connor (Linda Hamilton), who is to become the mother of the leader of the human rebellion. From the future, her son John sends Kyle Reese (Michael Biehn) to protect her from the human-like assassin.
Machine imagery pervades the film. Appearing frequently are machines such as household devices, including an answering machine that exhorts callers to be nice to it; motorcycles and trailer trucks; and an automated factory where Sarah manages to crush the terminator in a press. In a larger sense, this story reflects contemporary concerns about the progression from human dependence on machines to domination by them. The extreme masculine physique of the terminator represents military technology gone awry. The film uses common science-fiction themes, notably artificial intelligence. The terminator acts in a conscious, purposive manner, yet its alien nature is reflected by the images of internal, decision-making screens. Furthermore, the terminator has no emotions and cannot be reasoned with. Another theme is time travel, including a variation of the grandfather paradox, as John Connor sends Reese back in time to impregnate Sarah and thus to become John's father. The Terminator garnered generally favorable reviews and became an unexpected sleeper hit. Shot on the relatively small budget of $6.5 million, it had collected $38.4 million by the end of its first run. During 1985, it was one of the most popular rental videos. Impact The Terminator was the first major directing effort by James Cameron, who went on to direct other highly successful films (notably, 1997's Titanic). The film solidified the acting career of Schwarzenegger and also led to major roles for Hamilton and Biehn. Furthermore, it inspired a number of other science-fiction films, including RoboCop (1987).
Hollinger, Veronica, and Joan Gordon, eds. Edging into the Future: Science Fiction and Contemporary Cultural Transformation. Philadelphia: University of Pennsylvania Press, 2002. Kozlovic, Anton Karl. “Technophobic Themes in Pre-1990 Computer Films.” Science as Culture 12, no. 3 (2003): 341-372. Telotte, J. P. Replications: A Robotic History of the Science Fiction Film. Urbana: University of Illinois Press, 1995. Kristen L. Zacharias See also Blade Runner; Computers; Cyberpunk literature; Empire Strikes Back, The; Film in the United
States; RoboCop; Robots; Schwarzenegger, Arnold; Science-fiction films; Star Trek: The Next Generation; Strategic Defense Initiative (SDI); Tron.
■ Terms of Endearment Identification American film Director James L. Brooks (1940-    ) Date Released November 23, 1983
Successfully mixing sentiment and humor, the film provided a showcase for the principal actors and a mirror reflecting many of the preoccupations of the decade. Terms of Endearment traces the life of a single mother, Aurora Greenway (Shirley MacLaine), as she raises her daughter Emma (Debra Winger), fights with her over her marriage (she hates her future son-in-law Flap Horton, played by Jeff Daniels) and her pregnancies (she has to face up to aging when she becomes a grandmother), and finally rec-
onciles with her as Emma succumbs to cancer (she must assume the responsibility for raising her three grandchildren). It is a relationship that does not gloss over the tensions of a single-parent household. Living next door is womanizer Garrett Breedlove (Jack Nicholson), a former astronaut who gradually has to face his own aging and develops a relationship with Aurora despite her age and the fact that she becomes a grandmother—twice. American movies had been reluctant to portray romantic situations involving older actresses, and MacLaine established the first of her feisty older women roles in this film, a trademark she continued to exploit. The emotional centerpiece of the film is the series of scenes showing the gradual deterioration of Emma as she fights her cancer. It is a sequence that culminates in the farewell scene between Emma and her two young sons and little daughter. Winger was nominated for an Academy Award for Best Actress, and although she did not win, her performance in these scenes was undoubtedly the reason the film
From left: James L. Brooks, Shirley MacLaine, and Jack Nicholson celebrate their Oscar victories for Terms of Endearment at the April 9, 1984, Academy Awards ceremony. (AP/Wide World Photos)
did so well at the Academy Awards. Terms of Endearment was something of a surprise hit in 1983 when it won a significant number of Oscars. The film beat out The Big Chill, The Dresser, The Right Stuff, and Tender Mercies for Best Picture, and James L. Brooks, directing his first picture, won Best Director, beating out a formidable lineup of veterans: Bruce Beresford, Mike Nichols, Peter Yates, and even Ingmar Bergman. MacLaine won Best Actress and Nicholson won Best Supporting Actor. Brooks also won Best Adapted Screenplay for his adaptation of Larry McMurtry's novel. A sequel, The Evening Star (1996), allowed MacLaine to reprise her eccentric character Aurora Greenway as she raises the grandchildren left behind after their mother's death. Impact Terms of Endearment struck a chord with movie audiences in the early 1980's with its depiction of romance between older characters Aurora Greenway and Garrett Breedlove, its exploration of a single-parent family, and its heartfelt rendering of the death of a young mother who must leave her children. Further Reading
Evans, Peter William, and Celestino Deleyto, eds. Terms of Endearment: Hollywood Romantic Comedy in the 1980's and 1990's. Edinburgh: Edinburgh University Press, 1998. McMurtry, Larry. Terms of Endearment. New York: Simon & Schuster, 1975. Speidel, Constance. “Whose Terms of Endearment?” Literature/Film Quarterly 12, no. 4 (1984): 271-273. Charles L. P. Silet See also Abortion; Academy Awards; Big Chill, The; Feminism; Film in the United States; Nicholson, Jack.
■ Terrorism Definition
Acts of violence committed by individuals or groups seeking to influence public opinion or public policy
Although incidents of terrorism within the United States and Canada during the 1980's were fewer in number and lesser in overall effect than those of the two previous decades, several notable incidents occurred. According to the Federal Bureau of Investigation (FBI), there were at least two hundred terrorist incidents
in the United States during the 1980's; the largest single type of incident involved bombing, which made up seventy-eight of the incidents. Other types of incidents included armed robbery, a rocket attack, sniper attacks, arson and other property destruction, and assassinations. While some terrorist groups involved in these attacks had been active since the 1960's and 1970's, a few new organizations grabbed headlines with their armed activities, including the white supremacist Christian Identity movement and neo-Nazi groups. In addition, certain leftist groups re-formed under different names and with different personnel. Canadian Incidents Canada experienced fourteen acts of terrorism during the 1980's. Three were attributed to groups pushing for a Sikh nation separate from India. Three others were attributed to Armenian nationalists targeting Turkish government representatives. One incident was committed by a former U.S. Air Force officer protesting the visit of Pope John Paul II. The officer's bombing attack resulted in the deaths of three people in Montreal's Central Station. Another incident involved a man from the Canadian Armed Forces who was opposed to the Parti Québécois's desire for Quebec to secede from Canada. The man attacked the Quebec parliament building, killing three individuals. The anti-Castro group Omega 7 bombed the Cuban consulate in Montreal in 1980. On October 14, 1982, the left-wing anarchist group Direct Action bombed a Toronto Litton factory that manufactured triggers for U.S. military cruise missiles. Direct Action was also responsible for the bombings of three Vancouver pornographic bookstores and a hydroelectric substation on Vancouver Island, British Columbia. Although Canada was the site of far fewer incidents than the United States, the bombing of an airplane departing from Montreal Airport was one of the deadliest incidents of the decade. On June 23, 1985, Air India Flight 182 was flying over the Atlantic Ocean when a bomb exploded, killing all 329 people on board. Canadian police believed that the leader of the terrorist attack was Talwinder Singh Parmar, head of the militant Sikh separatist group Babbar Khalsa. Domestic Organizations
Foremost among terrorist organizations in terms of numbers of attacks were a number of groups fighting for the independence
of Puerto Rico from the United States. The best known of these groups were the Fuerzas Armadas de Liberación Nacional (FALN), the Macheteros, and the Organization of Volunteers for the Puerto Rican Revolution. Although usually leftist in their politics, the groups' overriding concern was Puerto Rican independence. According to the FBI, these groups were responsible for eighty-eight of the known terrorist incidents in the United States during the 1980's. Their primary modus operandi was bombings and armed robberies. Other leftist groups involved in terrorism in the United States during this period included the May 19 Communist Organization; the Black Liberation Army (BLA), an offshoot of the defunct Black Panther Party; the United Freedom Front (UFF); the Armed Resistance Unit (ARU); and the Red Guerrilla Resistance. The UFF and ARU shared some members, and most observers believe that the Red Guerrilla Resistance was made up of members of the May 19 Communist Organization. The militant Zionist group the Jewish Defense League (JDL) and its offshoot, the Jewish Underground, were also involved in terrorism during the 1980's. At least twenty acts of terrorism in the United States were attributed to the JDL. Many of these acts targeted Soviet diplomatic facilities and businesses because of Moscow's alleged mistreatment of its Jewish citizens. Other nations whose facilities were targeted by the JDL included Syria, Iran, and other Middle Eastern nations opposed to Israel. The actions were primarily bombings, including two bombs detonated at the Washington, D.C., office of the Soviet airline Aeroflot on February 21, 1982. Offshoots of the left-wing Weather Underground were linked to one of the most newsworthy terrorist crimes in the early part of the decade. The incident was an armored truck robbery that the May 19 Communist Organization carried out with the BLA on October 20, 1981, outside Nyack, New York. A Brink's guard and two policemen were killed during the robbery, and one BLA member died in the subsequent manhunt. Former Weather Underground members were also among the membership of the UFF and the ARU. Right-wing groups involved in terror incidents included the Cuban exile group Omega 7. Most of Omega 7's actions were bombings of diplomatic buildings owned by governments that had official relations with Cuba and businesses owned by Cubans unsympathetic to the anti-Castro exiles. At least
thirty acts of terrorism committed in the United States during the decade were attributed to groups that the FBI labeled as “anti-Castro Cuban,” and most of these acts were carried out by Omega 7. Omega 7 had an unusual past, as some of its members had links to the U.S. Central Intelligence Agency (CIA) dating back to the failed attempt by the U.S. government to overthrow the revolutionary government of Cuba in 1961. The Order was a neo-Nazi organization dedicated to what members called the “preservation of the Caucasian race” and was active in 1983 and 1984. Members of the organization were involved in burning down a synagogue and a church whose minister opposed them and in bombing a pornography shop. In addition, Order members were convicted of murdering Alan Berg, a Denver radio talk show host opposed to the neo-Nazi doctrine; a policeman; and a fellow member whom they suspected of disclosing their activities. The most spectacular of the group's actions was the $3.7 million robbery of a Brink's armored truck. Two other right-wing terrorist organizations were The Covenant, the Sword and the Arm of the Lord (CSA) and the Order II. CSA is known to have committed two acts of arson in 1983, setting fire to a Jewish community center and to a church that supported gay rights. In addition, the group set off an explosive near a natural gas pipeline in an attempt to disrupt natural gas distribution throughout the Midwest. The Order II was organized after Order leader Robert Jay Mathews was killed in a shootout with FBI agents on Whidbey Island, Washington. Based in Idaho near Richard Butler's Aryan Nations church compound, this group was responsible for seven bombings. All seven members of the group were arrested during an attempt to rob three banks simultaneously. Organizations with Foreign Links In the United States, several terrorist acts were committed by groups with grievances tangentially related to the U.S. government, including the Armenian Revolutionary Army and Justice Commandos for the Armenian Genocide. These two groups, believed by most authorities to be composed of the same members, supported Armenian independence and sought justice for the early twentieth century Armenian genocide carried out by Turkey. Acts of terrorism by the
Police at the scene of the notorious Brinks armored truck robbery of 1981. The robbery, carried out by left-wing extremists, resulted in the deaths of two police officers and one member of the Black Liberation Army. (AP/Wide World Photos)
two groups included the August 27, 1982, assassination of the Turkish military attaché to Canada in Ottawa. This murder was preceded by the assassinations of the Turkish consul general in Los Angeles, California, on January 28, and the honorary Turkish consul general of New England in Somerville, Massachusetts, on May 4. Other incidents claimed by these organizations were bombing attacks on shops owned by Turkish government officials and Turkish diplomatic facilities. A Croatian separatist organization known as the Croatian Freedom Fighters was responsible for four bombings in 1980 and 1981, including a pipe bombing of the New York State Supreme Court building in Manhattan on January 23, 1981, two attacks on businesses owned by Yugoslavian officials, and a car bomb targeting the Washington, D.C., home of Yugoslavia’s chargé d’affaires. An unusual terrorist incident occurred in Washington, D.C., on August 7, 1981, when twenty-four
members of the People’s Mujahideen of Iran invaded the Iranian interests section of the Algerian embassy and took six people hostage. The siege lasted one hour. An individual act of terrorism occurred in Richmond, Virginia, on September 14, 1988, when a young armed Lebanese man frustrated with the civil war in his country took over a military recruiting office and demanded that a statement about the Lebanese situation be read over two local radio stations. He surrendered peacefully after six hours. Impact On November 7, 1983, the U.S. Senate was bombed by the ARU. Though no one was injured in the attack, the reaction by officials in Washington was quick. Security was heightened around government buildings, and federal funding increased for agencies involved in security and the surveillance of political activists of all persuasions. Much of the government reaction was related to recent terrorist inci-
dents overseas (in particular the October 23, 1983, bombing of the Marine barracks in Beirut that killed 241 American servicemen), especially those committed by or attributed to Palestinian and Islamic groups, some of which were believed to be sponsored by the governments of Libya and Iran. Security enhancements included the erection of concrete barriers around government buildings, the curtailing of parking around potential targets, and enhanced identification requirements for entrance to government buildings, airports, and other locations that might attract terrorists. Federal law enforcement agencies increased efforts to pass tougher legislation that would focus on prosecuting terrorists and their supporters. The primary change in Canadian governmental responses to terrorism was the creation of the Canadian Security Intelligence Service (CSIS), an agency involved in gathering intelligence on perceived threats to Canadian security.
Burns, Vincent, and Kate Dempsey Peterson, eds. Terrorism: A Documentary and Reference Guide. Westport, Conn.: Greenwood Press, 2005. An easy-to-use account of the past forty years of terrorism in the United States, with particular emphasis on the rise of terrorism related to the Middle East. The text contains more than seventy essays and documents discussing and detailing the issue of terrorism and the United States. Hansen, Ann. Direct Action: Memoirs of an Urban Guerrilla. Oakland, Calif.: AK Press, 2002. A first-person account by one of the members of the Canadian leftist anarchist group convicted of bombing the Litton Systems factory in Toronto. Woodger, Elin, and David F. Burg. The 1980's. New York: Facts On File, 2006. An encyclopedic survey of the events of the 1980's written for a high school audience. Ron Jacobs See also
Air India Flight 182 bombing; Anderson, Terry; Beirut bombings; Berg, Alan; Canadian caper; Crime; Iranian hostage crisis; Klinghoffer, Leon; Libya bombing; Pan Am Flight 103 bombing; Skinheads and neo-Nazis; Tylenol murders; U.S. Senate bombing; West Berlin discotheque bombing.
■ Theater Definition
Significant stage presentations on Broadway and across the United States
Major trends in theater, such as issues concerning women's rights and gay and lesbian rights, came to fruition in the 1980's. Professional theater strengthened its position across the United States, and there emerged new playwrights, actors, directors, and designers who included women, African Americans, and Asian Americans. The 1980's saw the maturation and realization of trends in American theater that had been developing over the previous two decades. Among the most important of these were the emergence of significant female, Asian American, and African American theatrical artists and the recognition of the issues surrounding gay and lesbian individuals in American society. Most important, perhaps, was the strengthening of professional theater across the United States, so that American theater was no longer simply Broadway theater. Theaters Across the United States Until the 1960's, theater in the United States meant almost exclusively theater on Broadway, but a movement away from Broadway commenced in that decade and reached maturation in the 1980's. Off-Broadway theaters not only were located away from Times Square playhouses but also were smaller in seating capacity. Thus, the Actors' Equity Association allowed the performers to be paid less than their Broadway counterparts. In Off-Broadway houses, such as the Roundabout Theatre and the Manhattan Theatre Club, new and experimental works were performed, as were presentations by the Pan Asian Repertory Theatre and the Negro Ensemble Company. Soon, playhouses seating one hundred or fewer were applying for even lower Actors' Equity Association rates, and these theaters took the designation Off-Off-Broadway. Among the interesting Off-Off-Broadway theaters was Ellen Stewart's La MaMa Experimental Theatre Club, which in the 1980's presented new experimental comedy and performance art. By the 1980's, there were important nonprofit professional repertory theaters across the United States, such as the Yale Repertory Theatre, the American Repertory Theatre at Harvard, the Mark Taper Forum in Los Angeles, the Actors Theatre of Louisville, Kentucky, and even two repertory theaters in
Atlanta, Georgia: the Alliance Theatre and Theatre in the Square. At the end of the decade, more than two hundred such theaters existed, and many of the playwrights who would become known as leaders in their field began in regional, Off-Broadway, or Off-Off-Broadway theaters. For example, Beth Henley's Crimes of the Heart (pr. 1979), winner of the 1981 Pulitzer Prize for Drama, premiered at the Actors Theatre of Louisville before moving to Broadway. Women Playwrights and Feminist Theater
Chief among those who commenced their careers outside Broadway were a number of important female playwrights in addition to Henley—such as Marsha Norman, Tina Howe, and Wendy Wasserstein—two of whom won the Pulitzer Prize for Drama during the 1980's: Norman, for 'night, Mother (pr. 1983), and Wasserstein, for The Heidi Chronicles (pr. 1988). In 1983, Howe won a collective Obie Award, the Off-Broadway award, for her overall contribution to dramatic literature. In various ways, these playwrights and other women playwrights of the decade, such as two-time Obie Award-winner Corinne Jacker, took up issues involved in the lives of women and in the emerging women's rights movement. María Irene Fornés's The Conduct of Life (pr. 1985) examined Chicano experiences as women attempted to deal with male dominance. To explore the ongoing interest in women's rights, the Women's Experimental Theatre presented plays investigating the role of women in Western patriarchal families. In 1980, Lois Weaver and Peggy Shaw introduced Split Britches, based on the lives of Weaver's aunts in the Virginia mountains, at the Women's One World Festival of feminist theater held in New York City. In 1982, the Split Britches Company founded the Women's One World (WOW) Café in New York City, dedicated to producing works by and for women. Along with collaborator Deb Margolin, the Split Britches Company presented several important feminist plays at WOW Café during the 1980's. Women also claimed leadership in other ways. In 1982, Ellen Burstyn was elected the first female president of the Actors' Equity Association, followed by Colleen Dewhurst in 1985. In the same year, Heidi Landesman became the first female designer to win a Tony Award, for her scenery for Big River (pr. 1985). African Americans and Asian Americans
In addition to the emerging women playwrights, the 1980's saw the rise of two important Asian American playwrights:
Philip Kan Gotanda, author of such works as Yankee Dawg You Die (pr. 1989), and David Henry Hwang, who won the 1988 Tony Award for the Broadway hit M. Butterfly. African American dramatists were led into the 1980's by Ntozake Shange, August Wilson, and Charles Fuller, winner of the 1982 Pulitzer Prize for Drama for A Soldier's Play (pr. 1981). It is August Wilson, whose work was first produced by Yale Repertory Theatre under the direction of Lloyd Richards, whom many consider to be the most significant playwright of contemporary theater. In 1987, he won the Pulitzer Prize for Drama for Fences (pr. 1985), starring Tony Award-winner James Earl Jones. Wilson's other plays of the 1980's include Ma Rainey's Black Bottom (pr. 1984), The Piano Lesson (pr. 1987), and Joe Turner's Come and Gone (pr. 1986), all of which continue to be produced throughout the United States. Gay and Lesbian Issues
Feminist theater groups such as Split Britches and WOW Café considered issues of lesbianism, as in the WOW Café 1985 productions of Alice Forrester's Heart of the Scorpion (pr. 1984) and Holly Hughes's The Well of Horniness (pr. 1985). In 1983, Harvey Fierstein won the Tony Award for Best Play for Torch Song Trilogy (pr. 1982), three one-act plays examining homosexual issues through the evolving life of a Jewish drag queen. That year also saw the hit drag musical La Cage aux Folles, based on the 1973 French play by Jean Poiret. Ultimately, however, more serious presentations of gay men would be made, as in Lanford Wilson's Burn This (pr. 1987). The gay experience turned extremely dark in the 1980's with the advance of the AIDS epidemic, which was treated seriously in 1985 in William M. Hoffman's drama As Is and Larry Kramer's The Normal Heart.
Musical Theater
The most interesting phenomenon in musical theater of the 1980’s was the maturation of a type of artist developed in the 1970’s: the director-choreographer. It was this individual who created both the dance and the overall artistic statement of the musical, often including the story line as well. When the latter was the case, the work was referred to as a “concept musical,” perhaps best exemplified by A Chorus Line, created by Michael Bennett in 1975, which ran throughout the 1980’s and was revived in the early twenty-first century. Bennett created Dreamgirls (pr. 1981) and The Tap Dance Kid (pr. 1983). He was preceded as a director-choreographer
by the famous Bob Fosse, creator of Chicago (pr. 1975), whose concept musical Big Deal opened in 1986. Another such artist was Tommy Tune, who achieved top rank as a director-choreographer with Nine (pr. 1982) and followed with the hit Grand Hotel (pr. 1989). Not known as a choreographer but as a frontline director of musical theater was Hal Prince. Prince became well known with his work on concept musicals he produced in the 1970's with the composer-lyricist Stephen Sondheim. Sweeney Todd, perhaps their best-known work, starring Angela Lansbury, opened in 1979 and continued its run into the 1980's. Their last joint production was Merrily We Roll Along (pr. 1981). In the 1980's, Prince devoted himself to importing and staging English musicals such as Andrew Lloyd Webber's The Phantom of the Opera (pr. 1986). Indeed, it was the English import that came to dominate the American musical theater scene in the late 1980's. Joining The Phantom of the Opera, which became the longest-running musical in Broadway history, was Lloyd Webber's Cats, inspired by T. S. Eliot's Old Possum's Book of Practical Cats (1939), which opened in the United States in 1982 and became the second-longest-running musical in Broadway history.
New and Established Theater Artists
David Mamet, who continues to be an important dramatist in the early twenty-first century, was introduced in the 1980's. Mamet's powerful work on the lack of ethics of some businessmen, Glengarry Glen Ross, won the 1984 Pulitzer Prize for Drama. He followed in 1988 with Speed-the-Plow. Mamet was joined by emerging playwrights Lanford Wilson and A. R. Gurney. Gurney contributed five plays during the 1980's, his two most successful being The Dining Room (pr. 1981) and Love Letters (pr. 1988). Wilson opened the 1980's with the Pulitzer Prize for Drama for Talley's Folly (pr. 1979). In addition to Angels Fall (pr. 1983) and Burn This, his Hot L Baltimore (pr. 1973) ran throughout the decade. Sam Shepard, Edward Albee, and Neil Simon were a trio of established playwrights who found success in the 1980's. Shepard had two important 1980's plays, True West (pr. 1980) and A Lie of the Mind (pr. 1985). Albee contributed Marriage Play (pr. 1987), and Simon had successes with They're Playing Our Song (pr. 1979), Brighton Beach Memoirs (pr. 1983), Biloxi Blues (pr. 1985), and Broadway Bound (pr. 1986). In speaking of established playwrights, it should be noted that one of the twentieth century's most prominent American dramatists, Tennessee Williams, died in 1983. The number of talented new actors on Broadway in the 1980's was impressive. They included Kevin Kline, Bernadette Peters, Mandy Patinkin, Glenn Close, Stockard Channing, and Swoosie Kurtz, many of whom went on to distinguished film careers. Joining these stars were three impressive talents who returned from filmmaking to grace Broadway productions: Meryl Streep, Kathleen Turner, and Dustin Hoffman. In addition to performers, there emerged new designers to join the ranks of established artists. Freddy Wittop, famed for his costume designs for Hello, Dolly! (pr. 1964), made his last contribution to Broadway in 1986's The Three Musketeers. The reins were taken up by William Ivey Long, who received the 1982 Tony Award for his costume designs for Nine. Long was joined by scene designer John Lee Beatty, who, by the mid-1980's, had designs for six shows running simultaneously. Broadway designers were not all men, however. Scenery and costume designer Heidi Landesman contributed designs for four Broadway productions. Alternative Theater
In addition to professional productions in New York City and across the United States, interesting forms of alternative theater matured during the 1980's. The best-known director creating alternative theater is Robert Wilson, who mixes various forms of media and live theater. In 1985, he moved his work CIVIL warS from Europe to the United States. Wilson's works do not have plots, and they take place in slow motion so that the sense of time is altered and the audience is assaulted with nonstop images and sound effects. Another aspect of alternative theater involves performers who spend an entire evening presenting autobiographical information. Such artists include Laurie Anderson, Spalding Gray, Lily Tomlin, and Whoopi Goldberg. Some performance artists went beyond autobiography and included the audience in their work. This technique might range from chatting to traditional song and dance to dangerous activity such as cutting the artist with a knife or sticking him or her with pins. An important alternative theater group, Mabou Mines, presented Dead End Kids: A History of Nuclear Power in 1980. The production intermixes text, such
as scientists’ diaries, with government films and excerpts from opera and night club acts to present a surrealistic image of atomic nightmares. Another such group presenting surrealistic productions was Richard Foreman’s Ontological-Hysteric Theatre, which offered four productions in the 1980’s. Impact The 1980’s signaled a coming of age of the theater in the United States. Gay and lesbian issues and the matter of women’s rights were thoroughly and openly explored. Female playwrights and designers made considerable impact, as did African and Asian American artists. New writers, performers, and other theater artists appeared who would lead the theater into the twenty-first century. Moreover, successful professional theaters were located not only Off-Broadway but also in most major U.S. cities. Further Reading
Brockett, Oscar G., and Franklin J. Hildy. History of the Theatre. 9th ed. Boston: Allyn & Bacon, 2002. An excellent history of the theater from its beginnings, with a general section on theater in the United States since 1968. Wilmeth, Don B., and Christopher Bigsby, eds. Post-World War II to the 1990's. Vol. 3 in The Cambridge History of American Theatre. New York: Cambridge University Press, 2000. A collection of essays by recognized experts detailing every aspect of American theater since the 1940's. Wilmeth, Don B., and Tice L. Miller, eds. The Cambridge Guide to American Theatre. New York: Cambridge University Press, 1993. Contains more than two thousand entries covering people, places, venues, and subject matter from the beginnings of American theater to the early 1990's. August W. Staub See also
African Americans; AIDS epidemic; Art movements; Asian Americans; Broadway musicals; Close, Glenn; Feminism; Heidi Chronicles, The; Henley, Beth; Hoffman, Dustin; Homosexuality and gay rights; Hwang, David Henry; Literature in Canada; Literature in the United States; Mamet, David; Performance art; Phantom of the Opera, The; Shepard, Sam; Streep, Meryl; Torch Song Trilogy; Turner, Kathleen; Wilson, August.
■ Third Wave, The Identification
The second book in a trilogy on the process, directions, and control of technological and social changes Author Alvin Toffler (1928) Date Published in 1980 At a time of bewildering changes and societal upheavals, Toffler argued that industrialized countries were in the birth throes of a new knowledge-based civilization. In 1970, Alvin Toffler published Future Shock, whose depiction of individuals and organizations overwhelmed by accelerating technological and societal changes helped to define the 1970’s, and in 1990 he published Powershift, in which he foretold a future in which companies as well as countries would split into opposing power centers based on different “wealth creation systems.” Sandwiched between these two books was The Third Wave, in which he expanded and deepened ideas he had introduced in Future Shock, and in which he prepared the ground for the changes in power structures that he analyzed in Powershift. The primary focus of The Third Wave is on a new civilization that he foresaw emerging out of industrial civilization. In Future Shock, he called this new civilization “super-industrial society,” but in his new book he eschewed this term for the “Third Wave.” He was not the first to use the metaphor of a wave for radical societal change, but he claimed that he was the first to apply it to the civilizational shift occurring in the 1980’s. According to Toffler’s framework, the “First Wave” began about ten thousand years ago when societies based on domesticated plants and animals replaced hunter-gatherer cultures. The “Second Wave” is Toffler’s term for what traditional historians have called the Industrial Revolution, associated with the mass production, mass distribution, and mass consumption of goods. It took thousands of years for the First Wave to decline, whereas the Second Wave played itself out in about three hundred years (1650 to 1950). Toffler believes that the Third Wave will grow, crest, and decline in several decades rather than centuries or millenniums. Humans in the 1980’s had difficulty perceiving this Third Wave because they were not yet in it but in a transition between the Second and Third Waves. Consequently, Toffler’s description of the Third Wave is more foretelling
than observing, though he does extrapolate from his knowledge of contemporary cultures at the dawn of the 1980’s. What will the world look like after this global revolution is complete? New ways of life will exist, fueled by diversified, renewable energy sources. The customized production and distribution of goods and services will replace assembly lines and corporate control of marketing. Communications will be “demassified” and replaced by person-to-person contacts via computers. The nuclear family will be enhanced by a kaleidoscopic variety of new and different interpersonal relationships. Even human identity will change, for Third Wave societies will be more heterogeneous than Second Wave societies, with many varying racial, ethnic, and religious subgroupings. People who readily adapt to changes will prosper in Third Wave economies. New forms of political organization will arise that transcend the traditional nation-state. Finally, monetary wealth will be superseded by knowledge as the determinant of power, and this theme would be extensively developed in the last book of Toffler’s trilogy.
Impact Unlike many social critics and science-fiction writers, Toffler was optimistic that accelerating scientific, technological, economic, and cultural changes would, on the whole, be liberating and beneficial for humanity, and he was prescient about the role that knowledge would play in a future information age. Like Future Shock, The Third Wave was a worldwide best seller and influenced many people and organizations. For example, the book had a direct influence on such American politicians as Newt Gingrich and on American military leaders who embraced such Third Wave doctrines as flexibility and decentralization. The book also influenced liberation movements in Poland and China. Toffler's book also affected analysts who were hypothesizing a "Fourth Wave" associated either with "ecoglobalism" or the human conquest of outer space, but he himself stated that his trilogy was complete, with no Fourth Wave analysis to come.
Further Reading
Toffler, Alvin. Future Shock. New York: Random House, 1970.
_______. Powershift: Knowledge, Wealth, and Violence at the Edge of the Twenty-First Century. New York: Bantam Books, 1990.
Toffler, Alvin, and Heidi Toffler. Creating a New Civilization: The Politics of the Third Wave. Atlanta: Turner, 1995.
Robert J. Paradowski
See also
Agriculture in the United States; Alternative medicine; Business and the economy in the United States; Computers; Education in the United States; Europe and North America; Genetics research; Information age; Inventions; Science and technology.
■ thirtysomething Identification American television drama Date Aired from 1987 to 1991
With its sensitive writing and introspective performances, thirtysomething focused on a group of baby boomers as they dealt with issues intrinsic to growing up. Considered overindulgent by some and groundbreakingly honest by others, the show was the first television drama of its kind.
Married actors Ken Olin and Patricia Wettig, whose characters are married to other characters on thirtysomething, arrive at the 1988 Emmy Awards, where Wettig won an Emmy as Best Supporting Actress in a drama series for her work on the show. (AP/Wide World Photos)
Creators Edward Zwick and Marshall Herskovitz peopled their fictional Philadelphia with seven main characters: Michael Steadman (played by Ken Olin), the sensitive Jewish advertising executive always trying to make sense of his world and become a better man; Hope Murdoch Steadman (Mel Harris), his Protestant wife, struggling with first-time child rearing; Elliot Weston (Timothy Busfield), Michael's business partner whose perpetual selfishness nearly destroys his marriage and forces him to change his life; Nancy Krieger Weston (Patricia Wettig), Elliot's wife, who discovers, through their separation, her new career as an artist and ultimately her cancer and her identity separate from her family; Melissa Steadman (Melanie Mayron), Michael's photographer cousin who is continually in and out of therapy, trying to work out her issues with men; Ellyn Warren (Polly Draper), Hope's best friend from high school who has chosen her career over a family life; and Gary Shepherd (Peter Horton), Michael's hippie college friend, now a college English professor, who criticizes the bourgeois lives of his friends while refusing to grow up. The lives and loves of these seven characters formed the plot of the show for the four seasons it aired on the American Broadcasting Company (ABC). The writing on the show, touted by some critics as some of the best writing ever seen on television and criticized by others for being either too sophisticated or too "whiny," earned the show Emmy nominations for each year it was on the air, two of which it won. The show also garnered numerous nominations and wins in acting, directing, and technical categories. Thirtysomething tackled several serious topical issues not previously featured on network television, including homosexuality, AIDS, second-wave feminism, divorce, and a long and detailed look at cancer. It also treated religion, sex, parenting, and friendship as the most important parts of a person's life, things that require profound and constant attention.
Impact Though it aired only for four seasons, thirtysomething had a lasting impact on television and culture. The show focused on average people dealing with normal life events, a turn in television culture viewed by some as overindulgent and by others as the first move into making television a more serious and profound medium.
Further Reading
Heide, Margaret J. Television Culture and Women's Lives: "Thirtysomething" and the Contradictions of Gender. Philadelphia: University of Pennsylvania Press, 1995. Thompson, Robert J. Television's Second Golden Age. Syracuse, N.Y.: Syracuse University Press, 1997. Lily Neilan Corwin See also
AIDS epidemic; Big Chill, The; Feminism; Homosexuality and gay rights; Jewish Americans; Marriage and divorce; Religion and spirituality in the United States; Television; Yuppies.
■ This Is Spinal Tap Identification American comedy film Director Rob Reiner (1945) Date Released March 2, 1984
Reiner’s debut independent film was one of the first “mockumentaries,” a deadpan, right-on-target satire on the excesses and outright silliness of the heavy metal rock-and-roll scene of the 1980’s. In the mock documentary This Is Spin¨al Tap, Rob Reiner plays director Marty DiBergi, who is making a documentary concert film on the heavy metal group Spin¨al Tap, distinguished as “one of England’s loudest bands.” The band was supposedly formed in 1976 and is making a comeback tour across the United States in 1982. The film pretends to be a rock documentary (like Martin Scorsese’s 1978 The Last Waltz, about the last concert of the Band, or Michael Lindsay-Hogg’s 1970 Let It Be, about the impending breakup of the Beatles) but portrays the stupidity, hedonism, and blind following of rock bands of this era, combining the satiric targets of early heavy metal bands such as Led Zeppelin and Black Sabbath and later 1980’s glam rock and theatrical metal bands such as Kiss, Megadeth, W.A.S.P., and Mötley Crüe. The film’s three lead actors, Christopher Guest, Michael McKean, and Harry Shearer, helped Reiner script and improvise the scenes, and the three also helped compose the music they played in the film. The first half of the film accurately pokes fun at the musical styles the band goes through, the television media spots, the difficulty of finding hotel accommodations on the road (or top billing at gigs), and
the small-minded and self-important record industry people. Though the second half of the film recounts the tour's crash and the band's disintegration, it also warmly tells a story of friendship, change, soul searching, and final reunion. The always-in-character acting style, something like "the method" mixed with deadpan comic improvisation, derives from Peter Sellers's style of acting in Dr. Strangelove (1964) or Being There (1979) and Andy Kaufman's television spots on the comedy variety show Saturday Night Live and foreshadows Sacha Baron Cohen in the film Borat (2006).
From left: Michael McKean, Harry Shearer, and Christopher Guest as Spinal Tap. (Hulton Archive/Getty Images)
Impact Although Reiner later directed other successful comedies such as The Princess Bride (1987) and When Harry Met Sally . . . (1989), This Is Spinal Tap spawned a series of "reality" docucomedies, including a Return of Spinal Tap reunion concert film in 1992. In fact, the three lead actors, Guest, McKean, and Shearer, along with comedians Eugene Levy and Fred Willard, later starred in other mockumentaries directed by Guest: Waiting for Guffman (1996), about a local theater group; Best in Show (2000), about dog shows; A Mighty Wind (2003), about nearly forgotten folk singers; and For Your Consideration (2006), about Hollywood actors and awards.
Further Reading
French, Karl, ed. This Is Spinal Tap: Official Companion. New York: Bloomsbury, 2000.
Maslin, Janet. "Film: This Is Spinal Tap, a Mock Documentary." The New York Times, March 2, 1984.
Occhiogrosso, Peter. Inside Spinal Tap. New York: Arbor House, 1985.
Joseph Francavilla
See also Comedians; Film in the United States; Heavy metal; Mötley Crüe; When Harry Met Sally . . . .
■ Thomas, Isiah Identification Hall of Fame professional basketball player Born April 30, 1961; Chicago, Illinois
Thomas became a superstar NBA player during the 1980's, leading the Detroit Pistons to become one of the league's top teams and finally winning a national championship near the end of the decade.
After leading Indiana University to the National Collegiate Athletic Association (NCAA) Championship in 1981, Isiah Thomas signed a contract to play in the National Basketball Association (NBA) for the Detroit Pistons. In his first season with Detroit, Thomas was a member of the Eastern All-Stars and was also named to the NBA All-Rookie Team. In the 1983-1984 season, Thomas led the Pistons into the NBA playoffs, where they were eliminated by the New York Knicks. Thomas then helped the Pistons reach the Eastern Conference semifinals against the Boston Celtics in 1985, where they lost in six games. During the 1986-1987 NBA campaign, Thomas led the Pistons to the Eastern Conference finals, where they were again eliminated by Larry Bird and the Celtics. In 1988, Detroit made it to the NBA finals for the first time in franchise history. Although Thomas played well, defeat came at the hands of Magic Johnson and the Los Angeles Lakers. In the 1988-1989 season, Thomas guided the Pistons to a 63-19 regular season record and again into the NBA finals. Led by Thomas, the Pistons became NBA champions, sweeping the Lakers in four straight games. During the 1989-1990 campaign, Thomas again led the Pistons to the NBA championship, this time defeating the Portland Trail Blazers. Thomas was named the NBA Finals Most Valuable Player (MVP).
Detroit Piston Isiah Thomas shoots over the head of Chicago Bull Ronnie Lester during a regular-season game on January 12, 1982. (AP/Wide World Photos)
During his illustrious NBA career from 1981 to 1994, Thomas averaged 19.2 points per game; recorded a .452 field goal percentage, a .759 free throw percentage, 9,061 assists, and 1,861 steals; and shot .290 from beyond the three-point line. He was named an NBA All-Star twelve times, a member of the All-NBA First Team three times, and a member of the All-NBA Second Team twice. Thomas was named the NBA All-Star Game MVP in 1984 and again in 1986.
Impact Thomas was one of the best smaller men to play in the NBA. He was known for his superior dribbling ability, his accurate passing, his uncanny ability to score on drives to the basket, and his scrappy, aggressive play. Considered one of the greatest players ever to play for the Pistons, Thomas had his jersey number, 11, retired by the Detroit franchise when he retired from the NBA. In 1987, Thomas was awarded the J. Walter Kennedy Citizenship Award.
Further Reading
Challen, Paul. The Book of Isiah: The Rise of a Basketball Legend. Toronto: ECW Press, 2004.
Kramer, Sydelle A. Basketball's Greatest Players. New York: Random House, 1997.
Thomas, Isiah. The Fundamentals: Eight Plays for Winning the Games of Business and Life. New York: Collins, 2002.
Alvin K. Benson
See also Basketball; Bird, Larry; Johnson, Magic; Sports.
■ Thompson v. Oklahoma Identification U.S. Supreme Court decision Date Decided on June 29, 1988
The Court's ruling in Thompson v. Oklahoma abolished the death penalty for convicts who were aged fifteen or younger at the time they committed their crimes. Capital punishment remained legal for minors older than fifteen.
When he was only fifteen years old, William W. Thompson participated in a brutal murder and was consequently sentenced to death by the State of Oklahoma. The Oklahoma Court of Criminal Appeals supported the trial court’s decision, and the case was appealed to the U.S. Supreme Court. In a 5-3 decision (including a four-justice plurality and a separate concurring opinion), Thompson was spared the death penalty. The plurality opinion, written by John Paul Stevens, based its reasoning upon the “evolving standards of decency of society.” The dissent, written by Antonin Scalia, could not dismiss the notion of a minor potentially being mature and responsible enough for a crime to warrant state execution. Sandra Day O’Connor cast the deciding vote: She wrote in her concurring opinion that Thompson could not be executed, because the Oklahoma law establishing the death penalty for murder did not specify a minimum age of eligibility for receiving that penalty. The legal significance of the case was that capital punishment could no longer be applied to those criminals who were aged fifteen or younger during the commission of their crime. Opponents of the death penalty have historically pursued so-called death penalty exception cases. These are controversial cases in which a characteristic of the accused
murderer could potentially negate the prosecution’s attempt to seek death on behalf of the state. For instance, the 2002 Atkins v. Virginia case established that mentally retarded offenders could not be executed. Typically these “exception” arguments are supported by the Eighth Amendment’s ban on “cruel and unusual punishment.” As in the Atkins case, Thompson was argued on Eighth Amendment grounds and Fourteenth Amendment grounds. Executing a fifteen-year-old was found to be cruel and unusual, and the Fourteenth Amendment applied this clause of the Eighth Amendment to the states. Impact The larger societal issue that the Thompson case raised was the appropriateness of the statesanctioned execution of minors. Under the legal concept of parens patriae, juveniles have traditionally been treated with different rights and obligations than adults. Though this concept has been variously interpreted, it was not unusual for the American justice system to treat children as adults in cases of perpetrating murder. This practice met with both international disdain, as the United States was one of the few countries to permit the practice, and ire among the country’s voters. The Thompson case was the first to limit the practice, hence saving minors fifteen years and younger from the death penalty. This finding was not only popular but also supported by contemporary psychiatric evidence on the reduced culpability of minors resulting from the incomplete maturation of the adolescent brain. However, the execution of sixteen- and seventeen-year-olds continued after Thompson. Further Reading
Fagan, Jeffrey. "Atkins, Adolescence, and the Maturity Heuristic: Rationales for a Categorical Exemption for Juveniles from Capital Punishment." New Mexico Law Review 33 (Spring, 2003): 207-254. Skovron, Sandra Evans, Joseph E. Scott, and Francis T. Cullen. "The Death Penalty for Juveniles: An Assessment of Public Support." Crime and Delinquency 35, no. 4 (1989): 546-561. R. Matthew Beverlin See also
Crime; Supreme Court decisions.
■ Times Beach dioxin scare The Event Pollution incident destroys a town Date 1982-1997 Place Times Beach, Missouri
Dioxin spraying of Times Beach, population 2,240, made it one of the most toxic areas in the United States. The town was so polluted that the U.S. government purchased the entire town and evacuated its residents. During the late 1960's and early 1970's, Russell Bliss, a waste hauler, was hired to oil the dusty roads and horse arenas of Times Beach, Missouri, twenty-five miles southwest of St. Louis, near Interstate 44. Not knowing it was toxic, on May 26, 1971, Bliss mixed carcinogenic dioxin-contaminated waste with two thousand gallons of oil. He sprayed the contaminated oil on the town's roads. During 1982, more than ten years later, the Environmental Protection Agency (EPA) took soil samples at various sites in Missouri to test for dioxin levels and found extremely high levels in the soil at Times Beach. The 2,240 citizens of Times Beach discovered that they were sitting on one of the most toxic patches of earth in the United States. In February, 1983, the EPA announced plans to purchase the entire town for almost $33 million in Superfund moneys and sealed it off. The deserted town was listed as a Superfund site and awaited cleanup. The cleanup would occur in the next decade, during which dioxin-contaminated materials would be incinerated, spreading the dioxin into the atmosphere and dispersing it across the planet. Impact The Times Beach dioxin scare brought further attention to the toxic substances contaminating portions of the country and the planet. It provided some impetus to environmental activists, and it increased the growing sense during the 1980's that the world was a dangerous place, largely as a result of human activity. Within two months after Times Beach shut down its incinerator, dioxin was discovered in the soil of the west St. Louis County suburb of Ellisville. Officials with the EPA said that a private driveway in Ellisville had dioxin levels as high as 195 parts per billion, many times the level it considered safe. In dry soil, a concentration of 50 parts per billion is considered hazardous waste; 3 parts per billion is the standard for edible food.
Authorities began burning the dioxin-contaminated material in Times Beach, Missouri, in the 1990’s. Here, project manager Robert M. Kain is seen near an incinerator used to burn hundreds of thousands of tons of contaminated material. (AP/Wide World Photos)
Further Reading
Johansen, Bruce E. The Dirty Dozen: Toxic Chemicals and the Earth’s Future. Greenwood, Conn.: Praeger, 2003. Mansur, Michael. “After Fifteen Years, Dioxin Incineration at Times Beach, Mo., Is Finished.” Kansas City Star, June 18, 1997. Bruce E. Johansen See also
Air pollution; Environmental movement; Superfund program; Water pollution.
■ Titanic wreck discovery The Event
Marine geologists locate an ocean liner that sank in 1912 Date September 1, 1985 Place Northwest Atlantic Ocean The glamour and tragedy associated with the Titanic fostered public interest in its discovery; technologies used to
find and photograph the ship broke new ground in undersea exploration. In September, 1985, an expedition led by marine geologist Robert Ballard of Massachusetts' Woods Hole Oceanographic Institution found the wreckage of the luxury liner Titanic resting on the bottom of the northwest Atlantic Ocean. The Titanic had collided with an iceberg four days into its maiden voyage, on the evening of April 14, 1912, and sank shortly after 2:00 a.m. on April 15. Although state-of-the-art for its time and considered unsinkable, the Titanic was easily compromised, and it had only enough lifeboats to hold fewer than half its passengers. The sinking claimed over fifteen hundred victims, including several famous and wealthy Americans to whom the Titanic had offered luxurious first-class accommodations. Ballard used the Titanic's popular appeal to raise funds for an expedition to find it. His Woods Hole team was developing submarines, remote-controlled robots, and cameras to investigate deep-sea environ-
ments inhospitable to humans. In the summer of 1985, Woods Hole and the French National Institute of Oceanography (IFREMER) launched a joint search for the Titanic. The French scanned the ocean bottom with sonar (sound waves), seeking large, metallic objects; the Americans investigated further with two vehicles used to take video and still photographs underwater: Argo and the Acoustically Navigated Geological Underwater Survey (ANGUS). IFREMER’s goal was to find the Titanic, while Ballard, partially funded by the United States Navy, was officially testing Argo and ANGUS. The French ship Le Suroit searched from July 5 to August 6 without success. The American ship Knorr resumed the search on August 25, towing Argo and its cameras 12,500 feet beneath the surface. French and American scientists worked in shifts to operate Argo and watch the video. Just before 1:00 a.m. on September 1, Argo began relaying pictures of human-made objects; the crew identified the Titanic when the outline of her circular boilers became visible. Further passes with Argo and twelve thousand color photographs taken by ANGUS would reveal that, although Titanic’s bow and stern were separated by about two thousand feet, they remained upright, and much of the ship was intact. Thousands of objects, including the Titanic’s once-elegant furnishings and its passengers’ personal effects, lay strewn around the wreck. Impact International media outlets immediately sought news and photos of the Titanic. Ballard and IFREMER fell into a dispute over releasing the expedition’s photographs, and the French declined to participate when Ballard returned to the Titanic in July, 1986, with J.J. (Jason Jr.), a smaller camera-equipped robot that could move independently while tethered to a submarine. Subsequent expeditions, including a 1987 effort by IFREMER in partnership with an American company unconnected to Ballard, recovered many artifacts from the site. Two of the ship’s safes were opened on live television on October 28, 1987, in a syndicated special hosted by Telly Savalas; the show was modeled after a similar special featuring Al Capone’s vault hosted by Geraldo Rivera. Although in 1985 Ballard supported salvage efforts, he later campaigned
to leave the Titanic undisturbed as a memorial to the dead. Further Reading
Ballard, Robert D., with Rick Archbold. The Discovery of the Titanic. Rev. ed. Toronto: Madison Press Books, 1995. Ballard, Robert D., with Michael S. Sweeney. Return to Titanic: A New Look at the World’s Most Famous Lost Ship. Washington, D.C.: National Geographic, 2004. Maureen Puffer-Rothenberg See also
Archaeology; Robots; Science and technology.
In July, 1986, Robert Ballard answers questions about his upcoming return voyage to excavate the wreck of the Titanic. Behind him is the ALVIN, a submersible capable of diving more than one mile below the ocean’s surface. (AP/Wide World Photos)
■ Torch Song Trilogy Identification Play and film Author Harvey Fierstein (1954) Date Play opened June 10, 1982; film released on
December 14, 1988 A Tony Award-winning play turned into a feature film, Fierstein's Torch Song Trilogy addressed gay themes in a way that had previously been taboo—in terms of love. It brought to the general population a representation of same-gendered relationships that looked very much like the heterosexual ones the world better understood. Torch Song Trilogy is a collection of three short plays—The International Stud, Fugue in a Nursery, and Widows and Children First!—that are produced together to form three stories from the life of Arnold Beckoff, a Jewish drag queen who lives in New York and favors singing torch songs. The play received much acclaim when it opened in 1982, including winning the Tony Award for Best Play in 1983. Much of the
play’s recognition resulted from the way Fierstein handled topics such as gay-bashing, drag, bisexuality, infidelity, adoption, and family dysfunction. Moreover, in addressing these issues, the play placed its central focus on homosexual characters, who were treated not as abnormal but rather as strong, fully realized protagonists dealing with many issues that were also acutely familiar to heterosexual audiences. That Fierstein himself was an openly gay writer and actor who unabashedly wrote with such open candor and then played in the starring role was of no small significance either. Following the show’s successful Broadway run, New Line Cinema asked Fierstein to adapt the fourhour play into a two-hour film script. Making extensive cuts, Fierstein created a film that still preserved the three distinct vignettes and met studio criteria. However, there was one hurdle to overcome; New Line would not support setting the story in its original time frame of the early 1980’s, indicating that with the rise of acquired immunodeficiency syndrome (AIDS) in the gay community, the story was not plausible if it did not address that subject. Fierstein wanted to focus on other themes, so the setting of the film was changed to the 1970’s, prior to the AIDS crisis. This change highlights another significant shift in gay representation of the 1980’s, as Fierstein made a move that many others within the gay community would not make for another decade: He saw AIDS as a world crisis and not as something that was only a gay issue or that had to dominate all gay discussions to the exclusion of other topics or other sexualities. Impact Torch Song Trilogy brought gay themes to the forefront of mainstream culture in a way that differed from other attempts; it did so by casting a positive light on the gay characters within the story. This shift created a larger discussion of equity within media representations of gay, lesbian, bisexual, and transgender characters and issues. Further Reading
Playwright Harvey Fierstein in 1982. (AP/Wide World Photos)
Busch, Charles. “Torch Song Trilogy.” Advocate 876 (November, 2002): 103-104. Duralde, Alonso. “This Torch Still Burns.” Advocate 917 (June, 2004): 194-195. Guernsey, Otis L., ed. The Best Plays of 1981-1982. New York: Dodd, Mead, 1983. Needham Yancey Gulley
See also ACT UP; AIDS epidemic; Broderick, Matthew; Film in the United States; Homosexuality and gay rights; Kiss of the Spider Woman; Theater.
■ Toronto bathhouse raids of 1981 The Event
Toronto police raid four gay bathhouses, arresting hundreds of gay men Date February 5, 1981 Place Toronto, Ontario The Toronto bathhouse raids galvanized the city’s gay community into action, marking a point after which they would no longer endure ill treatment by police or the wider community. The protests that followed came to be known as “Canada’s Stonewall.” At 11:00 p.m. on February 5, 1981, 150 Toronto police officers conducted simultaneous raids, code-named Operation Soap, on four Toronto bathhouses: the Club Baths, the Romans II Spa, the Richmond Street Health Emporium, and the Barracks. Police broke down doors and smashed windows to gain entry. Once inside, they heavily damaged each establishment; the Richmond Street Health Emporium never reopened. They terrorized and verbally abused the men found inside. The officers had removed their badge numbers prior to the raid, so individual officers could not be identified. In all, 253 gay men were arrested and charged as “found-ins,” 14 were charged with minor drug possession, and 20 additional men were charged with “keeping a common bawdy house.” The violence of the raid was unprecedented, and it represented the largest Canadian mass arrest since the invocation of the War Measures Act during the FLQ Crisis of October, 1970. The following night, more than three thousand gay men and lesbians filled the streets of downtown Toronto in a protest that lasted late into the night. In an angry and sometimes violent protest, the community demanded action. With the rallying cry of “No More Shit,” they made it clear police violence could no longer be used to intimidate the community. On February 20, over four thousand gay men and lesbians marched on the provincial legislature and then to Police Division 52, whose officers had led the raid. Demanding a public inquiry from the government, they also demanded that the Ontario Human Rights Code be amended to protect gay men and lesbians.
Impact In the aftermath of the raids, 249 of those arrested were found not guilty. The police rationale for the raids, namely that the baths were actually bawdy houses and provided on-site prostitution, was found to be baseless. The raids capped a long period of police harassment of the gay community. Officers repeatedly engaged in entrapment and arrested gay men in cruising areas, raided the Body Politic gay newspaper in December, 1977, and raided the Barracks bathhouse in December, 1978, when twenty-eight men were arrested. After years of police attempts to enforce their view of morality on Toronto’s growing gay community, the police raids of 1981 pushed relations between that community and the police to a breaking point. As a result, the community came together and organized in a manner never previously achieved, establishing a base from which it would go on to respond to an even greater crisis: the looming HIV/AIDS pandemic. Further Reading
The Canadian Gay and Lesbian Archives. http://www.clga.ca/. Kinsman, Gary. The Regulation of Desire: Homo and Hetero Sexualities. 2d ed. Montreal: Black Rose Books, 1996. McCaskell, Tim. "The Bath Raids and Gay Politics." In Social Movements/Social Change, edited by Frank Cunningham. Toronto: Between the Lines, 1988. Michael E. Graydon See also ACT UP; AIDS epidemic; Homosexuality and gay rights.
■ Tort reform movement Definition
Legislation passed by the majority of the states in order to alleviate the tort and insurance crises Date Mid- to late 1980’s The majority of the states enacted legislation that included caps on “pain and suffering” awards, limits on punitive damages, and modification of joint and several liability rules. Among the reasons cited for the tort reform movement was the increased size and number of tort awards in personal injury, medical malpractice, and product liability cases, causing insurance companies
to raise premiums to cover increased awards. The so-called insurance cycle began at the end of the 1960's, when the insurance industry's profitability was declining and when the doctrine of strict liability was expanded to include personal liability cases. (The doctrine originally applied solely to businesses conducting abnormally dangerous activities.) With that expansion, a plaintiff no longer had to prove negligence but merely had to prove that he or she was injured while using the product in the manner intended. In the mid-1970's, the slump in insurance industry activity brought insurance premium increases. In 1976, President Gerald R. Ford convened a White House conference on product liability, and in 1979, during President Jimmy Carter's administration, a special task force produced a model uniform product liability act for the states. The early 1980's saw the insurance industry booming again because of good investment income and high interest rates. Policies were underpriced to generate premiums for investments. By 1984, interest rates had fallen, as had insurance investment income and profits. Pressure to increase premiums came about because of higher tort damage awards. The jury awards were so high that companies as well as individuals who could not pay their premiums went bankrupt. Others "went bare"; that is, they operated without insurance, knowing that a large verdict against them could annihilate them. The Tort Policy Working Group, formed during the administration of President Ronald Reagan, released a report advocating tort reforms, including the elimination of joint and several liability, limits on contingent fees, and a $100,000 cap on noneconomic damages. Some Reforms Numerous states passed legislation that aimed at reducing huge jury awards, limiting the areas for which suits may be brought, and imposing caps on punitive damage awards (damages granted beyond compensatory damage awards, assessed as punishment for aggravated, wanton, reckless, or oppressive conduct and as a deterrent to others to prevent them from engaging in like conduct) either through a specific numerical cap or through limiting the circumstances under which punitive damage claims can be awarded. It had not been uncommon for juries to award actual damages of a few thousand dollars yet award millions of dollars in punitive damages. Additionally, amounts
assessed by juries to compensate for lost wages, medical payments, and the like (called “special damages”) made up a small part of many liability awards. Juries were likely to add on larger amounts for noneconomic damages, such as pain and suffering and loss of the ability to enjoy life (called “general damages”). In an effort to quell this trend, new legislation was sought. Tort reform involved putting limits on damage awards in malpractice, negligence, and personal injury cases. The legislation passed by the states was not uniform; not every reform was enacted in each state. In addition to imposing caps on punitive damage awards, limits were imposed for pain and suffering awards, and strict standards were adopted for proving liability for an accident or injury. In order to accomplish this, the joint and several liability principle was revised or abolished. Under the joint and several liability doctrine, two or more defendants are jointly (together) held financially responsible for a plaintiff’s injury, and each defendant severally (individually) may be held financially responsible for the full value of the harm. The most common form of joint and several liability reform was to limit its application when awarding general or noneconomic damages. Under this reform, defendants could be severally liable for economic damages but not for noneconomic damages. The intent of the reform was to provide assurance that injured plaintiffs are paid for their out-of-pocket expenses even if some defendants are insolvent. At the same time, this reform limited the “deep pocket” approach to awarding damages for noneconomic losses. Another pattern limited joint and several liability when the defendants are together less than 50 percent at fault or less at fault than the plaintiff. This reform aimed at “fairness”: limiting the number of situations in which a defendant who is only slightly at fault will have to pay for the entire amount or a large portion of damages. Generally, tort reform in this area was focused on financial responsibility in proportion to fault rather than on one’s ability to pay. Few, if any, of the reforms completely abolished joint and several liability. Of the thirty-three states that passed joint and several liability reform, only four completely eliminated the use of that doctrine. Some states enacted restrictions limiting frivolous lawsuits (in which there is no foundation for a liability claim); others instituted fines against attor-
neys who filed frivolous suits. Contingent fee arrangements were also limited so that attorneys would have less incentive to seek unusually large damages for clients. Impact Some of the measures passed, such as caps on damage awards, both compensatory and punitive, have modestly reduced lawsuits, damage awards, and liability insurance premiums. Most of the reforms, however, have had little or no effect. Research has indicated that publicity about the “litigation explosion” may be changing the attitudes of juries and judges toward plaintiffs in personal injury cases. Juries have become increasingly suspicious of plaintiffs in tort cases, and judges in product liability cases have curtailed some of the litigious policies of the 1960’s, 1970’s, or 1980’s, perhaps in reaction to publicity generated by tort reformers. Consequently, numbers of claims have dropped and plaintiffs’ awards have become more difficult to obtain. Further Reading
Burke, Thomas F. Lawyers, Lawsuits, and Legal Rights: The Battle over Litigation in American Society. Berkeley: University of California Press, 2002. Describes the policies that promote the use of litigation in resolving disputes and implementing public policy. Contains detailed endnotes and numerous scholarly references. Church, George J. “Sorry, Your Insurance Has Been Cancelled.” Time 127, no. 12 (March 24, 1986): 16-26. An overview of the scenarios leading to tort reform and several of the proposals for reform. Daniels, Stephen, and Joanne Martin. Civil Juries and the Politics of Reform. Evanston, Ill.: Northwestern University Press, 1995. Analysis of patterns of jury verdicts in areas such as medical malpractice, product liability, and punitive damages within the context of the larger political and academic debate over tort reform. Depperschmidt, Thomas O. “State Tort and Insurance Reform: The Net Result of Two Years Effort.” Journal of Forensic Economics 2, no. 1 (1989): 23-46. A well-researched scholarly article on tort reform, with numerous references. Lee, Han-Duck, Mark J. Browne, and Joan T. Schmit. “How Does Joint and Several Tort Reform Affect the Rate of Tort Filings? Evidence from the State Courts.” The Journal of Risk and Insurance 61, no. 2
(1994): 295-316. A scholarly article emphasizing the doctrine of joint and several liability in tort reform. Marcia J. Weiss See also
Business and the economy in the United States; Crime.
■ Tower Commission Identification
The board appointed by U.S. president Ronald Reagan to investigate the Iran-Contra affair
As the first official inquiry into the Iran-Contra affair, the Tower Commission uncovered the basic facts of the arms sales to Iran and the diversion of proceeds to the Nicaraguan Contras and reported these facts candidly to Reagan and the American public. U.S. foreign policy in the 1980's faced twin threats from civil wars in Central America and growing terrorism in the Middle East. In June, 1986, media reports began to appear about allegations of U.S. aid to the right-wing Contra guerrillas operating in Nicaragua against the left-wing Sandinista government. In October, 1986, attention focused in particular on an American crew member who was captured by the Sandinistas after his plane was shot down during a supply mission to the Contras. This incident in turn touched off a crisis for the Ronald Reagan administration because of the possibility that the U.S. government had been providing aid to the Contras in contravention of the Boland Amendment passed by Congress to ban such assistance. Aides of Attorney General Edwin Meese III visited the offices of the National Security Council (NSC) and discovered a memorandum that confirmed that there had been arms sales to Iran and that some of the proceeds had been diverted to the Contras. On November 24, 1986, Meese informed President Reagan of the diversion. Not knowing the full extent of the actions taken by his NSC staff, and under pressure to furnish answers to Congress and the media, Reagan appointed on November 26, 1986, a three-member commission chaired by former senator John Tower and including former secretary of state Edmund Muskie and retired Air Force lieutenant general Brent Scowcroft. Initially given a mandate to complete its work in just sixty days, the commission
assembled a staff of twenty-three employees and began searching for evidence. In addition to investigating the Iran-Contra affair, the commission recruited outside experts to conduct twelve studies of how the NSC had performed in crises dating back to the presidency of Harry S. Truman. With the notable exceptions of the president’s national security adviser, Navy vice admiral John Poindexter, and his assistant, Marine lieutenant colonel Oliver North, both of whom faced possible prosecution, the commission completed interviews of more than fifty individuals, including principals in the Iran-Contra operation, the three living former presidents, various past presidential advisers, President Reagan, and Vice President George H. W. Bush. Two weeks before its mandate was to expire, the commission made the important discovery that hundreds of backup copies of deleted electronic messages remained in the NSC’s computer system, and these “PROF notes” allowed the
commission to solidify and add credibility to its conclusions. After receiving two time extensions, the commission presented its report to Reagan and the public on February 26, 1987. The Commission’s Findings
The Tower Commission’s report presented a detailed account of the six arms deliveries that the United States made to Iran, for which Iran paid $48 million, and traced how some of the proceeds, along with money raised from donors in the United States and foreign countries, was used to fund and supply the Contras in Nicaragua. The commission drew the conclusion that the structure of the NSC was sound but that members of its staff had been allowed to function too independently and to usurp the role normally served by the Central Intelligence Agency in conducting covert operations. Finding that the motive for arms sales to Iran was to gain release of several Americans being held hostage in Lebanon, the commission con-
Members of the Tower Commission take questions from the press on February 26, 1987. From left: Edmund Muskie, John Tower, and Brent Scowcroft. (AP/Wide World Photos)
978
■
The Eighties in America
Tower Commission
cluded that a strategic opening to Iran was a worthwhile objective, but that the U.S. government should not have engaged in an arms-for-hostages deal when it ran against its own policy of refusing to deal with terrorists and because its actions might serve as an incentive for further kidnappings. Concerning aid to the Contras, the commission was undecided as to whether the congressional ban applied to the NSC staff, but it questioned the aid on the grounds that, if disclosed, it could jeopardize the Reagan administration’s pro-Contra position. The commission criticized the NSC staff members responsible for the Iran-Contra affair for running a highly unprofessional operation and also cast an unfavorable light on Reagan, who, while he had not been involved in any effort to cover up the facts, had failed to exercise sufficient care in overseeing the implementation of U.S. foreign policy.
The Tower Report In its report on the Iran-Contra affair, the Tower Commission cited a "failure of responsibility" on the part of the principal members of the National Security Council (NSC), as well as of President Ronald Reagan, as set forth in this excerpt: The NSC system will not work unless the President makes it work. . . . By his own account, as evidenced in his diary notes, and as conveyed to the Board [Tower Commission] by his principal advisors, President Reagan was deeply committed to securing the release of the hostages. It was this intense compassion for the hostages that appeared to motivate his steadfast support of the Iran initiative, even in the face of opposition from his Secretaries of State and Defense. In his obvious commitment, the President appears to have proceeded with a concept of the initiative that was not accurately reflected in the reality of the operation. The President did not seem to be aware of the way in which the operation was implemented and the full consequences of U.S. participation. . . . The President's management style is to put the principal responsibility for policy review and implementation on the shoulders of his advisors. Nevertheless, with such a complex, high-risk operation and so much at stake, the President should have insured that the NSC system did not fail him. He did not force his policy to undergo the most critical review of which the NSC participants and the process were capable. At no time did he insist upon accountability and performance review. Had the President chosen to drive the NSC system, the outcome could well have been different.
Impact The Iran-Contra affair was the crowning government scandal of the 1980’s and, for a time, appeared to threaten Reagan’s political future. Had he been forced from the presidency, or even remained under a lingering cloud of suspicion, especially just a decade after President Richard M. Nixon’s resignation in the wake of the Watergate scandal, the American political scene could have faced a very uncertain future. The Tower Commission accomplished the difficult feat of conducting both a speedy and evenhanded investigation that, while shedding an unflattering light on Reagan’s administrative style, pinpointed the principals behind the Iran-Contra affair at the staff level of the National Security Council. The commission built a solid foundation for the lengthier congressional and criminal investigations that followed it and that produced a fuller and more detailed picture of the Iran-Contra affair. Further Reading
Draper, Theodore. A Very Thin Line: The Iran-Contra Affairs. New York: Hill & Wang, 1991. One of the
most detailed and comprehensive histories of the Iran-Contra affair. Tower, John. Consequences: A Personal and Political Memoir. Boston: Little, Brown, 1991. Includes a chapter on the author’s experiences as chair of the Tower Commission. Tower, John G., Edmund S. Muskie, and Brent Scowcroft. The Tower Commission Report: The Full Text of the President’s Special Review Board. New York: Bantam Books, 1987. The report of the Tower Commission as released to the public. Walsh, Lawrence. Firewall: The Iran-Contra Conspiracy and Cover-Up. New York: W. W. Norton, 1997. The
story of Iran-Contra from the point of view of the independent counsel who conducted the criminal investigation from 1986 to 1993. Larry Haapanen See also Foreign policy of the United States; Iran-Contra affair; North, Oliver; Poindexter, John; Reagan, Ronald; Reagan Doctrine.
■ Toys and games Definition
Recreational products, introduced in the 1980’s, that had major economic and/or cultural significance
The 1980’s saw a series of new toy and game brands that had unprecedented popularity, many becoming cultural icons. Several factors contributed to the toy industry of the 1980’s. New technologies allowed for the development of a wide range of electronic toys. The oil crisis of the late 1970’s caused an increase in the cost of plastic, which, in turn, inspired toy manufacturers to pursue less expensive ways of making plastic toys as well as to increase the use of other materials, such as die-cast metal. These experimentations with new technologies and materials led to new product designs and gimmicks. Meanwhile, many companies tried to imitate the success that Kenner had with its toys based on the Star Wars films from 1977 to 1983, using elements of the company’s formula: action figures with vehicles and play sets, figures that were based on “good versus evil” archetypes, and cross-merchandising. This trend was compounded when the Federal Communications Commission (FCC) lifted its restrictions on tie-ins between children’s programs and toys. The rise of national retailers such as Toys “R” Us and Wal-Mart created opportunities to sell toys year round, not just on holidays and birthdays. Cable television broadened opportunities for advertising, allowing fads to spread more quickly than before. This led to the creation of an annual tradition: the “must have” toy that parents were expected to buy for Christmas, leading to long lines and waiting lists at department stores. Games The first fad toy to hit America in the 1980’s was the Rubik’s Cube, a puzzle that had been
invented in 1974. The puzzle was a cube with nine colored squares on each side, in six different colors. The objective of the puzzle was to get all the same color on each side. Consumers of all ages spent hours trying to solve the puzzle. At the height of the fad, there were books being published about how to solve the cube as well as similar puzzles and replacement stickers to put on an unsolved cube. The other craze in 1980's games was trivia. While television game shows were in decline in general, syndicated evening versions of Jeopardy! and Wheel of Fortune became national phenomena, spinning off various board game and electronic versions. Trivial Pursuit was released in 1982 and had become a fad by 1984, spinning off dozens of sequels and variations. It was an easy game for large groups of people to play at parties, and similar kinds of "guessing games" came to be released by the end of the decade, such as Pictionary (and its television game show spin-off Win, Lose or Draw), Scattergories, and Outburst. Electronics The early 1980's saw the rise of popular video games, such as Pac-Man and Donkey Kong, a new medium that changed the cultural landscape. Children, teenagers, and adults alike would gather around home computers and home video game consoles. Stand-alone arcade machines became fixtures at restaurants and other gathering places. Video games would increase in popularity and technology throughout the 1980's and beyond, eventually overwhelming the market share of traditional toys. Advances in electronics created more than video games. Electronic games such as Simon were released in the late 1970's, but their popularity and proliferation grew in the 1980's. While remote-controlled and radio-controlled toy vehicles had been around for decades, refined technologies led to a new wave of popularity in the 1980's, and many other toy lines began incorporating motorized components. Meanwhile, some companies began to produce dolls, action figures, and vehicles with built-in voice recordings. Other toy lines capitalized on advances in laser and infrared technology to make a new kind of toy gun: Lazer Tag and Photon lines were games in which players used special light guns and "body armor" to shoot at each other and score points based on hits recorded by the armor. A similar idea was attempted, unsuccessfully, in Captain Power, an action
figure line in which the toy guns and vehicles could be used by children to interact with the television series. In 1982, Hasbro's Playskool division released Glo Worm, a plush doll that glowed when squeezed, providing young children with a combined comfort toy and night light. In 1985, Hasbro released the My Buddy and Kid Sister dolls, which were partially motorized to serve as imaginary friends. While the line was not successful, its aggressive advertising campaign created a popular jingle. Far more successful was Teddy Ruxpin, an animatronic teddy bear with a built-in tape player that "told stories" by moving its mouth and eyes. Teddy Ruxpin pioneered a whole new area for toys and novelties in the following decades. Action Figures
The Masters of the Universe were featured in their own half-hour-long children's cartoon show, which drove sales of the action figures. (Hulton Archive/Getty Images)
Mattel and Hasbro both found huge success with their attempts to jump on the Star Wars bandwagon. Novels and films in the “sword and sorcery” subgenre had been popular at the time.
While the film Conan the Barbarian was not released until 1982, Mattel released a toy line in 1981 that was loosely based on characters and concepts from the novel. Masters of the Universe, first marketed in a comic book by DC Comics, featured the war between the warrior He-Man and the evil sorcerer Skeletor. The franchise’s first few action figures were released in 1981, and the line came out in full force in 1982, but the concepts would be tweaked several times before the line finally became a huge hit. In 1983, the FCC lifted a long-standing rule forbidding connections between cartoons and toy lines. As a result, toy manufacturers began producing cartoons based on their latest lines. When Mattel went to Filmation, one of the leading animation companies of the time, to make Masters of the Universe into a cartoon, Filmation took the unprecedented route of putting the cartoon in first-run syndication and airing the new episodes on weekday afternoons, rather than on Saturdays. The move paid off. The toys became a top-selling brand for several years and
inspired a spin-off line for girls, She-Ra: Princess of Power, featuring He-Man's sister. An ill-fated live-action movie released in 1987 signaled the decline of the brand, and an attempted revival and repositioning in 1989 failed. However, Masters of the Universe would continue to enjoy international popularity and a strong fan base.

Similarly, in 1982, Hasbro reinvented its G.I. Joe brand, which had been out of production for several years, as a direct competitor to Star Wars. Reducing the 12-inch action figures to the same size as the 3.75-inch Star Wars figures, Hasbro introduced a similar line of vehicles and play sets, an "evil empire" enemy, and personalities (previously, the G.I. Joe toys had been anonymous soldiers). The line was designed in conjunction with Marvel Comics and with an animated advertising campaign by Sunbow Productions. The toy line and the comic book were issued on the same day in September, 1982. Both sold out within a few days, and the comic became one of the most successful titles of the 1980's. An animated series, produced by Sunbow, premiered in 1983, establishing G.I. Joe: A Real American Hero as a perennial brand for Hasbro.

A major trend in American toys was imported from Japan. Several Japanese toy companies had produced toy lines featuring robots that changed into other forms. These toy lines coincided with the popularity of anime in Japan and were imported to the United States in the mid-1980's. Many of the lines introduced in the United States were pastiches of Japanese toys and cartoons licensed and then rebranded by American companies. Lines included such brands as Voltron, Gobots, and Robotech, but by far the most successful of the "transforming robots" brands, in fact the brand that gave the category its name, was Hasbro's Transformers, introduced in 1984. Hasbro licensed the designs for various robot-themed toys from Takara, a Japanese company that had originally licensed the G.I. Joe brand from Hasbro in the 1960's. Using the same formula that had succeeded with the new G.I. Joe toys, Hasbro cross-promoted its Transformers line with Marvel Comics comic books and a Sunbow cartoon. Eager for new products, Hasbro also licensed robot toys from other Japanese manufacturers, but those designs led to copyright issues when those companies' brands came to the United States.
The Transformers line was so successful that Takara canceled its original lines and bought the Transformers concept from Hasbro. The line was an international hit, and, even when one of the companies put the line on hiatus, Transformers was continually produced by either Takara or Hasbro for more than two decades. The line would see a renewal in 2007 with a major motion picture that became one of the highest-grossing films of its year.

As the Star Wars line faded with the release of Return of the Jedi (1983) and as the new brands from Hasbro and Mattel featured various action gimmicks, Kenner needed a new brand to regain its market share. In 1984, Kenner licensed the rights to make action figures based on DC Comics's superheroes and had great success with the Super Powers line. Another success for Kenner was M.A.S.K., a series about a team of heroes, similar to G.I. Joe, that drove shape-changing vehicles and battled an evil organization. However, neither line quite captured the market the way Hasbro and Mattel had done. The Ghostbusters (1984) movie was spun off into both a successful toy line and a cartoon named The Real Ghostbusters (to distinguish it from the similarly named television series Ghost Busters, a Filmation property that predated the film and was made into a competing toy line). Other movies and television series turned into action figures included The A-Team, Knight Rider, and the World Wrestling Federation. In 1987, Playmates issued a line of toys based on the Teenage Mutant Ninja Turtles comic book and cartoon series. The franchise would be hugely successful, sparking various movies, cartoon series, and revival toy lines. The final big toy line of the decade was Galoob's Micro Machines, a line of toy cars introduced in 1989. The toy cars were smaller than traditional Hot Wheels and Matchbox cars and featured elaborate play sets.

Dolls Perhaps the biggest toy craze of the 1980's was none of the various action figure lines but rather a line of dolls marketed to girls. The Cabbage Patch Kids were a line of unusual dolls designed by Xavier Roberts and first mass-marketed by Coleco in 1982. The gimmick was that each doll was presented not merely as a toy but as an adopted child, complete with an adoption certificate and the conceit that each doll was as unique as a child. The Cabbage Patch Kids became the first "must have" Christmas toy. Parents waited in long lines to try to obtain the
dolls, inspiring marketers and toy companies to focus on creating the next big "craze." The dolls generated more than $2 billion in 1984 alone. Also in 1982, Hasbro introduced My Little Pony, a line of cute, somewhat anthropomorphic, multicolored toy ponies that ranked with Transformers and G.I. Joe in success and, for a time, overtook Mattel's Barbie as the top girls' toy brand. Hasbro also enjoyed great success with Jem, a line of dolls released in 1985 with an accompanying Sunbow cartoon, featuring the adventures of an all-girl rock band. In the early 1980's, Kenner had success in the girls' and young children's markets with toys based on characters from American Greetings's greeting cards, including the Care Bears and Strawberry Shortcake. Mattel and Hallmark responded with Rainbow Brite, a billion-dollar franchise at its peak. Tonka's Pound Puppies, released in 1985, applied the Cabbage Patch Kids concept to stuffed animals, presenting its characters as rescue dogs with supplies and adoption certificates.

Impact Many of the toy franchises introduced in the 1980's became cultural icons. As children who grew up in the 1980's became the teenagers and young adults of the Internet revolution in the late 1990's, they reached out to one another online, starting Web sites and online discussion groups to share their continued love for their favorite childhood toys and entertainment franchises. This led to many of these toys being revived in the early twenty-first century as part of a nostalgia movement. While some lines were revived only briefly and mostly unsuccessfully, certain brands from the 1980's, such as the Transformers, Cabbage Patch Kids, Trivial Pursuit, and My Little Pony, have demonstrated perennial, cross-generational, and international appeal, making them indelible landmarks of American culture. Further Reading
Miller, G. Wayne. Toy Wars: The Epic Struggle Between G.I. Joe, Barbie, and the Companies That Make Them. New York: Crown, 1998. The story of how Mattel and Hasbro dominated the toy industry and then bought up most of its competition throughout the 1980's and 1990's. Santelmo, Vincent. The Complete Encyclopedia to G.I. Joe. 3d ed. Iola, Wis.: Krause, 2001. Covers the G.I. Joe brand from 1964 to 2000.
Sweet, Roger. Mastering the Universe: He-Man and the Rise and Fall of a Billion-Dollar Idea. Cincinnati: Emmis Books, 2005. The man who originally designed He-Man for Mattel chronicles the history of the toy line, its success, and its failure. Walsh, Tim. Timeless Toys: Classic Toys and the Playmakers Who Created Them. Kansas City, Mo.: Andrews McMeel, 2005. A history of various "classic" toys, from Slinky to Trivial Pursuit. John C. Hathaway See also Advertising; Cabbage Patch Kids; Consumerism; Ghostbusters; Hobbies and recreation; Pac-Man; Trivial Pursuit; Video games and arcades.
■ Transplantation Definition
Transferring tissues or organs from one person to another
The 1970’s had represented a period of significant change in procurement of organs for transplantation. For instance, the concept of “brain death” allowed for a larger source of tissues and organs, rather than using cadavers as the sole source. In the 1980’s, a series of acts were passed by Congress that established a network for procuring organs and further refined the procedures begun in the previous decades. The necessity for genetic matching had likewise been a problem, one which invariably resulted in rejection if the donor and recipient were genetically distinct. The approval of cyclosporine, the first of the major antirejection drugs, allowed for greater leeway in addressing the problem of matching. Serious attempts at successful organ transplantation date as far back as the early 1900’s, but it was only with the development of immunogenetics and the discovery of tissue-associated proteins known as histocompatibility antigens in the 1930’s that it became possible to understand the science of rejection. In the 1950’s, the first attempts to transplant tissues was carried out between identical twins. In the following decade, transplantation of other organs was attempted, including the heart and lungs, but with only limited success. Antirejection Drugs One of the greatest problems associated with transplantation between donors and recipients was the likelihood of rapid rejection if the individuals did not genetically match. The ear-
The earlier generation of antirejection drugs had initially been developed for use in cancer chemotherapy and were too toxic for more than limited application. In 1983, cyclosporine was approved for use by the Food and Drug Administration (FDA). Discovered in the 1970's, cyclosporine represented the first antirejection drug with wide application in the transplantation field because of its specificity and limited toxicity. The success rate for kidney and other organ transplants showed an immediate increase.

Maintaining organs after their removal from the donor but prior to implantation in a recipient was also a problem. Cold saline solutions had been helpful. Researchers at the University of Wisconsin developed an improved isotonic solution, known as UW solution, that was shown to be more effective than other solutions in maintaining organs in a viable state. Marketed as Viaspan, the solution received FDA approval in 1988 for preservation of donated livers. Viaspan subsequently was found to be equally effective for preservation of other organs and tissues.

Establishment of Transplant Networks The Uniform Anatomical Gift Act of 1968 established the use of a "donor card" as a means to allow a person or family to request organ donation upon death; the definition of "death" was later clarified (1978) to mean "brain death." However, a legitimate concern was whether this law would result in a "market" for organs, one producing both a "seller" and a recipient, or "buyer." In 1984, Congress passed the National Organ Transplant Act, which prohibited the sale of organs for use in transplantation. The result was the Organ Procurement and Transplantation Network, which two years later was placed under the auspices of the United Network for Organ Sharing (UNOS). Among the duties of UNOS was the maintenance of a national registry of potential recipients, the number of which subsequently grew to nearly 100,000 persons within two decades. No cost for the organ was to be passed to the recipient. In 1986, a "routine request" law required hospitals to discuss organ transplantation with families under appropriate circumstances.

Impact During the 1980's, a number of developments in organ transplantation allowed for not only a greater increase in such procedures but also (coupled with improvements in medical technology) a much wider range in the types of surgeries that
could be safely carried out. The first cornea transplant, one of the earliest types of transplant surgery, was successfully carried out in 1905; kidney transplantations, first between twins and then between unrelated individuals, had been carried out since the 1950's. Nobel Prizes in Physiology or Medicine were awarded in 1990 to two of the pioneers in this area, Joseph Murray and E. Donnall Thomas. The 1980's saw the transplantation of organs previously thought to be too complex for such procedures, such as the lungs or liver. Even heart transplants became more common, though hardly yet routine. Further Reading
Brent, Leslie. A History of Transplantation Immunology. San Diego, Calif.: Academic Press, 1997. Focuses on the early (pre-1980) history of the subject. The impact of immunosuppressive drugs such as cyclosporine is described. Murphy, Kenneth. Janeway's Immunobiology. 7th ed. New York: Garland Science, 2007. Classic textbook on the subject of immunology. An extensive portion addresses transplantation genetics and immunity. Roitt, Ivan, et al. Roitt's Essential Immunology. 11th ed. Malden, Mass.: Blackwell Science, 2006. Several chapters in this immunology textbook address transplantation. Includes a large number of photographs. Veatch, Robert. Transplantation Ethics. Washington, D.C.: Georgetown University Press, 2000. Addresses issues such as the definition of death, sources of organ procurement, and questions related to organ allocation. Richard Adler See also
Cancer research; Fetal medicine; Genetics research; Medicine.
■ Trivial Pursuit Definition
Trivia board game
A serious fad among baby boomers in the early 1980’s, Trivial Pursuit helped revive sales of adult-oriented board games and was a good example of the use of word-of-mouth advertising. Created by two Canadian reporters in the 1970’s, Trivial Pursuit was first made available for sale in the
United States in 1982. The company that purchased the game, Selchow and Righter, employed a word-of-mouth advertising strategy, sending copies of the game to people in the entertainment industry, radio personalities, and toy buyers at the 1983 New York Toy Fair. This promotion strategy led to 1.3 million copies of the game being sold in 1983; the company's goal had been to sell 300,000 copies. By 1984, Trivial Pursuit had become a fad similar to the Cabbage Patch Kids, with copies of the game selling out as quickly as they hit the shelves. An estimated 20 million copies of the game were sold in 1984. The New York Times that year published several stories about Trivial Pursuit parties that lasted well into the night, along with confessions from "Trivial Pursuit addicts." In 1984, thanks to the popularity of the game, sales of adult board games reached $777 million. In 1986, it was estimated that one in five families in the United States owned the game.

Impact Numerous articles were written speculating about Trivial Pursuit's popularity, with popular
culture critic Jack Santino suggesting in 1985 that baby boomers' devotion to the game was a result of their generation "developing a nostalgia for a shared era." Strong sales continued throughout the decade as new versions of the game, such as Silver Screen, Junior, Genus II, and Baby Boomer, were introduced. Similar games were produced by other board-game companies in an attempt to cash in on the fad, and there was an increase in the number of adult board games being introduced to the market overall. Further Reading
Dougherty, Philip. "Trivial Pursuit Campaign." The New York Times, July 17, 1984, p. D19. Santino, Jack. "From Jogging to Trivia Games, Fads Create Status." U.S. News and World Report, February 11, 1985. "Seeking Board Game Bonanza." The New York Times, December 30, 1986, p. D1. Wulffson, Don L. Toys! Amazing Stories Behind Some Great Inventions. New York: Henry Holt, 2000. Julie Elliott
Students at St. Vincent College in Latrobe, Pennsylvania, play Trivial Pursuit on an auditorium-sized version of the game on October 17, 1984. (AP/Wide World Photos)
The Pursuit of Trivia

In the original Genus edition of Trivial Pursuit, players had to correctly answer questions in six categories: geography, entertainment, history, arts and literature, science and nature, and sports and leisure. The questions were written on a set of cards, with the questions on one side of the card and the answers on the other side. Below are the questions and answers on one of the game's cards.

Geography: What Rocky Mountain ridge separates North America's eastward- and westward-flowing rivers? (The Continental Divide)
Entertainment: What was Stanley Kubrick's first film after 2001: A Space Odyssey? (A Clockwork Orange)
History: What was the nineteenth-century term used by the United States to justify expansion? (Manifest Destiny)
Arts and Literature: What Samuel Taylor Coleridge poem tells of a sailor who kills an albatross? (The Rime of the Ancient Mariner)
Science and Nature: How many grams make up a dekagram? (Ten)
Sports and Leisure: What do you call the playing pieces in dominoes? (Bones)
See also
Advertising; Cabbage Patch Kids; Fads; Hobbies and recreation; Toys and games.
■ Tron
Identification Science-fiction film
Director Steven Lisberger (1951-    )
Date Released July 9, 1982
One of the first feature films to make extensive use of computer-generated imagery, Tron also combined an innovative technique called backlight compositing with footage of live actors, creating a visually exciting virtual computer environment for the big screen.

Released in 1982 by the Walt Disney Company, Tron stars Jeff Bridges as Kevin Flynn, a talented programmer whose video game inventions have been stolen by an unscrupulous corporate executive named Dillinger (David Warner). Flynn asks Alan (Bruce Boxleitner) and Lora (Cindy Morgan), who both work at Dillinger's company Encom, for help in proving the theft, but the sinister Master Control Program (MCP) "digitizes" Flynn and transports him into the virtual computer environment over which it rules.
Disoriented, Flynn is shocked to find that computer programs are not just lines of code but rather individual alter egos of the users who create them. Tron, for instance, is Alan's counterpart, a heroic program who defies the MCP in its efforts to wipe out programs' belief in the users. The MCP forces programs to play games to the death in a gladiator-like environment (represented as the "reality" behind computer video games) and intends to make Flynn play on the grid until he dies, but Flynn, Tron, and another program named Ram escape the grid and mount an assault on the MCP with the help of Yori, Lora's alter-ego program. Ultimately, the good guys destroy the MCP, restoring open communication between programs and their users.

Tron represents one of the earliest attempts to portray cyberspace visually, as an inhabitable "virtual reality" somehow located inside and between computers. Tron was written by its director, Steven Lisberger, and its plot grew out of the visuals that Lisberger hoped to create, rather than the other way around. Several computer and special-effects companies contributed to the film, and part of the shooting took place at the Lawrence Livermore National Laboratory in California, lending authenticity to many of the film's "real world" sequences.
Although Tron's story line is somewhat simplistic, the idea of individuals battling a multinational corporation for free and open access to computer information was popular among audiences. A Tron video arcade game, which included four subgames based on sequences in the film, was introduced in the same year and became quite profitable. Later, computer-game film tie-ins would become de rigueur for almost any action or science-fiction film. Impact
Although many critics felt that Tron embraced visual style over substance, the film quickly gained a cult following among viewers of the video game generation. In addition, Tron's relatively extensive use of computer-generated imagery (CGI) hinted at the vast potential for computer technology to be used in filmmaking and helped pave the way for future breakthroughs in this area. In 2002, a twentieth anniversary collector's edition DVD was released that included extensive features about the making of the movie.
Further Reading
Bankston, Douglas. “Wrap Shot.” American Cinematographer 84 (June, 2003): 136. Bonifer, Michael. The Art of Tron. New York: Simon & Schuster, 1982. Glass, Fred. “Sign of the Times: The Computer as Character in Tron, War Games, and Superman III.” Film Quarterly 38 (Winter, 1984/1985): 16-27. Amy Sisson See also
Blade Runner; Computers; Cyberpunk literature; Film in the United States; Gibson, William; Science-fiction films; Special effects; Video games and arcades; Virtual reality.
■ Trudeau, Pierre Identification
Canadian prime minister, 1968-1979 and 1980-1984
Born October 18, 1919; Montreal, Quebec, Canada
Died September 28, 2000; Montreal, Quebec, Canada

The final part of Trudeau's political career was particularly significant, as his administration dealt with a major economic downturn, a separatist movement in the province of Quebec, increasing alienation among the western provinces, and major constitutional reform. Trudeau's impact, however, extended to the international scene as he initiated an ultimately unsuccessful peace effort in the months before his retirement.
Prime Minister Pierre Trudeau, around 1980. (Library and Archives Canada)
Pierre Trudeau’s political career appeared to end in 1979 when he and his government were voted out of office. He even announced his retirement, although he had not followed through on this when the Progressive Conservative government of Prime Minister Joe Clark called an election for February, 1980. Trudeau led the Liberal campaign to victory and a majority government. The Domestic Agenda Once back in office, Trudeau and his government quickly had to face a referendum in the province of Quebec, as its sovereignist government under the leadership of Premier René Lévesque sought a mandate to begin the process that would ultimately lead to a form of independence for the province. Trudeau, a strong federalist, had long opposed Quebec nationalism, and he and his government made a concerted effort at defeating the referendum, which they did decisively. Part of Trudeau’s campaign involved the promise of constitutional reform. He followed through on this in 1982, when the Canadian constitution was brought
back to Canada from the United Kingdom and, in the process, amended. The major amendment involved the creation of a Canadian Charter of Rights and Freedoms that guaranteed individual rights for Canadians, a strong Trudeau principle. The reform did not occur without opposition. Several provincial governments, most notably that of Quebec, strongly opposed the change. In the end, Trudeau won them all over, with the exception of Quebec, which remained opposed to the new constitution.

Other domestic issues bedeviled Trudeau. His government appeared incapable of addressing a major economic recession that saw high levels of unemployment, inflation, and interest rates. Energy was another major issue, and Trudeau's government, in an effort to assert federal control over the sector, introduced the National Energy Program (NEP). The program proved unpopular both in the province of Alberta, Canada's leading oil producer, and with the U.S. presidential administration of Ronald Reagan. A decline in the price of oil that followed the NEP led some in Alberta to blame the setback on the NEP and the Trudeau government. Trudeau became widely despised in parts of western Canada, a region that he later admitted to never having fully understood.

The Peace Mission and Retirement Suffering unpopularity at home and facing the prospect of another election or retirement, Trudeau turned to international policy in apparent pursuit of a legacy. The first half of the 1980's was a period of heightened Cold War tensions between the Soviet Union and the United States. Trudeau was personally concerned by the hard-line stance taken by the Reagan administration toward the Soviets. At a G7 summit in March, 1983, he repeatedly clashed with Reagan and British prime minister Margaret Thatcher. Trudeau also found himself at odds with Washington over the reaction to the Soviet downing of a Korean Air Lines passenger plane in September, 1983. Seeking his own path, in October, 1983, Trudeau initiated a peace mission to improve relations between the two superpowers. He began traveling the globe and conducting a series of meetings with world leaders, including Chinese leader Deng Xiaoping. His effort was greeted with cynicism back in Canada, and the Reagan administration was adamantly opposed to his mission. In the end, nothing
came of his travels. By February, 1984, he had had enough. After a long walk in the midst of a snowstorm, he announced his retirement. Before leaving office, however, he rewarded several of his loyal supporters with patronage appointments. These arrangements, combined with a poor record of handling the economy over the previous four years, left his Liberal Party in a weak position when it sought reelection under a new leader, John Turner, in September, 1984. The Progressive Conservatives, led by Brian Mulroney, crushed the Liberals, and Trudeau quickly disappeared from public life. Occasionally, he would reemerge in subsequent years to make public comments, most notably in 1990, when he successfully opposed the Meech Lake Accord, an ultimately failed effort by the Mulroney government to amend the Canadian constitution and, in the process, bring the province of Quebec back into the constitutional fold.

Impact Always a colorful character, Trudeau in the 1980's had a decidedly mixed record as Canadian prime minister. The high point of his time in office was the repatriation of the Canadian constitution and the creation of the Charter of Rights and Freedoms, Trudeau's ultimate legacy. On the other hand, his government presided over the increasing alienation of Quebec and western Canada while appearing unable to cope with and fully understand the economic malaise that gripped Canada in these years. Further Reading
Bliss, Michael. Right Honourable Men: The Descent of Canadian Politics from Macdonald to Chrétien. Toronto: HarperCollins Canada, 2004. A collection of short biographies of Canadian prime ministers, including Trudeau. Clarkson, Stephen, and Christina McCall. Trudeau and Our Times. Vol. 2. Toronto: McClelland & Stewart, 1997. A detailed study of the political career of Trudeau, with particular attention to the 1980's. Martin, Lawrence. The Presidents and the Prime Ministers: Washington and Ottawa Face to Face—The Myth of Bilateral Bliss, 1867-1982. Toronto: Doubleday Canada, 1982. A history of relations between prime ministers and presidents, including a brief section about Trudeau and Reagan. Simpson, Jeffrey. Discipline of Power. Toronto: University of Toronto Press, 1980. An award-winning study of the short-lived government of Clark and
Trudeau’s return to power in February, 1980. Thompson, John Herd, and Stephen J. Randall. Canada and the United States: Ambivalent Allies. Montreal: McGill-Queen’s University Press, 2002. A strong history of relations between Canada and the United States, including during the tenure of Trudeau. Steve Hewitt See also Business and the economy in Canada; Canada Act of 1982; Canada and the United States; Canadian Charter of Rights and Freedoms; Elections in Canada; Foreign policy of Canada; Inflation in Canada; Lévesque, René; National Energy Program (NEP); Quebec referendum of 1980; Reagan, Ronald; Soviet Union and North America; Strategic Defense Initiative (SDI); Turner, John; Unemployment in Canada.
■ Turner, John Identification Prime minister of Canada in 1984 Born June 7, 1929; Richmond, Surrey, England
The Liberal Turner was seen as a more moderate and business-friendly successor to Prime Minister Pierre Trudeau, but his party was swept from power less than three months after he became prime minister, and Conservative Brian Mulroney assumed the ministry.

Long-serving Liberal Canadian prime minister Pierre Trudeau had announced his retirement from politics in 1979, after his party was defeated at the polls by the Conservative Party. He returned, however, in February, 1980, to lead his party in defeating the Conservatives, headed by Prime Minister Joe Clark, and to reassume the ministry. In 1984, though, Trudeau felt ready to retire for good, so the Liberals had to elect a new leader. The two main contenders for the post were John Turner, a former finance minister who had spent the previous nine years working on Bay Street—the Canadian equivalent of Wall Street—in Toronto, and Jean Chrétien, a working-class Quebecer who ran a more populist campaign. Turner won the support of his party and accordingly became prime minister at the end of June, 1984. Turner was widely seen as more pragmatic than the idealistic Trudeau, bringing the atmosphere of the business world into Parliament. One Canadian radio commentator christened
Turner’s government “Boys Town on the Rideau.” (The Rideau is the canal that runs through Ottawa, right by Parliament Hill.) Turner faced fundamental problems, however. The Liberal Party was unpopular, and he had little time to renovate it before the next election. Furthermore, Trudeau had made large-scale patronage appointments before leaving office, and Turner, in turn, made even more such appointments once he assumed power. Turner seemed unable to decide whether to assume Trudeau’s mantle or to jettison it. This perceived vacillation hurt Turner, especially in Quebec, the traditional stronghold of the Liberals. Despite his distinguished appearance, Turner came across as stiff and conventional on the campaign trial. Unusually, Conservative leader Brian Mulroney was himself a Quebecer who spoke French with near fluency, and Mulroney’s party swept Quebec on the way to comprehensively defeating the Liberals. It was all Turner could do to win his own seat in the British Columbia riding of Vancouver Castro. Despite the Liberal defeat, Turner retained leadership of the party, defeating another challenge by Chrétien in 1986. In the next few years, Liberals assumed the premiership of Ontario, and the Conservatives began to look vulnerable again, as constitutional tension over Quebec’s role in the nation mounted. Mulroney had also alienated many vocal Canadian constituencies through his perceived alignment with the United States, especially his advocacy of a free trade agreement with that nation. Leading the Liberals into the 1988 campaign, Turner swerved sharply from his previous pro-business stance, advocating a platform of economic protectionism that would preserve Canada’s distinct economic and cultural identities. Though Turner won significantly more seats for the Liberals in 1988 than in 1984, however, the Liberals still lost badly, prompting many to assert that Canada had chosen an irreversible path toward absorption into the American economic sphere. Impact Turner inherited the leadership of a party in decline, and he was unable to retain his party’s majority in the 1984 elections. His changing tactics in the late 1980’s responded to the changing national perception of its proper cultural and economic relationship to the United States, but that issue alone proved insufficient to bring the Liberals back to power.
Further Reading
Cahill, Jack. John Turner: The Long Run. Toronto: McClelland and Stewart, 1984. Weston, Greg. Reign of Error: The Inside Story of John Turner’s Troubled Leadership. Toronto: McGrawHill Ryerson, 1988. Nicholas Birns See also
Canada and the United States; Canada-United States Free Trade Agreement; Elections in Canada; Meech Lake Accord; Mulroney, Brian; Trudeau, Pierre.
■ Turner, Kathleen Identification American actor Born June 19, 1954; Springfield, Missouri
A throwback to the stars of earlier eras, Turner was one of the leading film stars of the decade.

During the 1980's, it was relatively rare for television actors to transition successfully to leading film roles. However, Kathleen Turner was able to do so. A featured performer in the daytime television soap opera The Doctors, she achieved film stardom with her first cinematic role, in Lawrence Kasdan's Body Heat (1981). In this film noir reminiscent of Double Indemnity (1944), Turner played a sultry woman who manipulates a lawyer (William Hurt) into murdering her rich husband (Richard Crenna). Turner's breathy delivery of such lines as "You're not too smart, are you? I like that in a man" earned her comparisons with such 1940's stars as Lauren Bacall and Lizabeth Scott, and Double Indemnity star Barbara Stanwyck sent Turner a fan letter.

Turner showed her versatility by playing a gold digger in Carl Reiner's The Man with Two Brains (1983), a farce costarring Steve Martin. The film showcased Turner's comedic skills, which she would display several more times during the decade. She followed with one of her biggest hits, Robert Zemeckis's Romancing the Stone (1984), playing a meek romance writer who discovers her more adventurous side while trying to rescue her kidnapped sister in Colombia with the help of a soldier of fortune played by Michael Douglas. The Jewel of the Nile (1985) was a less successful sequel. Next came the most overtly sexual of Turner's many smoldering roles during the decade: In Ken
Russell’s Crimes of Passion (1984), she played a bored fashion designer who spends her nights as a flamboyant prostitute. As with Romancing the Stone, the film could be interpreted as a commentary on the failure of some successful women to find fulfillment in their jobs. One of Turner’s most unusual roles was in John Huston’s Prizzi’s Honor (1985), in which she played an assassin who marries hit man Jack Nicholson only for the two to be assigned to kill each other. Prizzi’s Honor reinforced Turner’s femme fatale skills, though her performance was more tongue-in-cheek than it had been in Body Heat. Turner’s other notable 1980’s roles include an unhappily married woman transported to her high school days in Francis Ford Coppola’s time-travel comedy Peggy Sue Got Married (1986); another unhappy wife in The Accidental Tourist (1988), in which she reunited with Kasdan and Hurt; and her unhappiest wife yet, opposite Douglas, in Danny DeVito’s darkly comic look at divorce, The War of the Roses (1989). The number of Turner films with disintegrating, complicated marriages may be seen to reflect the decade’s unease with this social institution. Turner also provided the voice for Jessica Rabbit in the combined live action-animated film noir spoof Who Framed Roger Rabbit (1988). As the title character’s sexy wife, Turner huskily purred, in the manner of Bacall and Scott, her most famous line: “I’m not bad. I’m just drawn that way.” Impact Turner specialized in playing strong, resourceful women who controlled their destinies. Because many of her films had strong sexual content, she acquired a reputation as the decade’s most sensual star. Further Reading
D’Agostino, Annette M. From Soap Stars to Superstars. New York: St. Martin’s Press, 1999. Fuller, Graham. “Kathleen Turner.” Interview 25 (August, 1995): 66-69. Segrave, Kerry, and Linda Martin. The Post-feminist Hollywood Actress: Biographies and Filmographies of Stars Born After 1939. Jefferson, N.C.: McFarland, 1990. Michael Adams See also
Film in the United States; Hurt, William; Martin, Steve; Nicholson, Jack; Who Framed Roger Rabbit.
■ Turner, Ted
Identification American media mogul
Born November 19, 1938; Cincinnati, Ohio

In 1980, Turner founded the first major cable television news network, CNN. His media empire eclipsed the three broadcast television networks and established twenty-four-hour news networks as a powerful factor in journalism.

Ted Turner. (George Bennett)

Ted Turner grew up in Savannah, Georgia, and proved to be an indifferent student who was fascinated by visionary leaders. He took over his father's billboard business, then purchased a struggling Atlanta television station, WJRJ, in 1970, changing the station's call letters to WTCG. Turner became fascinated with the potential of satellite broadcasts in the 1970's, and in 1976, WTCG became one of the nation's first so-called superstations, that is, local stations that broadcast via satellite to more than one market. As the broadcasts were picked up by cable television providers across the country, the station (which again changed its name, to WTBS, in 1979) became a fixture on basic cable. It broadcast primarily old movies, basketball, and baseball.

Interested in further exploiting the possibilities of satellite broadcasts, Turner assembled a team to launch the Cable News Network (CNN). The project faced daunting hurdles, as start-up costs were steep, and investors and advertisers were skeptical that viewers would tune in. In 1979, the satellite Turner had hired to eventually carry the network malfunctioned, and it took a lawsuit to acquire another one. The Federal Communications Commission (FCC) fought Turner's application for a broadcasting license for CNN and relented less than ninety days before the network launch. Once CNN hit the air in June, 1980, viewer share was only half of what Turner had projected, and the network lost $30 million during its first year.

CNN lacked a sound financial foundation, but it filled a major void that had developed in American broadcast journalism. For years, network newsrooms had cut back on international coverage. Turner and CNN covered world events at a fraction of the cost the broadcast networks spent. The result was a steadily growing viewership, although the quality in the first years was decidedly uneven and earned CNN the nickname "Chicken Noodle Network." However, the network's mix of news veterans and eager neophytes created a powerful synergy that baffled the legions of skeptics. Turner was resolute in his support of the network through its lean years. Salaries for journalists were extremely low, but Turner reminded them that he was risking his own fortune on the network. Turner became a global gadfly and went to meet Fidel Castro in Cuba to try to convince him of the merits of cable news. Castro watched, fascinated, as CNN became the only network to show footage of the search-and-salvage operation at a Titan missile silo in Arkansas. CNN also provided extended coverage of the trial of the Gang of Four in China, an event that offered valuable insights into the political situation in post-Mao Zedong China. In its first ten years, the network's viewership grew from under two million to sixty million.

Despite the impact of CNN, not all of Turner's endeavors in the 1980's proved successful. His first Goodwill Games, an attempt to hurdle the political issues that had crippled the Olympics, lost $26 million. In 1987, his failed bid to purchase the Columbia Broadcasting System (CBS) showed that his thirst for expansion was coming to an end. A $40 million divorce payout was another setback for Turner.

Impact Turner transformed cable television from a media sideshow into a powerful factor in news and entertainment. Of all his endeavors, CNN revolutionized the way people get their news. In the wake of the network's success, a continuing controversy arose over the pros and cons of twenty-four-hour news. One highlight was CNN's coverage of the space shuttle Challenger explosion in January, 1986. Broadcast networks no longer covered shuttle launches, but CNN was there, providing an anguished nation with all the tragic details. A network dedicated solely to news was better equipped to cover breaking stories. However, twenty-four-hour news was criticized because, in the rush to cover breaking news, rumor and speculation could taint the quality of journalism. Moreover, because twenty-four-hour news networks require far more content than do traditional network news programs, they were more likely to cover sensational and other marginally newsworthy events in order to fill airtime. They were also more likely to overdramatize minor developments in ongoing stories, in order to keep those stories fresh. For better or for worse, however, the model first popularized by CNN became common, and the nature of television journalism was permanently changed.
Further Reading
Evans, Harold. They Made America. New York: Little, Brown, 2004. Hack, Richard. Clash of the Titans. Beverly Hills, Calif.: New Millennium Press, 2003. Michael Polley See also
Business and the economy in the United States; CNN; Goodwill Games of 1986; Journalism; Sports; Television.
■ Turner, Tina Identification American rock-and-roll singer Born November 26, 1939; Nutbush, Tennessee
The phenomenal success of her musical career at age forty-five and beyond made Turner a legend and changed public perceptions of middle-aged women.

Tina Turner performs in 1988. (Hulton Archive/Getty Images)
By 1980, at age forty, Tina had divorced Ike Turner, ending nearly two decades of violent abuse. The divorce settlement left Tina with no money, but it allowed her to keep the stage name her ex-husband had given her when he formed the Ike and Tina Turner Revue. Having once enjoyed moderate fame and relative wealth as part of that duo, Tina Turner entered the 1980's nearly broke and struggling to regain credibility.

After acquiring a new manager and updating her wardrobe and hairstyle, Turner toured Europe to welcoming crowds. In the summer of 1981, she was invited to entertain at New York City's Ritz nightclub. Turner's new persona and natural talent were a surprising success. Her startling and unusual appearance, a legacy of mixed African and Native American ancestry, combined with a modern, high-voltage vitality and spirituality, made Turner unique in contemporary American popular music. In 1983, she was again invited to perform at the Ritz. Her show
was extremely well attended by celebrities and the general public and was hailed by the music press as Turner's "comeback performance." In England, also in 1983, Turner recorded and released the song "Let's Stay Together"; it reached number six on the United Kingdom singles chart. Later, in the United States, it reached number twenty-six on the Billboard Hot 100 and number three on the Billboard Hot R&B/Hip-Hop Singles and Tracks chart. With the 1984 release of the album Private Dancer, Turner became an acknowledged international superstar. The album reached number three on the Billboard 200, and a single from the album, "What's Love Got to Do with It," reached number one on the Billboard Hot 100, a first for one of Turner's recordings. With the best available musicians, song material, and stage settings, Turner performed to sold-out crowds. She won American Music Awards in 1985 for Favorite Female Vocalist and Favorite Video Artist, and later that year she also won a Grammy Award. In 1988, Turner was named in the Guinness Book of World Records for having drawn the largest audience for a single performer (in Rio de Janeiro, Brazil). In 1989, she was fifty years old and still enjoying the status of an international superstar.

Impact Tina Turner overcame domestic violence to serve as a role model for women of the 1980's and beyond. Her personal experiences, outlined in an autobiography, gave insight into the restorative powers of determination, self-respect, religious faith, and health-promoting practices. Turner's on-stage persona and style, complete with leather mini-skirts, tangled blond hair, and high-energy sexuality, shattered age stereotypes for female entertainers of the era.
Carby, Hazel V. Cultures in Babylon: Black Britain and African America. New York: Verso, 1999. Lyman, Darryl, and Michael Russel. Great African-American Women. New York: J. David, 2005. Turner, Tina, with Kurt Loder. I, Tina. New York: Morrow, 1986. Twyla R. Wells
Adams, Bryan; African Americans; Hairstyles; Music; Music videos; Pop music; USA for Africa; Women in rock music.
■ Twilight Zone accident
The Event Vic Morrow and two child actors are killed during the filming of a movie
Date July 23, 1982
Place Indian Dunes Park, near Los Angeles, California

The Twilight Zone accident raised public awareness of the risks assumed by film actors, stunt performers, and crew members to meet directors' demands for realism. The ensuing court case marked the first trial of a Hollywood film director for crimes related to an on-set accident.

At 2:30 a.m. on July 23, 1982, the final day of shooting on the first of four planned segments of Twilight Zone: The Movie, actor Vic Morrow, seven-year-old Myca Dinh Le, and six-year-old Renee Shinn Chen were killed when detonated explosives hit the tail rotor of a low-flying helicopter. The three performers were struck by the main rotor; Morrow and Le were beheaded. The actors were shooting a scene in which Morrow's character, an American bigot transported into the past and transformed from oppressor to oppressed, had become a Vietnamese citizen being attacked by American soldiers. His character was rescuing two children from a bomb-besieged village.

An inquiry by the National Transportation Safety Board was followed by a Los Angeles County grand jury investigation in 1983 and a preliminary hearing in 1984. The investigations culminated with five production crew members being charged with involuntary manslaughter. The most famous defendant was director John Landis, who was also one of the movie's producers (with Steven Spielberg). Landis had directed The Blues Brothers (1980) and An American Werewolf in London (1981) in the previous two years. When the trial opened on July 23, 1986, prosecutor Lea Purwin D'Agostino described the defendants as "careless and negligent"; Landis's attorney, James Neal, countered by characterizing the deaths as an "unforeseeable accident." Over sixty-nine days, seventy-one witnesses were called, many offering potentially damaging testimony regarding Landis's conduct and demeanor on the set. When Neal called his client to testify on February 19, 1987, Landis admitted breaking California child labor laws by hiring Le and Chen to work after 6:30 p.m. However, he maintained that he was never warned of any potential peril in shooting the scene. After closing
arguments on May 19, 1987, the jury deliberated for nine days and returned not guilty verdicts for all defendants. The acquittal was based on the prosecution's failure to prove that the accident was foreseeable. When the film was released in 1983, it included Morrow's completed footage.

Impact
The Twilight Zone accident inspired increased oversight on film sets. Studios and production companies became both more careful and more carefully regulated in their use of children; their efforts to achieve bigger, more dramatic mechanical and physical effects; and their safety precautions to protect actors and stunt doubles. The tragedy also marked the end of the era of silence and secrecy regarding potentially dangerous film scenes; workers in the industry felt freer to express safety concerns without fear of losing their jobs.
Further Reading
Farber, Stephen, and Marc Green. Outrageous Conduct: Art, Ego, and the "Twilight Zone" Case. New York: Arbor House, 1988. Labrecque, Ron. Special Effects: Disaster at "Twilight Zone"—The Tragedy and the Trial. New York: Scribner, 1988. McBride, Joseph. Steven Spielberg: A Biography. New York: Simon & Schuster, 1997. Cecilia Donohue See also Action films; Epic films; Film in the United States; Horror films; Science-fiction films; Special effects; Spielberg, Steven.

Chicago City Health Department employees test Extra Strength Tylenol capsules for cyanide in October, 1982, in the wake of the September murders in the city. (AP/Wide World Photos)

■ Tylenol murders
The Event Tylenol capsules are poisoned with cyanide, resulting in seven deaths
Date September, 1982
Place Chicago, Illinois

The Tylenol murders pressured Congress and the Food and Drug Administration to enact federal laws regulating over-the-counter medications and requiring tamper-resistant packaging.

On September 29, 1982, four people in Chicago were hospitalized for similar symptoms that ultimately led to their deaths. Analysis of blood samples indicated that all four deaths were the result of cyanide poisoning. Investigation determined that all four deaths also resulted from the use of Extra Strength Tylenol capsules. Further lab analysis revealed that the capsules contained approximately sixty-five milligrams of cyanide, more than ten thousand times the amount needed to kill a single individual.

In an attempt to save the reputation of Tylenol, as well as their own, the companies that produced the product, McNeil Consumer Products and Johnson & Johnson, issued a recall. Questions remained, however, concerning whether the poison was added
before or after the product was sold. Prior to the recall, three more citizens in the Chicago area were found dead as a result of poisoned Tylenol capsules. These well-publicized deaths produced a nationwide fear that overwhelmed hospitals and medical providers, who provided care for many patients with suspected cyanide poisoning symptoms. Federal investigations concluded the deaths were most likely the result of a lone individual who implanted cyanide poison into the bottles and then returned them back to store shelves to be sold. Investigators had two primary suspects. The first suspect was an employee at a Tylenol warehouse from which two of the poisoned bottles were shipped. He was an amateur chemist, and searches of his residence found research on the methods of killing people with poisoned capsules. The evidence was inconclusive, however, and the suspect was not charged. The second suspect, James W. Lewis, was pursued after he mailed a handwritten letter to Johnson & Johnson claiming that the murders would continue until the company paid him one million dollars. Further investigation concluded that Lewis was not the murderer, but only a con artist. No further leads presented themselves, and the Tylenol murderer was never identified.
See also
Business and the economy in the United States; Crime; Medicine; Night Stalker case; Post office shootings; San Ysidro McDonald’s massacre; Stockton massacre; Tamper-proof packaging; Terrorism.
■ Tyler, Anne Identification American novelist Born October 25, 1941; Minneapolis, Minnesota
The novels published by Tyler during the 1980’s firmly cemented her status as a major figure in contemporary American literature. Two of these titles received major writing awards. Although Anne Tyler had published seven novels between 1964 and 1977, it was her work during the 1980’s that established her reputation as an influential twentieth century fiction writer. The decade began for Tyler with the publication of Morgan’s Passing (1980), a novel featuring a type of lead character
Impact Congress responded to the Tylenol murders by passing the Federal Anti-Tampering Act, which President Ronald Reagan signed into law in October, 1983. This law made tampering with consumer products a federal offense. In February of 1989, the Food and Drug Administration increased the tamper-resistant requirements for over-thecounter human drug products. All hard gelatin products were required to have two forms of tamperresistant packaging. The publicity surrounding the Tylenol murders influenced copycat killers throughout the United States. Further Reading
Beck, Melinda, and Susan Agrest. “Again, a Tylenol Killer.” Newsweek, February 24, 1986, 25. Beck, Melinda, Sylvester Monroe, and Jerry Buckley. “Tylenol: Many Leads, No Arrests.” Newsweek, October 25, 1982, 30. Wolnik, K. A., et al. “The Tylenol Tampering Incident: Tracing the Source.” Analytical Chemistry 56 (1984): 466. Nicholas D. ten Bensel
Anne Tyler. (© Diana Walker)
that would become the signature Tyler protagonist: an ordinary Baltimore-area resident (in this case, Morgan Gower) who addresses middle-age malaise with random acts of interpersonal connection laced, more often than not, with an extraordinarily quirky dimension.

Morgan's Passing was followed by Dinner at the Homesick Restaurant (1982), which received extensive critical acclaim from both scholarly and casual readers. This novel, a narrative told in flashback of Pearl Tull's problematic relationships with her three children, is arguably Anne Tyler's richest in terms of plot and character, and it stands as the work that placed her name on the list of must-read writers of the era. Her next two novels enjoyed equal popularity, and both were honored with prestigious literary awards. The Accidental Tourist (1985), winner of the National Book Critics Circle Award, introduced readers to Macon Leary, a travel writer who overcomes family tragedy and obsession with control to experience another Tyler trademark plot point: the opportunity to reinvent one's life. Three years later, Tyler's novel Breathing Lessons (1988) received the Pulitzer Prize. In this work, the family foibles of the Moran clan unfold against the backdrop of a drive from Baltimore to Pennsylvania to attend a funeral. Breathing Lessons features several plot devices that Tyler employed in the two novels immediately preceding it. For example, both Macon Leary and Ira Moran are male lead characters who crave control. Impulsive marriages following an abrupt jilting play a role in Dinner at the Homesick Restaurant as well as in Breathing Lessons, and both novels include scenes of unpleasant family dinners. Despite recurring motifs and situations, Tyler's consistently careful crafting of characters makes each family ensemble unique and memorable.

Impact Anne Tyler's treatment of subject matter previously considered too mundane for the novel—life events randomly experienced by average, uncelebrated families—paved the way for other writers of the 1980's and beyond to address similar topics. In addition, her concentration on the Baltimore, Maryland, region of the United States was influential in the growth and popularity of American Southern regional fiction. Further Reading
Bail, Paul. Anne Tyler: A Critical Companion. Westport, Conn.: Greenwood Press, 1998.
Tyson, Mike
■
995
Salwak, Dale, ed. Anne Tyler as Novelist. Iowa City: University of Iowa Press, 1994. Cecilia Donohue See also
Beattie, Ann; Erdrich, Louise; Literature in the United States; Miller, Sue; Naylor, Gloria.
■ Tyson, Mike Identification
World heavyweight boxing champion Born June 30, 1966; Brooklyn, New York Tyson exploded onto the heavyweight boxing scene in the middle of the 1980’s and remained a dominant figure in the sports world generally for the remainder of the decade. After experiencing a troubled, inner-city childhood, Mike Tyson was discovered in the early 1980’s by wellknown boxing manager and trainer Cus D’Amato, who guided him into a professional boxing career. Although relatively short for a modern-era heavyweight fighter at five feet, eleven inches, Tyson’s physique was heavily muscled and compact, and he punched with a ferocious, animal intensity. On November 11, 1986, after winning his first twenty-seven bouts—twenty-five of them by knockout—and less than two years after the start of his professional career, Tyson defeated Trevor Berbick in a dramatic second-round technical knockout (TKO) to win the World Boxing Council’s heavyweight title. At the age of twenty years and four months, he was the youngest man ever to win the heavyweight title, and his dramatic ascent in the sport captivated the nation in a manner reminiscent of the emergence of Muhammad Ali two decades earlier. Trained by D’Amato protégé Kevin Rooney following D’Amato’s death in November of 1985, Tyson quickly fought the leading heavyweights of the period, and in the process unified the splintered heavyweight division. He won the World Boxing Association title by decision from James “Bonecrusher” Smith in March of 1987, defeated highly ranked heavyweight Pinklon Thomas by a sixth-round knockout in May, and won the International Boxing Federation title by decision from Tony Tucker in August. After defending the title once more in December, he took on former champion Larry Holmes in January of 1988, inflicting the only knockout
996
■
The Eighties in America
Tyson, Mike
scored against Holmes in his seventy-five-fight career. Tyson continued to score dramatic victories during the remainder of 1988, defeating Tony Tubbs by a second-round TKO in Tokyo in March and knocking out Michael Spinks, the man who had taken the title from Larry Holmes earlier in the decade, in the first round in June. By 1989, however, problems had surfaced in both Tyson’s career and his personal life. Late in 1988, he fired Rooney and without his guidance became increasingly reckless and undisciplined in the ring. He looked sloppy in defeating British heavyweight Frank Bruno in February of 1989, although he managed to defeat Carl Williams with a first-round knockout in July. During this time, he also came under the control of controversial boxing promoter Don King, whose role in Tyson’s career seemed more exploitative than attuned to the boxer’s best interests. Finally, Tyson’s brief (19881989) marriage to actress Robin Givens ended in a
highly publicized divorce, amid accusations of physical abuse and marital infidelity. Impact In the years that followed, Tyson's personal life would continue its decline, including a three-year prison term for rape in the early 1990's, and his invincibility in the ring would fade, but during the second half of the 1980's, he was unquestionably one of the best-known and most dynamic figures in the world of professional sports. Further Reading
Heller, Peter. Bad Intentions: The Mike Tyson Story. New York: New American Library, 1989. O’Connor, Daniel, ed. Iron Mike: A Mike Tyson Reader. New York: Thunder’s Mouth Press, 2002. Scott Wright See also
African Americans; Boxing; Holmes, Larry; Sports.
U

■ Ueberroth, Peter Identification
President of the 1984 Summer Olympics and commissioner of Major League Baseball, 1984-1989 Born September 2, 1937; Evanston, Illinois Ueberroth brought an entrepreneurial spirit to his position as the president of the 1984 Los Angeles Summer Olympics, which resulted in a major financial windfall for the Olympics. Later, as commissioner of Major League Baseball, Ueberroth instituted a zero tolerance drug policy for baseball players and resolved other labor issues.
Peter Ueberroth, a multimillionaire and travel industry executive, served as the president of the 1984 Los Angeles Summer Olympic Games. As an entrepreneur, Ueberroth brought business acumen to the Games. His negotiations with the American Broadcasting Company (ABC) television network to air the Summer Olympics resulted in revenue of $225 million. He raised $150 million from foreign television corporations, and he increased the number of corporate sponsorships of the Olympic Games. Ueberroth accomplished all this despite a boycott led by the Soviet Union. As a result of his hard-nosed managerial approach, he turned a multimillion-dollar profit for the Olympics, the first time the Games had made a profit in over fifty years. Ueberroth also lent his time and expertise to public affairs. He served on several of President Ronald Reagan's presidential committees to address national social issues, and he was tapped by Lee Iacocca, president of Chrysler Corporation, to join the commission for the restoration of the Statue of Liberty.

Peter Ueberroth, right, leads International Olympic Committee president Juan Antonio Samaranch through the Los Angeles Memorial Coliseum before the opening ceremonies of the 1984 Summer Olympics. (AP/Wide World Photos)

After his tenure as president of the 1984 Summer Olympics, Ueberroth took office as the sixth commissioner of Major League Baseball (MLB) on October 1, 1984. He approached baseball with the same passion and problem-solving skills with which he had approached the Olympic Games. At that time, MLB faced several challenges and controversies. Ueberroth resolved these issues and returned baseball to its prominent position as America's favorite sport. In March, 1985, he reinstated baseball Hall of Famers Willie Mays and Mickey Mantle, who had been banned from the sport because of their association with Atlantic City casinos. In 1985, Ueberroth arbitrated a labor dispute between team owners and the Major League Baseball Players Association regarding the issue of free agents; however, the players later filed charges of collusion (which Ueberroth had facilitated) against the team owners and won. In February, 1986,
showing little patience for baseball players who tested positive for drug use, he suspended and fined several players, continuing this zero tolerance policy over the next few years. In one of his last major actions as commissioner, in February, 1989, Ueberroth met with incoming baseball commissioner Bart Giamatti to discuss alleged gambling in baseball. Ueberroth left the commissionership prior to the start of the 1989 baseball season and returned to the private sector to work as a corporate turnaround specialist. Impact Through his actions, Ueberroth set a new standard for the Olympic Games. His take-charge leadership style and his no-nonsense policies lifted Major League Baseball from a dark period and brought a new appreciation for the country's national pastime. Further Reading
Ajemian, Robert. “Peter Ueberroth, Man of the Year.” Time 125, no. 1 (January 7, 1985). Ueberroth, Peter, with Richard Levin and Amy Quinn. Made in America: His Own Story. New York: William Morrow, 1985. Joseph C. Santora See also
Baseball; Business and the economy in the United States; Olympic boycotts; Olympic Games of 1984; Rose, Pete; Sports; Statue of Liberty restoration and centennial.
■ Unemployment in Canada Definition
The proportion of the Canadian labor force that is without work and seeking work
The rate of unemployment in the United States and Canada was virtually the same from 1948 to 1980. Beginning in 1981, however, the Canadian unemployment rate averaged more than two percentage points higher, and it rose throughout the decade. The year 1981 marked the beginning of a deep recession in Canada caused by inflationary pressures from the energy crisis of the 1970's. Although increased capital investment created more jobs, with employment growing at an annual rate of 2.5 percent, consumer spending reflected economic insecurity. Interest rates rose significantly in the 1980's, and the monetary policy pursued by the Bank of
Canada widened the gap between Canadian and U.S. interest rates. The first Canadian industries to reflect the recession were those most responsive to interest rates: home construction, durable consumer goods, and export goods. Regional differences in unemployment increased substantially, with British Columbia, the Atlantic provinces, and Quebec suffering most. Unemployment was also unevenly distributed among segments of the population. Although older workers (those aged fifty-five to sixty-four) had always had higher-than-average unemployment rates, the 1980's saw a two-percentage-point rise in unemployment for that segment of the population. Female unemployment was always higher than the male rate, but the actual spread between the rates did not change. The most considerable rise in unemployment occurred among the low-skilled or undereducated segment (those with eight or fewer years of schooling). This group's unemployment rate rose to more than 11 percent, an increase of 2.5 percentage points from 1981 to 1989. Finally, the percentage of long-term unemployment (defined as twelve months or more) rose from 3.6 percent to a peak of 9.5 percent in 1984-1987, then fell to 6.9 percent by the end of the decade. Older workers were especially likely to suffer longer periods of joblessness. Impact Because the economic performance of the United States and Canada was similar during the 1980's, the difference in the countries' unemployment rates requires explanation. Some researchers believe that the disparity between U.S. and Canadian unemployment rates is partly accounted for by statistical issues; for example, "passive" job searchers, whose only effort to find employment involves searching want ads, were classified as unemployed by Canada, but not by the United States. A case can also be made that the more generous coverage of unemployment insurance in Canada raised labor force participation and made longer-term unemployment more bearable than in the United States. Further Reading
Card, David, and W. Craig Riddell. "Unemployment in Canada and the United States: A Further Analysis." In Trade, Technology, and Economics: Essays in Honour of Richard G. Lipsey, edited by B. Curtis Eaton and Richard G. Harris. Cheltenham, England: Edward Elgar Press, 1997.
Sharpe, Andrew. “The Canada-U.S. Unemployment Rate Gap: An Assessment of Possible Causes.” Research Paper R-96-15E.a. Ottawa: Human Resources Development Canada, Strategic Policy, Applied Research Branch, 1996. Jan Hall See also Business and the economy in Canada; Business and the economy in the United States; Canada and the United States; Demographics of Canada; Demographics of the United States; Inflation in Canada; Inflation in the United States; Recessions; Unemployment in the United States.
■ Unemployment in the United States Definition
The proportion of the U.S. labor force that is both without work and seeking work
Persistently high and growing unemployment in the early 1980's created anxiety regarding the health of the U.S. economy. As an economic indicator, the unemployment rate is used to gauge the state of the economy and to guide economic policy. When the unemployment rate during the early part of the 1980's reached its highest level since the 1930's (the years that included the Great Depression), many saw this high unemployment as evidence that the economy was in decline. Viewed from a longer perspective, though, the high unemployment rate in the early 1980's, followed by its steady decline to much lower levels at the end of the decade, seems to indicate that the national economy was adjusting to globalization. The official U.S. unemployment rate is compiled monthly by the Bureau of Labor Statistics, which is part of the U.S. Department of Labor. Based on a random sample of households, the unemployment rate estimates the percentage of individuals in the labor force without a job. To be considered part of the labor force, an individual must be sixteen years old or older and either have a job or be actively seeking one. Those without a job who are not looking for one are not considered to be unemployed, because they are not part of the labor force.

Types of Unemployment In order to understand what caused changes in the unemployment rate during the 1980's, it is useful to divide unemployment into three categories. Frictional unemployment results when people who are qualified for available jobs have not yet secured a position. Structural unemployment results when people are unqualified for the jobs available. Cyclical unemployment results when too few jobs are available because the economy is not strong enough to support its entire labor force. The unemployment rate will change if there is a change in any one of these three types of unemployment. Changes in the unemployment rate during the 1980's resulted from changes in both structural and cyclical unemployment.
President Ronald Reagan’s son, Ronald Reagan, Jr., stands in line to collect unemployment benefits on October 14, 1982. (AP/Wide World Photos)
U.S. Unemployment Rates, 1980-1989

Year    Unemployment %
1980    7.1
1981    7.6
1982    9.7
1983    9.6
1984    7.5
1985    7.1
1986    7.0
1987    6.2
1988    5.5
1989    5.3

Source: Department of Labor, Bureau of Labor Statistics, Local Area Unemployment Statistics, March, 2005.
Early in the decade, the Federal Reserve attempted to slow the growth of the money supply in order to reduce the rate of inflation. This decision caused a decline in economic activity and resulted in two recessions spanning the period from 1980 to 1982. As a result, the number of available jobs decreased, increasing cyclical unemployment. The early 1980's also witnessed an increase in the value of the dollar relative to the currencies of other countries. Economic policies implemented under the Ronald Reagan administration, often referred to as "Reaganomics," included tax cuts and increased defense spending. These policies led to increased borrowing by the federal government, which, combined with the slowing growth of the money supply, raised interest rates and, in the process, the value of the dollar. The increased value of the dollar made it more difficult for U.S. producers to sell their products in other countries, while the relatively low value of foreign currency made it easier for foreign producers to sell their products in the United States. Not only did this contribute to a rise in cyclical unemployment, but it also increased structural unemployment, as many manufacturing jobs permanently moved to other countries, eliminating the need in the United States for those skills possessed by many manufacturing workers.
As the economy emerged from the recession that ended in 1982, inflation came under control and the Federal Reserve allowed the money supply to grow at a faster rate. As a result, the unemployment rate began to fall from its high of 11.4 percent in January, 1983, and by 1988 it had settled at 5 to 6 percent. Many economists consider that range to be close to the "natural" rate of unemployment, that is, the rate at which only frictional and structural unemployment exist. As the value of the dollar began to fall relative to other currencies over the latter part of the decade, U.S. firms became more competitive relative to the rest of the world. The expansion in economic activity eliminated cyclical unemployment. Impact High unemployment imposes a burden on both individuals and society. Individuals lose a major source of income, and society loses the output that could have been produced. In addition, the high unemployment of the early 1980's, coupled with shrinking union membership accompanying the decline of manufacturing jobs, reduced the bargaining power of labor. Although policies were implemented that cut the unemployment rate through the remaining part of the decade, the fear of unemployment tended to keep wage demands relatively modest. As cyclical unemployment decreased, moreover, structural unemployment (which results largely from inadequate education and training) became a greater source of worry. When a federal commission published A Nation at Risk in 1983, Americans' attention was focused on the importance of education to sustain the long-term health of the economy.
Brenner, Robert. The Boom and the Bubble: The U.S. in the World Economy. London: Verso, 2002. Wonderful analysis of the impact of globalization on the U.S. economy since the early 1970’s and its effect on unemployment in the 1980’s. French, Michael. U.S. Economic History Since 1945. Manchester, England: Manchester University Press, 1997. Concise overview of American socioeconomic history since World War II; puts the events of the 1980’s into a larger context. Heilbroner, Robert, and Lester Thurow. Economics Explained. New York: Touchstone, 1998. Useful overview of key economic concepts such as unemployment, inflation, and globalization and how those phenomena have affected society. Randall Hannum
See also
Business and the economy in the United States; Demographics of the United States; Economic Recovery Tax Act of 1981; Globalization; Income and wages in the United States; Inflation in the United States; Nation at Risk, A; Reaganomics; Recessions; Tax Reform Act of 1986; Unemployment in Canada; Unions; Welfare.
■ Unions Definition
Organizations of workers who join together to protect their common interests and improve their wages and working conditions
From the 1930’s to the 1970’s, organized labor was a potent economic and political force; by 1980, one-quarter of American workers belonged to a union. During the 1980’s, however, union membership fell rapidly as a result of economic changes, outsourcing of jobs overseas, and an increasingly hostile legal and political climate. The 1980’s was a devastating decade for American workers and their unions. Approximately eighty national unions existed in the United States during the 1980’s; roughly 84 percent were affiliated with the AFL-CIO, and 16 percent were independent unions. Overall union membership declined from 25 percent of workers in 1980 to 16 percent in 1990. Some of the largest and most powerful unions were particularly hard hit: Between 1978 and 1981 alone, the steelworkers lost 827,000 members, autoworkers lost 659,000, and building trades unions lost more than 1 million. Only public-sector unionism held relatively strong, averaging 37 percent of eligible workers (not all public employees had the right to join unions). By contrast, private-sector unionism plummeted to 11 percent in 1990, by far the lowest rate of any Western industrialized country. In neighboring Canada, 36 percent of the total workforce remained unionized. The causes of this decline are multifaceted. As the 1980’s began, the nation was still in the midst of the worst economic downturn (1979-1981) since the Great Depression of the 1930’s. Since the mid-1970’s, U.S. corporations had faced increasing competition from abroad. They sought to reduce costs by cutting wages and moving their plants to nonunion areas within the United States (primarily the South and the West) or overseas. Industrial America, once the heartland of blue-collar unionism, was decimated by
plant closings. Meanwhile, many employers adopted aggressive antiunion policies, hiring new "union-busting" firms at the first hint of union organizing in their plants. Others sought to break unions with which they had bargained for years, resulting in some of the most bitterly fought strikes of the decade. At the same time, unions faced a far chillier political, legal, and social climate. The National Labor Relations Board (NLRB), the courts, the media, and the executive branch all endorsed policies that increasingly restricted the rights of American workers to organize, bargain collectively, and strike. Reagan and PATCO
The policies of the Ronald Reagan administration were pivotal in helping to establish this antiunion climate. On August 3, 1981, nearly 13,000 federal air traffic controllers walked off their jobs after months of unsuccessful negotiations. For years, they had complained about obsolete equipment, chronic understaffing, mandatory overtime, rotating shift schedules, safety problems, and stress. Within four hours, President Reagan declared on national television that controllers who did not return to work within forty-eight hours would lose their jobs. Two days later, Reagan fired approximately 11,300 controllers. After breaking the strike, he announced that the strikers would never be rehired for their former positions and the Professional Air Traffic Controllers Organization (PATCO) would be decertified (essentially destroyed). PATCO’s demise represented the most stunning defeat for unions in many decades. The president’s actions foreshadowed the increasingly harsh political climate facing union organizers and heralded a decade of dramatic defeats that reversed many of the gains of the previous fifty years. Ironically, the politically conservative PATCO had been one of the few unions to support Reagan. To many observers, Reagan’s defeat of PATCO appeared to put a presidential seal of approval on hard-line antiunion strategies.
Concessions and Givebacks
Unions were on the defensive throughout the decade. In almost every strike and contract negotiation, union leaders conceded to demands for ever larger givebacks and concessions. During the first half of the 1980’s, workers lost an estimated $500 billion to cuts in wages and benefits. In 1982 alone, the seven largest steel companies demanded $6 billion in concessions in that
year's Master Steel Agreement negotiations. By 1986, less than 33 percent of major union contracts still had cost-of-living adjustments (COLAs, which are indexed to inflation), compared with 60 percent in 1979. Annual raises were replaced by bonuses that were not included in a worker's base pay. "Two-tiered" systems were established in which new hires received significantly lower pay and benefits. Unions also relinquished some paid holidays and personal days, acceded to cuts in medical and retirement plans, and agreed to work-rule changes that gave employers more control. Nationally, average wages declined. By 1990, unions no longer set the wage standards in any major industry, as they had done for decades in auto, steel, rubber, mining, and transportation.

Garment workers in Scranton, Pennsylvania, march with picket signs in 1984. (AP/Wide World Photos)

The nation's dramatic shift from a manufacturing to a service economy compounded the problem. Between 1979 and 1989, the number of manufacturing jobs declined from 21 million to 19.4 million, while the number of service-sector jobs (in fast-food restaurants, retail and discount stores, hotels, nursing homes, and offices) mushroomed from 32 million to 45 million. These new jobs tended to be poorly paid, insecure, devoid of benefits, and nonunionized. Strike Breaking and Replacement Workers
Employers seized upon an obscure loophole in a 1938 Supreme Court decision, National Labor Relations Board v. Mackay Radio, that allowed them to continue operations by "permanently replacing" rather than firing striking workers. Between 1985 and 1989, employers hired permanent replacement workers in nearly 20 percent of strikes. In theory, a striking worker lost only his or her particular job; the employer was legally obligated to offer such an employee the first new position that opened. However, in an era of downsizing, few striking workers were recalled. Before 1981, the Mackay Radio decision was little known and rarely used. However, it quickly became the bane of the labor movement, as a growing number of major corporations began using it. Replacement workers were recruited during strikes at Maytag and Greyhound Lines in 1983; Phelps Dodge and Continental Airlines in 1984;
Hormel and the Chicago Tribune in 1985; Colt and Trans World Airlines in 1986; International Paper in 1987; and Eastern Airlines, Pittston Coal Company, and Greyhound Lines in 1989. These strikes were among the bitterest and most violent of the post-World War II period. During the 1980's, thousands of strikers lost their jobs to permanent replacement hires, sending a profound chill through union ranks. Strike activity declined sharply. The number of major walkouts (defined as involving 1,000 or more workers) plummeted from an average of 290 per year in the 1970's to 35 per year in the early 1990's. Firings for Legally Protected Activities
In 1984, pro-union workers were fired at a rate four times greater than in 1960. Legally, workers had the right to form unions, and an employer could not fire a worker for union activity. However, because the penalties were modest (illegally fired workers received no punitive damages), employers increasingly resorted to firing union supporters, leaving it up to the workers to seek redress. In the 1950's, the NLRB ruled that workers were illegally fired in only 4 percent of union-organizing drives; by the early 1980's, that percentage had soared to 32 percent. In 1985, an average of one of every thirty-eight workers who voted for a union was illegally fired (and later reinstated by NLRB order), compared to one worker in six hundred during the 1950's. However, a far greater number of illegally fired workers simply gave up and failed to request reinstatement. According to a 1990 General Accounting Office (GAO) study, it took three years for the average worker to complete the prolonged NLRB appeals process. When illegally fired workers were finally rehired, they returned to a very different workplace: The union organizing drive they had been fired for supporting had typically collapsed.
Hostile Legal and Judicial Climate From 1981 to 1983, the Reagan administration failed to fill two seats on the NLRB, creating a backlog of cases. When appointments were made, openly probusiness candidates were selected. Under the chairmanship of corporate lawyer Donald Dotson, the NLRB issued a string of antiunion decisions. For example, in Meyers Industries, Inc., the NLRB overturned a 1975 decision by upholding the firing of a truck driver who had complained about an unsafe truck. In Rossmore House, the NLRB overturned previous precedents by ruling that an employer could interrogate workers about a union organizing drive. In a series of decisions on plant closings and runaway shops, the board removed barriers to employer relocation to nonunion, low-wage areas. Meanwhile, since the 1940's the Supreme Court had been redefining labor law. Nearly all of the tactics used by unions during the great upsurge of the 1930's—including sit-down strikes, factory occupations, mass picketing, secondary boycotts, wildcats (strikes without formal union authorization), and strikes over grievances—had been declared illegal. Whereas corporations faced no punitive damages for illegally firing workers for their union activity, during the 1980's unions faced severe penalties and hefty fines if they engaged in any of these activities. Meanwhile, employers hired antiunion consulting firms in more than half of organizing drives. Unions Adopt a Cautious Approach
For the most part, union leaders adopted a conciliatory stance. Many unions responded to their declining numbers by either merging with other unions or competing to win over independent employee associations, neglecting the critical yet time-consuming task of organizing and recruiting new members. Although coalitions against plant closings sprang up across the nation, such as the "Save Our Valley" coalition in Youngstown, Ohio (which saw the closing of three steel mills and loss of ten thousand union jobs by 1980), few enjoyed any significant success. Lane Kirkland's presidency of the AFL-CIO (1979-1995) began with high hopes that were quickly dashed. Although he appeared more open-minded and exuded a more professional persona than his burly, cigar-chomping predecessor, George Meany (AFL-CIO president, 1955-1979), as the 1980's wore on it became apparent that little had changed. Kirkland devoted few resources to recruiting and new outreach initiatives until 1989, when an AFL-CIO "Organizing Institute" was created. While Kirkland played a major financial role in supporting the Polish trade union Solidarity, he also embraced the movement against communism and lent labor's support to various right-wing groups and institutes abroad, further alienating the federation from other progressive movements. The AFL-CIO continued to hold itself aloof from potential new constituencies and social movements. Although Kirkland did expand the executive council to include its first woman and more minorities, in 1989 there were
only two women and two minorities on the thirty-five-member board. Increasingly, union leaders were perceived as a clique of mostly middle-aged white men, even though women and minorities represented the fastest-growing sectors of the labor movement. Grassroots discontent mounted and was channeled into the formation of groups such as Teamsters for a Democratic Union, the autoworkers' New Directions Caucus, Black Workers for Justice, Asian Pacific American Labor Alliance, Jobs with Justice, and the Gay and Lesbian Labor Activist Network. Throughout the 1980's, insurgents challenged union bureaucrats on issues relating to foreign policy, concessions, corruption, lack of internal democracy, failure to organize new members, racism, and sexism. These reform efforts culminated in 1995 with the election of John Sweeney as president of the AFL-CIO on a "new voice" slate that defeated all of the federation's top officers. Sweeney's victory represented the first successful challenge to an AFL-CIO president in more than one hundred years. Successes and Failures Although successful campaigns were few and far between during the 1980's, they were significant in pointing the way toward new strategies and forms of organizing. In many cases, "rank-and-file" or local union activists came into bitter conflict with their more conservative national union leadership. The most successful struggles utilized creative tactics and welcomed broad community support. Victories (full and partial) included the 1984 United Mine Workers strike; the 1985-1987 strike of Watsonville cannery workers in Salinas Valley, California; the Justice for Janitors campaign that began in Los Angeles in 1985; and successful union organizing drives among clerical and technical staff at Yale and Harvard Universities. Both university campaigns lasted three years and involved civil disobedience, community rallies, student and faculty support, and office shutdowns. During the 1989 United Mine Workers strike against the Pittston Coal Company, workers established a "camp solidarity" program whereby the "Daughters of Mother Jones" trained thousands in civil disobedience. In eleven states, forty thousand miners staged wildcat solidarity strikes while others occupied a plant. Unions also lobbied successfully for legislation to provide some severance pay and retraining programs for workers laid off as a result of plant closings.
Among the most significant union defeats were the 1983 Phelps Dodge Mining Company copper miners’ strike and the 1985 Hormel meatpackers’ strike. Both involved the use of National Guard troops. Equally representative of the decade’s struggles was the stunning defeat of the United Auto Workers’ effort to unionize a Nissan plant in Tennessee in 1989. When the election results were announced, antiunion workers cheered and danced in the street, carrying banners that read “Union Free and Proud.” Impact During the 1980’s, the rights of American workers to organize, bargain collectively, and strike were seriously eroded. The NLRB’s conservative rulings created enduring legal precedents. Likewise, employers’ use of permanent replacement workers and other union-busting tactics continued long after the decade’s end. Nearly all studies reveal that unionized workers receive higher wages and better benefits than their nonunionized peers. The loss of union jobs contributed to the shrinking of the American middle class. After 1980, a declining number of workers enjoyed benefits such as full medical coverage, guaranteed retirement plans, and vacation and sick pay. Between 1980 and 1990, more than 10 million new jobs were created that paid less than thirteen thousand dollars per year, while only 1.6 million new jobs were created that paid more than twenty-seven thousand dollars (in year 2000 dollars). Whereas between 1947 and 1978, real hourly wages grew 80 percent (adjusted for inflation) and workers saw their standard of living steadily improve, during the 1980’s real hourly wages declined. Family income did rise slightly after 1986, but only because more wives were working outside the home and both men and women were working longer hours. Under attack from all sides, American unions, which had once enjoyed great power, influence, and prestige, lost the leading role they had played in the nation’s economy and, at the same time, lost their ability to lift a significant portion of America’s blue-collar and industrial workers into the ranks of the middle class. Further Reading
Babson, Steve. The Unfinished Struggle: Turning Points in American Labor, 1877-Present. Lanham, Md.: Rowman & Littlefield, 1999. Chapter 5, “At the Crossroads,” offers a thoughtful examination of
the crisis of unions in the 1980's. Extremely concise yet comprehensive. Brisbin, Richard A. A Strike Like No Other Strike: Law and Resistance During the Pittston Coal Strike of 1989-1990. Baltimore: The Johns Hopkins University Press, 2003. Brisbin recounts one of the decade's few successful union victories. Theoretical, for more advanced students. Dubofsky, Melvyn, and Foster Rhea Dulles. Labor in America: A History. 7th ed. Wheeling, Ill.: Harlan Davidson Press, 2004. Written by two of the leading scholars of U.S. labor history, this classic, vividly written overview has an excellent chapter on the 1980's, aptly titled "Hard Times." Goldfield, Michael. The Decline of Organized Labor in the United States. Chicago: University of Chicago Press, 1989. An incisive analysis of the causes of union decline in the 1980's. Uses statistical and historical data to puncture many popular myths. Moody, Kim. U.S. Labor in Trouble and Transition: The Failure of Reform from Above, the Promise of Revival from Below. New York: Verso Press, 2007. A leading labor scholar and activist, Moody offers a penetrating analysis of the failure of unions in the 1980's. Concludes with a survey of the new leadership, strategies, and initiatives that have emerged since then. Murolo, Priscilla, and A. B. Chitty. From the Folks Who Brought You the Weekend: A Short, Illustrated History of Labor in the United States. New York: The New Press, 2001. Easily accessible and written for a popular audience, this comprehensive survey devotes a lengthy chapter to the 1980's, titled "Hard Times." The section "Fighting Back" chronicles a number of innovative union campaigns. Perusek, Glenn, and Kent Worcester, eds. Trade Union Politics: American Unions and Economic Change, 1960's-1990's. Atlantic Highlands, N.J.: Humanities Press, 1995. A sophisticated analysis for advanced students, this provocative volume offers competing theories and perspectives on unions' responses to globalization and corporate offenses. Rachleff, Peter. Hard-Pressed in the Heartland: The Hormel Strike and the Future of the Labor Movement. Boston: South End Press, 1992. A brief and engaging account of one of the most important strikes of the decade. The conflict pitted local union activists against their national affiliate and AFL-CIO leadership.
Rosenblum, Jonathan D. Copper Crucible: How the Arizona Miners Strike of 1983 Recast Labor-Management Relations in America. Ithaca, N.Y.: Cornell University Press, 1995. Sympathetic but not uncritical examination of the 1983-1986 strike against the Phelps Dodge copper company. The use of the National Guard and permanent replacement workers made this a pivotal event in 1980’s labor history. L. Mara Dodge See also
Air traffic controllers’ strike; Business and the economy in the United States; Chrysler Corporation federal rescue; De Lorean, John; Globalization; Iacocca, Lee; Income and wages in the United States; Reagan Revolution; Reaganomics; Recessions; Unemployment in Canada; Unemployment in the United States; Women in the workforce.
■ United Nations Identification International organization Date Established in 1945 Place Headquartered in New York City
During the 1980's, the United Nations faced a wide array of global problems but managed to maintain global peace and continue its work addressing the worldwide problems of hunger, poverty, and disease. The 1980's proved to be a tense decade, as the Soviet Union and the United States intensified efforts to undermine each other during the Cold War. Many violent conflicts that drew the support of one superpower immediately attracted the opposing support of the other, escalating the level of death and destruction. Because the leading members of the United Nations, including the superpowers, often limited the organization's decision-making abilities, it had to assume a rather awkward position—or one of ambivalence—in many of the decade's conflicts, despite its best efforts to bring about peace. Some of the most controversial conflicts of the decade included the Soviet invasion and occupation of Afghanistan, the Iran-Iraq War, and the impasse over apartheid in South Africa. The Islamic revolution in Iran, which was accompanied by the 444-day U.S. hostage standoff, the 1982 Israeli invasion of Lebanon, the escalating tensions between Pakistan and India, and the intensification of terrorist acts in the Middle East
combined to create a deadly scenario for the United Nations during the decade. As the levels of interest in many of the conflicts created a kaleidoscope of alliances and blocs of protest among nations, it was often difficult if not impossible for the United Nations to take a decisive and unambiguous position on many of these issues. The Soviet invasion of Afghanistan in December, 1979, was a case in point. At its meeting in Islamabad in January, 1980, the Organization of the Islamic Conference deplored the Soviet invasion and demanded its withdrawal. In the same vein, the U.N. General Assembly voted overwhelmingly for a resolution that "strongly deplored" the "recent armed intervention" in Afghanistan and called for the "total withdrawal of foreign troops" from the country. Some states considered the resolution illegal, however, because the invasion had been welcomed by the "legitimate" government of Afghanistan; in their view, the resolution violated that government's sovereign rights. Ironically, many nonaligned countries, such as India, Algeria, Iraq, Syria, Libya, and Finland, took this position and withheld their support. The U.N. Security Council found it impossible to act, because the Soviets had veto power. The General Assembly was therefore limited to passing various resolutions opposing the Soviet occupation. Informal negotiations for a Soviet withdrawal from Afghanistan that started in 1982 came to fruition in 1988, when the governments of Pakistan and Afghanistan signed the Geneva Accords, settling their major differences. The United Nations set up a special mission to oversee the process, and the withdrawal of Soviet troops began on May 15, 1988.
The United Nations was confronted with two major problems in its efforts to combat disease, hunger, and poverty in most developing parts of the world during the 1980's: the massive debt crisis that had crippled the economies of developing nations and the acquired immunodeficiency syndrome (AIDS) pandemic caused by the human immunodeficiency virus (HIV). The International Monetary Fund (IMF) and the World Bank, specialized agencies of the United Nations, were blamed directly for exacerbating the poverty that spread throughout most of the developing world following countries' compliance with Structural Adjustment Programs (SAPs). The programs had,
among other things, insisted on the removal of government subsidies on essential services such as education, health, and transport, which in turn led to a drastic reduction of jobs, real income, and purchasing power. As most of the developing world, especially Africa, stood in the throes of the HIV/AIDS pandemic, the United Nations was forced to redouble its efforts to address this urgent problem threatening millions of lives during the decade. The United Nations, working through the World Health Organization (WHO) and nongovernmental organizations (NGOs), collaborated with various governments to provide both education and medicine in highly affected regions. Impact In the 1980's, the United Nations demonstrated that it could withstand turbulent global events that threatened its existence. The League of Nations, the precursor to the United Nations, proved to be less resilient at the outbreak of World War II. Although the United Nations was not threatened by a conflict of that magnitude, there were many close calls, especially those arising from the festering ideological conflict of the Cold War between East and West, which set the tone for many proxy conflicts around the world. During the 1980's, the leadership of the United Nations managed to maintain its commitment to dealing with issues of conflict and peace as well as those of hunger, poverty, and disease affecting many of the developing countries of the world. As expected, there were many high and low points for the organization. Further Reading
Bennett, LeRoy, and James K. Oliver. International Organizations: Principles and Issues. 7th ed. Upper Saddle River, N.J.: Prentice Hall, 2002. Describes the organization, structure, and operations of the United Nations. O’Sullivan, Christopher D. The United Nations: A Concise History. Huntington, N.Y.: Krieger, 2005. Covers the first sixty years of the United Nations. Peterson, M. J. The General Assembly in World Politics. Boston: Unwin Hyman, 1986. Provides an in-depth look at the structure and function of the U.N. General Assembly during the 1980’s. United Nations. www.un.org. The organization’s official Web site. Contains a search feature allowing users to research U.N. actions. Austin Ogunsuyi
See also
Africa and the United States; China and the United States; Cold War; Europe and North America; Foreign policy of Canada; Foreign policy of the United States; Iranian hostage crisis; Israel and the United States; Japan and North America; Latin America; Mexico and the United States; Middle East and North America; Soviet Union and North America.
■ US Festivals The Event
Two popular music festivals take place in Southern California Date September 3-5, 1982; May 28-30 and June 4, 1983 Place Glen Helen Regional Park, near Devore, California The US Festivals were the largest multiday music events in North America during the 1980’s.
The US Festivals were the first major multiday rock music festivals produced in the United States since the mid-1970's. They were the brainchild of Apple Computer cofounder Steve Wozniak, who invested over $12 million in the festivals. Both festivals took place near the small town of Devore, California. Wozniak paid concert promoter Bill Graham's agency to hire the performers. Approximately 400,000 people attended the 1982 concerts. Tickets cost $37.50 for all three days. Friday's performers included the Ramones, Gang of Four, the Police, Talking Heads, the B-52s, and Oingo Boingo. Saturday's show featured Santana, Tom Petty and the Heartbreakers, Pat Benatar, Eddie Money, the Cars, and the Kinks. Sunday's show began at 9:30 a.m. with the Grateful Dead, followed by Jackson Browne, Jerry Jeff Walker, Jimmy Buffett, and Fleetwood Mac. The 1982 festival lost $4 million, yet the organizers considered the festival a success in artistic and entertainment terms. The festival featured several air-conditioned tents filled with then-new personal computers.
Members of Gang of Four perform at the beginning of the first US Festival on September 3, 1982. More than eighty thousand people were in attendance. (AP/Wide World Photos)
The US Festivals

Performers at the US Festivals, in order of appearance:

Friday, September 3, 1982
Gang of Four
The Ramones
The English Beat
Oingo Boingo
The B52s
Talking Heads
The Police

Saturday, September 4, 1982
The Joe Sharino Band
Dave Edmunds
Eddie Money
Santana
The Cars
The Kinks
Pat Benatar
Tom Petty and the Heartbreakers

Sunday, September 5, 1982
Grateful Dead
Jerry Jeff Walker
Jimmy Buffett
Jackson Browne
Fleetwood Mac

Saturday, May 28, 1983
Divinyls
INXS
Wall of Voodoo
Oingo Boingo
The English Beat
Flock of Seagulls
Stray Cats
Men at Work
The Clash

Sunday, May 29, 1983
Quiet Riot
Mötley Crüe
Ozzy Osbourne
Judas Priest
Triumph
Scorpions
Van Halen

Monday, May 30, 1983
Little Steven and the Disciples of Soul
Berlin
Quarterflash
U2
Missing Persons
Pretenders
Joe Walsh
Stevie Nicks
David Bowie

Saturday, June 4, 1983—Country Day
Riders in the Sky
Thrasher Brothers
Ricky Skaggs
Hank Williams, Jr.
Emmylou Harris and the Hot Band
Waylon Jennings
Alabama
Willie Nelson

The 1983 US Festival took place over two weekends. One-day tickets cost twenty dollars. The concert began on Memorial Day weekend, 1983. The opening day acts were the Divinyls, INXS, Wall of Voodoo, Oingo Boingo, the English Beat, Flock of Seagulls, Stray Cats, Men at Work, and the Clash. Sunday's show was titled Heavy Metal Day and was attended by over 400,000 fans. The day was marred by several incidents, including a fatal beating and a fatal overdose. The bands that played were Quiet Riot, Mötley Crüe, Ozzy Osbourne, Judas Priest, Triumph, the Scorpions, and Van Halen. Monday featured David Bowie, Stevie Nicks, Little Steven and the Disciples of Soul, Berlin, Quarterflash, U2, Missing Persons, the Pretenders, and Joe Walsh. There was a country music day the following Saturday. Over 100,000 fans showed up to see and hear country music groups Riders in the Sky, Thrasher Brothers, Ricky Skaggs, Hank Williams, Jr., Emmylou Harris
and the Hot Band, Waylon Jennings, Alabama, and Willie Nelson. Impact The US Festivals represented one of the first attempts to introduce personal computers to a music-oriented demographic. The strategy would ultimately pay incredible dividends in future decades, as the relationship between popular music and computer technology grew. Further Reading
Hunter, David. “Steve Wozniak Throws a Party.” SoftTalk Magazine 3, no. 10 (October, 1982): 128-140. Kirk, Cynthia. “Fun and Violence Mingle at US Festival: Break-Even Point Is Still Elusive.” Variety 311 (June 1, 1983): 55. _______. “US Festival in Cal. Deemed a Success, Attendance Is Good.” Variety 308 (September 8, 1982): 1. Ron Jacobs
See also Apple Computer; Country music; Farm Aid; Heavy metal; Live Aid; Mötley Crüe; Music; New Wave music; Osbourne, Ozzy; Pop music; Talking Heads; U2; Van Halen.
■ U.S. Senate bombing The Event
A leftist group calling itself the Armed Resistance Unit sets off a bomb in the Senate Wing of the U.S. Capitol Date November 7, 1983 Place Washington, D.C. The U.S. Senate bombing resulted in tightened security measures in and around the U.S. Capitol. The Senate Chamber was closed to the public, and a system of staff identification cards was instituted. At 10:58 p.m. on November 7, 1983, a small, powerful bomb composed of six or seven sticks of dynamite exploded near the Senate Chamber of the U.S. Capitol. The bomb had been placed underneath a bench at the eastern end of a corridor outside the chamber several hours earlier; it used a pocket watch for its timing device. The explosion ripped the door off the office of Senator Robert Byrd of West Virginia, tore a hole in a wall, and shattered several mirrors and paintings opposite the Republican cloakroom. Officials estimated the cost of the damage at around $250,000. A group calling itself the Armed Resistance Unit claimed responsibility for the bombing in a call to the Capitol switchboard minutes before the explosion. The group also sent a letter to National Public Radio claiming responsibility. Both claims stated that the bombing was in response to the October 25, 1983, U.S. military invasion of Grenada and the U.S. military intervention in Lebanon that autumn. The Senate was originally scheduled to be in session until at least 11:00 that evening but adjourned early, potentially preventing injuries and even deaths. Police and Federal Bureau of Investigation (FBI) officials publicly speculated that the bombing was tied to similar attacks on a number of other government installations, including the National War College building in Washington, D.C.; the Staten Island, New York, Federal Building; and several Navy and Army Reserve centers in the Washington, D.C., area and around New York City. In May, 1988, federal agents arrested six U.S. citizens in connection with the Capitol bombing and several other attacks on U.S. government buildings. The defendants included Linda Evans and Laura Whitehorn, two former members of the Weather Underground, and Marilyn Buck, a member of the May 19 Communist Organization. Buck had recently been convicted for her participation in the 1981 robbery of a Brink's truck outside Nyack, New York, that resulted in the deaths of three people. Buck, Whitehorn, and Evans were sentenced to long prison terms for conspiracy and malicious destruction of government property. Charges against the other three defendants were dropped. Impact In the wake of the bombing, security in and around the U.S. Capitol building was tightened. Areas once open to the public were closed off, a series of staff identification badges was instituted, several entrances were closed, and metal detectors were set up at all the remaining entrances. Further Reading
Berger, Dan. Outlaws of America: The Weather Underground and the Politics of Solidarity. Oakland, Calif.: AK Press, 2006. Memorial Institute for the Prevention of Terrorism. MIPT Terrorism Knowledge Base. http://www.tkb.org/Home.jsp Ron Jacobs
Beirut bombings; Foreign policy of the United States; Grenada invasion; Terrorism.
■ USA for Africa Identification
Popular music benefit project to fund African famine relief Date “We Are the World” recorded on January 28, 1985 Place A & M Studios, Hollywood, California On the heels of a widely publicized famine in East Africa, a star-studded group of forty-five popular music artists recorded a song and produced an accompanying music video to raise money to help victims of the famine. The project earned millions of dollars and served as inspiration for thousands of additional efforts across America in support of the cause.
A Stellar Ensemble

The following performers participated in USA for Africa, recording the song "We Are the World":

Dan Aykroyd
Harry Belafonte
Lindsey Buckingham
Kim Carnes
Ray Charles
Bob Dylan
Sheila E.
Bob Geldof
Daryl Hall
James Ingram
Jackie Jackson
LaToya Jackson
Marlon Jackson
Michael Jackson
Randy Jackson
Tito Jackson
Al Jarreau
Waylon Jennings
Billy Joel
Cyndi Lauper
Huey Lewis and the News
Kenny Loggins
Bette Midler
Willie Nelson
John Oates
Jeffrey Osborne
Steve Perry
The Pointer Sisters
Lionel Richie
Smokey Robinson
Kenny Rogers
Diana Ross
Paul Simon
Bruce Springsteen
Tina Turner
Dionne Warwick
Stevie Wonder
When a devastating famine struck Ethiopia in the fall of 1984, Americans began exploring ways to contribute to the relief efforts. Singer Harry Belafonte, known for his hit songs and movies from the 1950's and early 1960's and his civil rights activism, approached his manager, music producer Ken Kragen, about organizing a benefit concert. In November of 1984, a group of Britain's most famous pop musicians had assembled under the name Band Aid and recorded a song, "Do They Know It's Christmas?" that had raced to the top of the music charts and earned millions for African relief. As a result, Belafonte and Kragen, along with Lionel Richie (another of Kragen's clients) and Michael Jackson, two of pop music's most successful artists and composers, began to organize a similar effort with American musicians. The group recruited Quincy Jones, a successful producer whose work with Jackson had helped resurrect the singer's career, to produce the effort, and Jackson and Richie composed a new song for the occasion. Instrumental tracks were recorded
and mailed to a broad spectrum of artists, and on January 28, 1985, a group of forty-five performers gathered at A & M Studios in Hollywood following the American Music Awards to record "We Are the World." Among the forty-five singers assembled in the studios were Belafonte, comedian Dan Aykroyd, Band Aid organizer Bob Geldof, and band members and family members of the twenty-one artists for whose brief solo performances the song came to be known. These included many who were among pop music's biggest stars of the 1980's—Richie, Stevie Wonder, Paul Simon, Kenny Rogers, James Ingram, Tina Turner, Billy Joel, Michael Jackson, Diana Ross, Dionne Warwick, Willie Nelson, Al Jarreau, Bruce Springsteen, Kenny Loggins, Steve Perry, Daryl Hall, Huey Lewis, Cyndi Lauper, Kim Carnes, Bob Dylan, and Ray Charles. When the record was released in March of 1985, the group was identified on the label as USA for Africa.
Impact The song became an immediate hit, reaching number one on the music charts in just three weeks, and it eventually sold more than seven million copies over the next seven years. It won the 1985 Grammy Awards for Song of the Year, Record of the Year, and Best Pop Performance by a Duo or Group. The foundation created by Kragen to administer and distribute the sixty-four million dollars raised in the effort, United Support of Artists for Africa, continued to operate and sponsored later efforts on behalf of America's music industry to address emergency and long-term development needs in Africa. Further Reading
Berger, Gilda. USA for Africa: Rock Aid in the Eighties. London: Franklin Watts, 1987. Garofalo, Reebee, ed. Rockin’ the Boat: Mass Music and Mass Movements. Cambridge, Mass.: South End Press, 1991. Devon Boan See also Africa and the United States; Comic Relief; Farm Aid; Jackson, Michael; Journey; Lauper, Cyndi; Live Aid; Music; Music videos; Pop music; Richie, Lionel; Springsteen, Bruce.
■ USA Today Identification
First general-interest national daily newspaper Publisher Gannett Company Date Launched on September 15, 1982 Founded by Gannett Company CEO Al Neuharth, USA Today was designed to represent a completely novel alternative to the traditional newspapers of the 1980's. In style, content, and physical appearance, the new national newspaper USA Today was designed for the television generation, and it rejected established rules about what a newspaper should be. It focused on celebrity news written in short, attention-getting articles with simplistic prose that could be read easily and quickly by busy people. It included many sentence fragments that began with typographical bullets, instead of complete, grammatically correct sentences. USA Today also featured bold color photographs, charts, and graphics, along with a huge, colorful national weather map. The color ink used to publish USA Today did not rub off on readers' hands the way traditional gray newsprint could. Gannett Company chief executive officer (CEO) Al Neuharth founded the newspaper in the belief that USA Today's new type of journalism and design would be an effective way to communicate to readers a greater number of discrete news items. When the first issue of USA Today hit the stands in 1982, it sold out. By the end of its first year, USA Today's circulation reached almost 400,000, and seven months later the newspaper had more than one million readers. By 1985, USA Today was publishing internationally, printing via satellite in Singapore and Switzerland. The success of USA Today was not without its challenges, however. The newspaper was an expensive, high-risk venture, and it had a difficult time securing advertising. It took five years before USA Today began to make a profit. Newspaper traditionalists did not welcome the entry of USA Today and disapproved of its departure from the rules of traditional journalism. Critics compared USA Today's content to the offerings at a fast-food restaurant, earning the newspaper the nickname "McPaper, the junk food of journalism." While critics coined this nickname to show their contempt of the newspaper, Neuharth, confident about its success, used the "McPaper" image to his advantage. One columnist would later describe USA Today as the
“Big Mac of journalism,” and over time traditional newspapers began adapting USA Today’s “McNuggets” style of journalism in their own publications. Impact USA Today changed the world of journalism. Its enterprising approach to journalism, using colorful layout and short, easy-to-read articles, would later be copied by its competitors, who had to reinvent themselves to keep newspapers relevant in the digital age. USA Today would later publish more serious news stories and would one day be ranked with The Wall Street Journal and The New York Times as one of the top-selling newspapers in the United States. Further Reading
Mogel, Leonard. "The Three Titans: USA Today, The Wall Street Journal, and The New York Times." In The Newspaper: Everything You Need to Know to Make It in the Newspaper Business. Pittsburgh: GATF Press, 2000. Neuharth, Al. Confessions of an S.O.B. New York: Doubleday, 1989. Pritchard, Peter S. The Making of McPaper: The Inside Story of "USA Today." Kansas City, Mo.: Andrews, McMeel & Parker, 1987. Eddith A. Dashiell See also
CNN; FOX network; Journalism; Tabloid television; Television; Turner, Ted.
■ USS Stark incident The Event
An Iraqi fighter jet launches two missiles into a U.S. Navy vessel Date May 17, 1987 Place The Persian Gulf The USS Stark incident was at the time the worst peacetime naval disaster in American history. It was surpassed only by the explosion of a gun turret on the USS Iowa in 1989. During the Iran-Iraq War (1980-1988), an Iraqi Dassault Mirage F1EQ fighter launched two Exocet missiles, hitting the American Oliver Hazard Perry-class guided-missile frigate USS Stark and severely damaging the vessel. The Stark was in international waters in the Persian Gulf at the time of the attack. President Ronald Reagan had ordered a U.S. naval fleet to the Persian Gulf to monitor the area.
The USS Stark lists to port after being struck by two Iraqi missiles on May 18, 1987. (U.S. Department of Defense)
The Stark’s crew was unaware of the firing of the first missile, which failed to detonate when it hit the port side of the hull. The fuel from the rocket caught fire, however, increasing the damage caused by its impact. Now aware that they were under attack, the crew was nevertheless helpless to stop the second missile from being fired: It impacted at roughly the same part of the ship as the first, penetrated to the crew’s quarters, and exploded. The casualty list included thirty-seven sailors killed and twenty-one injured. After the attack, the Stark was listing and on fire. The crew struggled to gain control of the ship, finally succeeding during the night. The Stark made its way to the tiny kingdom of Bahrain, near Qatar, where it was met by the USS Acadia. The Stark was returned to seaworthiness with temporary repairs made by the crew of the Acadia. Under its own power, the Stark made it back home to the United States. The ship returned to Mayport, Florida, its home
port. In 1988, the ship traveled to Mississippi for permanent repairs conducted by Ingalls Shipbuilding at a cost of $142 million. After the repairs were made, the Stark returned to active service. Impact The attack on the USS Stark was not provoked; Iraq and the United States were at peace at the time. It is unknown whether the Iraqi pilot who launched the missiles was ever punished for the accident, because the Saddam Hussein regime was in control of Iraq at the time. American officials have speculated that the pilot was executed. Further Reading
Levinson, Jeffrey L., and Randy L. Edwards. Missile Inbound: The Attack on the Stark in the Persian Gulf. Annapolis: Naval Institute Press, 1997. Wise, Harold Lee. Inside the Danger Zone: The U.S. Military in the Persian Gulf, 1987-88. Annapolis: Naval Institute Press, 2007. Timothy C. Hemmis
See also Beirut bombings; Iranian hostage crisis; Libya bombing; Middle East and North America; USS Vincennes incident; West Berlin discotheque bombing.
■ USS Vincennes incident
The Event An American warship accidentally shoots down an Iranian passenger airliner
Date July 3, 1988
Place Strait of Hormuz

The Vincennes shot down Iran Air Flight 655, killing all 290 people on board. Besides being a tragedy for those on the airplane and their families, the incident further poisoned relations between the United States and Iran.

In 1980, Iraq—under the leadership of President Saddam Hussein—invaded the Islamic Republic of Iran, beginning the Iran-Iraq War (1980-1988). The bloody war eventually devolved into stalemate and led to the deaths of hundreds of thousands of people. One of the war’s battlefields was the Persian Gulf, specifically the Strait of Hormuz, where Iran began to attack tankers carrying oil as a means of damaging Iraq’s economy. The presidential administration of Ronald Reagan deployed warships from the U.S. Navy to the region to protect the oil tankers, if necessary through the use of force. One of the ships sent to the region in 1988 was the USS Vincennes, a Ticonderoga-class guided-missile cruiser under the command of Captain William C. Rogers III. On July 3, 1988, the ship pursued Iranian gunboats operating in the area. Shortly afterward, the crew of the Vincennes picked up a single airplane flying over Iranian territory that seemed to be preparing for a possible attack on the Vincennes. In fact, it was Iran Air Flight 655, an Airbus A300B2 on a regularly scheduled flight from Bandar Abbas, Iran, to Dubai, United Arab Emirates. The ship radioed the plane to warn it off, but the Airbus A300 was unable to receive military communications. As the airliner flew closer to the American cruiser, the order was given to fire on the airplane. A missile was launched, destroying the airplane and killing all on board, including dozens of children. Impact
The USS Vincennes incident was met with outrage in Iran and a denial of responsibility on the
part of the U.S. Navy and the Reagan administration. Vice President George H. W. Bush later made it clear that the United States had no intention of apologizing for the tragic mistake, and the commander of the Vincennes later received a medal for his service in the region. It would later emerge that the airliner had been ascending, not descending, as the U.S. Navy claimed in the immediate aftermath of the attack, and that other American naval commanders in the area had been concerned about the behavior of the Vincennes’s captain. Although the U.S. government would never admit responsibility for shooting down the airliner, it did in the 1990’s propose compensation to the families of the dead. Further Reading
Ansari, Ali M. Confronting Iran: The Failure of American Foreign Policy and the Next Great Crisis in the Middle East. New York: Basic Books, 2006.
The USS Vincennes launches a missile during exercises in July, 1987, one year before it shot down Iran Air Flight 655. (U.S. Navy)
Rajaee, Farhang, ed. The Iran-Iraq War: The Politics of Aggression. Gainesville: University Press of Florida, 1993. Rogers, Will, Sharon Rogers, and Gene Gregston. Storm Center: The USS Vincennes and Iran Air Flight 655—A Personal Account of Tragedy and Terrorism. Annapolis: Naval Institute Press, 1992. Steve Hewitt See also
Air India Flight 182 bombing; Bush, George H. W.; Foreign policy of the United States; Iran-Contra affair; Iranian hostage crisis; Middle East and North America; Pan Am Flight 103 bombing; Reagan, Ronald; USS Stark incident.
■ U2
Identification Irish rock band
Date Formed in 1976
During the 1980’s, U2 became one of the most popular musical acts in the world. The band had a substantial impact on pop culture, particularly in the United States.

U2 formed in 1976 in Dublin, Ireland. With a lineup of Larry Mullen, Jr., on drums, Adam Clayton on bass, Dave “the Edge” Evans on guitar and keyboards, and Paul “Bono” Hewson as lead singer, the band had already attracted a solid fan base in the United Kingdom with its live performances before it signed with Island Records in March, 1980. Later that year, the band released its first album, Boy, and the single “I Will Follow” gave U2 its first radio airplay in North America. Its second album, October (1981), featured spiritual and religious themes, influenced by U2’s involvement with the charismatic Shalom religious movement. By 1983, U2 was on the verge of stardom, and the group’s increasingly sophisticated songwriting and politically charged lyrics propelled the album War, featuring the radio hits “New Year’s Day” and “Sunday Bloody Sunday,” into the top twenty in the United States and Canada. U2’s members donned combat boots and proclaimed themselves “Militants for Peace” as they headlined a North American tour in the summer of 1983. Their concert at Red Rocks Amphitheatre, near Denver, spawned a live album, U2 Live: Under a Blood Red Sky, and a video in heavy rotation on MTV. In 1984, U2 retreated to Slane Castle in Ireland to
record The Unforgettable Fire. Produced by the legendary Brian Eno, this more experimental album sold fewer copies than War had, but its track “Pride (In the Name of Love)” became the group’s first Top 40 single on the U.S. pop charts, as it reached number thirty-three on the Billboard Hot 100 chart. “Pride (In the Name of Love)” was inspired by the life of Martin Luther King, Jr., and reflected the band’s growing fascination with America. U2 performed at the Live Aid festival in 1985, and Bono’s unscripted leap into the crowd turned him into a celebrity. In a cover feature, Rolling Stone magazine proclaimed U2 to be the “Band of the Eighties.” U2’s commitment to social and political issues continued to evolve. Bono and his wife, Ali, volunteered in Ethiopia and toured war-torn Central America, and the band participated in the Artists Against Apartheid campaign. This activism, along with band members’ growing interest in American roots music, influenced their most successful album, The Joshua Tree, released in March, 1987. Musically and lyrically, The Joshua Tree showcased U2’s simultaneous love of American culture and frustration with American foreign policy. The album topped the charts in twenty-two countries around the world and featured two number one hits on the Billboard Hot 100 chart: “With or Without You” and “I Still Haven’t Found What I’m Looking For.” U2 closed out the decade with a tour that filled stadiums around the world, chronicled in the documentary film Rattle and Hum (1988). An album with the same name featured live performances and several new songs, including a duet with B. B. King. Impact
U2 was a cultural phenomenon in the 1980’s, selling over 20 million albums in the United States alone and raising awareness of political issues and social causes among the MTV generation. In addition to strongly influencing the popular music of the decade, U2 played a major role in Live Aid and other high-profile humanitarian events.
Further Reading
Scrimgeour, Diana. U2 Show. New York: Riverhead Books, 2004. U2 and Neil McCormick. U2 by U2. London: HarperCollins, 2006. Caroline Small and Andrew J. LaFollette See also Live Aid; Music; Music videos; MTV; Pop music; Rock and Roll Hall of Fame.
V

■ Valenzuela, Fernando
Identification Mexican American baseball player
Born November 1, 1960; Etchohuaquila, Mexico
A Mexican-born, left-handed pitcher for the Los Angeles Dodgers, Valenzuela became a star among both Spanish- and English-speaking baseball fans across the United States.

Both in their original home in Brooklyn and, since 1958, in Los Angeles, the Dodgers had been a baseball organization known for developing young players.
Los Angeles Dodger Fernando Valenzuela pitches in the 1986 All Star game. Valenzuela struck out five consecutive batters, tying an All Star game record. (AP/Wide World Photos)
At times during the twentieth century, the Rookie of the Year award seemed virtually a Dodger preserve. Thus, it was no surprise to seasoned observers of baseball when Fernando Valenzuela—a young left-hander who had been signed with the Dodgers organization by scout Mike Brito two years earlier—was promoted to the major leagues in 1980 and quickly dazzled fans as a relief pitcher. A truly startling phenomenon emerged the following year, however, when Valenzuela was inserted into the Dodgers’ starting rotation. He threw three shutouts in his first four games, an unprecedented feat. Valenzuela’s pudgy physique, his youth, and his inability to speak English contributed to his aura. Despite striking out many batters, he was not a traditional hard thrower. His most effective pitch was a screwball (a pitch that uses a reverse motion from those of curveballs and sliders); it spun away from batters, leaving them swinging at air. His mastery of this unusual pitch made Valenzuela particularly entertaining to watch. Valenzuela, a Mexican pitcher, was performing in Los Angeles, which was home to the second-largest urban population of Mexicans in the world, after Mexico City. He excited Latino fans to come to the ballpark as never before. The socioeconomic gap between Valenzuela’s home state of Sonora, in northern Mexico, and the bucolic, well-manicured Dodger Stadium was far greater than the physical distance of several hundred miles between them. The pitcher’s ability to bridge this gap made him a hero to Latinos across America. Valenzuela became a huge star among the English-speaking population of the United States as well, even though his postgame comments had to be translated for the media by Dodgers coach Manny Mota. When, in Valenzuela’s fifth start of 1981, he shut out the San Francisco Giants in a Monday evening game, his status as a legitimate sensation was confirmed, and the era of “Fernandomania” began. Two starts later, Valenzuela made his debut in America’s largest city, as the Dodgers faced the New York
Mets. Mets manager Joe Torre, hoping to succeed where other managers had failed against Valenzuela’s devastating screwball, started a number of left-handed batters, unusual in facing a left-handed pitcher. Although this strategy proved to be to no immediate avail—Valenzuela shut out the Mets, striking out eleven batters—it did demonstrate a potential avenue by which teams could approach Valenzuela. The pitcher did not continue to dominate to the extent he had in the early months of his first season. He did lead the Dodgers to the World Championship in 1981, however, providing the principal highlight in a strike-marred season. Impact Valenzuela continued as a premier pitcher until the late 1980’s, when injuries slowed him down. He ended up winning 173 games in the major leagues from 1980 to 1997, achieving the record for victories by a Mexican-born pitcher. His career inspired many Latino children, who dreamed of the success their hero had achieved playing baseball.
Further Reading
Delsohn, Steve. True Blue: The Dramatic History of the Los Angeles Dodgers. New York: Harper, 2002. Regalado, Samuel. Viva Baseball! Latin Major Leaguers and Their Special Hunger. Urbana: University of Illinois Press, 1998. Stout, Glenn, and Richard A. Johnson. The Dodgers: 120 Years of Dodgers Baseball. Boston: Houghton Mifflin, 2004. Nicholas Birns See also
Baseball; Baseball strike of 1981; Immigration to the United States; Latinos; Mexico and the United States; Sports.
■ Valley girls
Definition Pop-culture female icons representing self-centered, spoiled, wealthy, sexually promiscuous teenage girls
The 1980’s gave rise to the Valley girl icon of the dim-witted, sexy, spoiled teenage girl, which in time gained in popularity and remains a twenty-first century icon.

Although the Valley girl does not exist in reality, the image of her was established during the 1980’s as a caricature of spoiled, wealthy, usually privileged white teenage girls. Although the expression “Valley girl” originated in the 1980’s San Fernando Valley of Los Angeles, the Valley girl figure, strongly resembling a walking and talking Barbie doll, remains a cultural icon, generally depicted as a young woman with bleached-blond hair who is skinny, sexy, enormously rich, entirely self-centered, and brainless. In 1980’s magazines, the ever-evolving Valley girl image sold cosmetics and fashions to an enormous teenage market and became a popular trope in film that would last into the twenty-first century. In 1982, Frank Zappa released “Valley Girl,” a song containing typical Valley girl expressions, in an effort to satirize the trend and illustrate how the image had come to represent the dumbing down of America. His attempt backfired: The Valley girl image became even more popular. The following year, the movie Valley Girl (1983), featuring Nicolas Cage and Deborah Foreman, was well received; it portrayed the relationship of a punk teenage boy and a Valley girl in the setting of a high school prom. The stereotype resonated with audiences and established the icon.
The Stereotypical Valley Girl First on the Valley girl’s checklist of characteristics is her wealth, or rather her parents’ wealth, which is conspicuously displayed upon her, and around her. With her clique of other Valley girls in tow, she flits around the local shopping mall from store to store. The mall is her natural habitat, where she spends most of her time paying no attention to the price tags of fashionable clothes and accessories while flashing a variety of platinum credit cards. For the Valley girl, fashion is foremost; she must own and wear the latest styles and trends. “Vals,” as they came to be known, are also characterized by their desire to be the center of attention. In addition to looking good, they must display the assets that set them apart from the everyday teenage girl. Although they are notoriously poor drivers, older Valley girls have driver’s licenses and must have the proper luxury car to project the correct high-class image as they gad about the geographically sprawling Los Angeles area. In addition, the Valley girl generally has a good-looking boyfriend, typically a sports star. Getting male attention is almost as important to the Valley girl as is jealous female attention. The Valley girl does not understand the concept of “no,” or any form of self-denial. Often she is considered to be sexually “easy.” In the 1980’s, Valley girls had fancy
phones and unlisted phone numbers; today’s Valley girls carry the latest cell phones. Beyond her striking physical appearance and material possessions, the next most noticeable characteristic of the Valley girl is her lack of intelligence. Not only is she a blonde; she is a dumb blonde. Her attendance at school and sports events is admirable, but she does not grasp the concept of homework and may manipulate other, nerdier types to do it for her. Valspeak
Beginning in the 1980’s, Valley girls developed a form of dialect known as Valspeak that spread quickly around the country. It serves to emphasize their supposed lack of intelligence. Utilizing a variety of mid-sentence qualifiers such as “like” and “duh,” Valspeak is characterized by inflections that convey exaggerated emotions, from enthusiasm to disdain, such as the raising of the voice at the end of every sentence—as if each statement were a question. The vocabulary and inflections of Valspeak contributed to the idea that Valley girls were not very intelligent, suggesting that they could not articulate their limited thoughts. Similarly, short statements that stood in for sentences—“As if,” “Whatever,” “Totally,” “I’m sure,” and “Gag me with a spoon”—added to the stereotype of the Valley girl as inherently stupid.
Impact The 1980’s Los Angeles Valley girl icon spread throughout the country and became enormously popular in advertising and film. Films of the 1980’s and post-1980’s featuring the Valley girl include Fast Times at Ridgemont High (1982), Buffy the Vampire Slayer (1992), Clueless (1995), Romy and Michele’s High School Reunion (1997), She’s All That (1999), Jawbreaker (1999), Bring It On (2000), and Legally Blonde (2001). Valley girls have also become a trope in such horror films as Scream (1996), Scream 2 (1997), I Know What You Did Last Summer (1997), and Scream 3 (2000). The enormously popular Buffy the Vampire Slayer television series featuring blond Valley girl Buffy Summers aired between 1997 and 2003. In turn, the image infiltrated everyday Americans’ lives, affecting mannerisms, attitudes, fashion, and the way Americans speak. Valspeak slang and expressions survived into the twenty-first century, transforming mainstream American English, particularly among teens. To call someone a Valley girl today is to denigrate a young woman as superficial, self-centered, and prone to overspending.
Further Reading
Bernstein, Jonathan. Pretty in Pink: The Golden Age of Teenage Movies. New York: St. Martin’s Griffin, 1997. Bernstein examines the 1980’s, the Golden Age of Teenage Movies, particularly middle- and upper-middle-class teenagers, whose great concern with personal appearance and popularity gave rise to the Valley girl image. Blyth, Carl, Sigrid Recktenwald, and Jenny Wang. “I’m Like, ‘Say What?!’ A New Quotative in American Oral Narrative.” American Speech 65 (Autumn, 1990): 215-227. Scholarly but approachable article that discusses how Valley girl speech patterns, especially “like,” have entered popular American speech. Clover, Carol J. Men, Women, and Chainsaws: Gender in the Modern Horror Film. Princeton, N.J.: Princeton University Press, 1992. Illustrates how the image of the 1980’s Valley girl has become a lasting trope in American horror film. Douglas, Susan. “Valley Girl Feminism: New Feminist Magazine Jane Does Not Compare to Ms. Magazine.” The Progressive 61 (November, 1997): 17. Douglas recalls her first issue of Ms. magazine and considers how times have changed in her quest to find the premier issue of the latest feminist magazine for “uppity” Valley-girl women. M. Casey Diana See also Advertising; Cell phones; Closing of the American Mind, The; Consumerism; Fads; Fashions and clothing; Fast Times at Ridgemont High; Feminism; Film in the United States; Horror films; Preppies; Slang and slogans; Teen films.
■ Van Halen
Identification American hard rock band
Date Formed in 1974
One of the most popular bands of the 1980’s, Van Halen brought fresh energy to the rock genre and influenced several bands of the decade.

Van Halen is a hard rock band from Pasadena, California. Its original members were Eddie Van Halen (lead guitarist), David Lee Roth (lead vocalist), Michael Anthony (bass guitarist), and Alex Van Halen (drummer). After forming in 1974, the band created its unique style of rock music, marked by Eddie Van Halen’s virtuoso guitar playing.
Van Halen in 1986. From left: Michael Anthony, Sammy Hagar, Eddie Van Halen, and Alex Van Halen. (Paul Natkin)
The group’s live shows featured long guitar solos by Eddie, along with wild stage antics by the flamboyant Roth. In 1978, the group released its debut album, Van Halen, to immediate popular and critical success. Notable songs on that album included “Ain’t Talkin’ ’Bout Love”; “Eruption,” which included a popular and highly influential guitar solo; and “Runnin’ with the Devil.” Van Halen II was released the following year and included the hit song “Dance the Night Away.” The band’s third album, Women and Children First, released in 1980, continued the band’s established pattern of high-energy rock, with Eddie Van Halen’s guitar providing much of the sound, and included the single “And the Cradle Will Rock.” The band’s fourth album, Fair Warning (1981), also sold numerous copies, but its songs did not receive as much airplay as had others by Van Halen. The group’s 1982 album, Diver Down, is best known for two cover songs: “Dancing in the Street,” first recorded by Martha and the Vandellas, and “Oh, Pretty Woman,” cowritten by Roy Orbison. On December 31, 1983, the band released 1984. This album revealed a slight change in the band’s musical style: The hit single “Jump” included substantial use of synthesizers. Though most of the songs on the album still featured guitar, they did indicate
the band’s willingness to follow a trend in popular music by utilizing more keyboards. This change, however, caused tension within the group. Following the tour for this album, Roth left the band. Van Halen ushered in a new era by acquiring Sammy Hagar as its new lead vocalist in 1986. Hagar had already established a solo career in music, playing guitar and singing lead vocals. The band’s first release with Hagar was 5150 (1986), produced by Mick Jones of the rock band Foreigner. Despite the change in membership, the album sold well and continued the style of 1984. The band’s final album of the decade was OU812 (1988), which included the hit song “Finish What You Started.”
Impact Van Halen created a unique sound within the rock world that influenced the popular “hair bands” of the late 1980’s. The band succeeded in matching the high-energy sound on its studio works with its stage performances and helped to reinvigorate the rock genre. Further Reading
Bogdanov, Vladimir, et al., eds. All Music Guide to Rock: The Definitive Guide to Rock, Pop, and Soul. 3d ed. San Francisco: Backbeat Books, 2002. Levy, Joe, et al., eds. The 500 Greatest Albums of All Time. New York: Wenner Books, 2005. Kevin L. Brennan See also Bon Jovi; Guns n’ Roses; Heavy metal; Music; Osbourne, Ozzy; Pop music.
■ Vancouver Expo ’86
The Event International exposition
Date May 2 to October 13, 1986
Place Vancouver, British Columbia
Vancouver Expo ’86 coincided with the centennial both of the city and of the arrival on the Pacific coast of the first passenger train. It was the second time that Canada had held a
world’s fair in the period after World War II, and the fair benefited from a terrorism scare in Europe that kept many potential travelers within North America’s borders. The 1986 World Exposition on Transportation and Communications (known as Vancouver Expo ’86) was a world’s fair sanctioned by the Bureau of International Expositions (BIE) and held in Vancouver, British Columbia, from May 2 through October 13, 1986. The fair, whose theme was “Transportation and Communication: World in Motion, World in Touch,” was the first Canadian world’s fair since Expo ’67. The latter fair, held in Montreal during the Canadian centennial, was one of the most successful world’s fairs in history, attracting some 50 million people at a time when Canada’s population was only 20 million. Expo ’86 was categorized by the BIE as a “class 2, special category fair,” reflecting its specific emphases on transportation and communications. The government of Canada contributed $9.8 million to the exposition’s cultural projects, including $5.8 million for the program at the Canada Pavilion, $2 million to enable Canadian artists to tour other centers en route to or from the exposition, $1.5 million for Canadian participation in the World Festival, and $500,000 to fund cultural projects for Vancouver’s centennial celebrations. The exposition was opened by England’s Prince Charles and Princess Diana and Canadian prime minister Brian Mulroney on May 2, 1986. It featured pavilions from fifty-four nations and numerous corporations. Expo ’86’s participants were given the opportunity to design their own pavilions or to opt for less expensive standardized modules. Each module was approximately two and one-half stories high and had floor space equal to one-third of a city block. The design was such that any number of the square modules could be placed together in a variety of shapes. The roof design allowed the interior exhibit space to be uninterrupted by pillars. Expo ’86 was held on the north shore of False Creek, along Vancouver’s inner-city waterway. The seventy-hectare site featured over eighty pavilions and many indoor and outdoor performance venues. Canada’s pavilion was located on a pier not contiguous with the rest of the site. To reach the pavilion, visitors would take Vancouver’s newly opened SkyTrain rapid rail system. After the exposition, the pier became Canada Place, one of Vancouver’s most recog-
nizable landmarks. Other Canadian host pavilions included Canadian provincial and territorial pavilions for Alberta, British Columbia, Nova Scotia, Ontario, Prince Edward Island, Quebec, Saskatchewan, Yukon, and the Northwest Territories. Canadian Pacific’s main feature was a film, Rainbow War, while Telecom Canada presented a Circle-Vision 360 movie, Portraits of Canada-Images du Canada. A geodesic dome, known as Expo Centre, represented a style of architecture first seen in the U.S. pavilion at Montreal in 1967. At Expo ’86, the U.S. pavilion was devoted to space exploration in the wake of the Challenger space shuttle disaster, and the displays from the Soviet Union were colored by the Chernobyl nuclear power plant explosion. These dueling pavilions represented one of the last faceoffs between the two superpowers before the end of the Cold War only three years later. Corporate and nongovernmental-organization (NGO) pavilions included those representing Air Canada, the local BCTV television station, Canadian National, and General Motors—which had one of the more popular exhibits, “Spirit Lodge,” a live show augmented with holographic and other special effects. Impact In all, 22 million people attended Expo ’86, and, despite a deficit of 311 million Canadian dollars, it was considered a tremendous success. The event was later viewed as a transitional moment for Vancouver, which transformed from a sleepy provincial backwater to a city with some global clout. In particular, the exposition marked a strong boost to tourism for the province. It was also the last twentieth century world’s fair to take place in North America. Further Reading
Anderson, Robert, and Eleanor Wachtel, eds. The Expo Story. Madeira Park, B.C.: Harbour, 1986. Findling, John E., and Kimberly Pelle, eds. Historical Dictionary of World’s Fairs and Expositions, 1851-1988. New York: Greenwood Press, 1990. Kahn, E. J. “Letter from Vancouver.” The New Yorker, July 14, 1986, 73-81. Martin J. Manning See also
■ Vangelis
Identification Greek composer and keyboardist
Born March 29, 1943; Volos, Greece
A prolific, accomplished keyboard composer in both classical and electronic jazz, Vangelis brought to film scores in the 1980’s an opulent feel that lent emotional weight to the theatrical experience. Recordings of his scores also stood on their own, enjoying wide commercial success.

Vangelis was born Evangelos Odysseas Papathanassiou in Greece. As a composer, he spent more than twenty years compiling a distinguished repertoire of distinctive keyboard works that reflected his era’s experimentation with lush electronic sounds and the studio construction of massive sonic effects around a single, often plaintive melody line, participating in a movement that would be dubbed New Age music. He first achieved international success with his score for Hugh Hudson’s 1981 film Chariots of Fire, the inspirational story of two British runners at the 1924 Paris Olympics. Forsaking the traditional expectation that the score of a period film should reflect the musical tastes and styles of its setting, Vangelis crafted a pulsing, electronic score that itself contained a kind of heroic narrative, swelling to emotional, even inspirational peaks. The film’s main theme, released as a single in 1982, enjoyed rare international commercial success for an instrumental composition, including a week as the number one single on the Billboard Hot 100 chart. It went on to become a staple among sports anthems—as well as the subject of countless parodies. The film’s full score won an Academy Award. That same year, Vangelis was approached to provide the score for Ridley Scott’s dystopian futuristic film Blade Runner (1982). The score captured the anxious, isolated feel of the stylish science-fiction thriller. Artistic differences between director and composer led to entangling legal actions that prevented an official recording of the music as Vangelis scored it from accompanying the film’s initial release. However, when the film was later re-released in a “director’s cut,” an official album of Vangelis’s original score finally appeared. In the interim, the score was distributed in bootleg recordings, gaining a cult following similar to that of Blade Runner itself. The reclusive Vangelis continued his prodigious output, composing dozens of New Age recordings, as well as pieces for the stage (particularly ballet), and scoring films, especially epics. Ironically, for a
composer who conceived of music as a purely aesthetic form unto itself, his soaring and hummable musical themes were most often recognized because they had been appropriated by successful marketing campaigns for commodities, events, and television programs. Impact Vangelis’s work demonstrated a deliberate disregard for the conventions of film scoring that dominated the 1980’s. Those conventions were driven by the rise in tandem of music videos and aggressive cross-marketing campaigns by film studios attempting to exploit a plurality of merchandising opportunities for each film. Thus, film scores of the 1980’s were often little more than catalogs of commercial hits with Top 40 potential linked by nondescript background music, as directors received mandates to include montage sequences in their films that could enable hit singles to be played in their entirety. Vangelis, however, brought to his film scores a sense of classical elevation and unity. Aided by his uncanny ear for unforgettable themes delivered by the stirring vibrato of his signature synthesizer, his music provided a rich emotional underscoring to the events within a film’s narrative and thus became an integral part of the film’s aesthetic impact. Further Reading
Boundas, Constantin. Film’s Musical Moments. Edinburgh: Edinburgh University Press, 2006. Calotychos, Vangelis. Modern Greece: A Cultural Poetics. Oxford, England: Berg, 2003. Summer, Lisa, and Joseph Summer. Music: The New Age Elixir. Amherst, N.Y.: Prometheus, 1996. Joseph Dewey See also Academy Awards; Advertising; Blade Runner; Classical music; Jazz; Music; Music videos; Pop music; Synthesizers.
■ Video games and arcades
Definition Electronic games played by manipulating images on a video display and public centers devoted to playing them
Video games began to become a significant aspect of popular culture during the 1980’s, as they were mass marketed in both home and coin-operated versions. By the end of the decade, video game characters such as Pac-Man and Mario
were as well known to children as were cartoon characters, and indeed, several such characters made the transition from games to cartoon series. In video games, a player is usually rewarded for continued success at the game through such mechanisms as points, additional lives, and level advancements. Video games of the 1980’s could be played on a variety of platforms, including consoles (which were attached to televisions), handheld devices, personal computers, and dedicated, coin-operated game machines. Home computing technology during the decade was not yet sophisticated enough to emulate the graphics of coin-operated machines, however, so the most graphically advanced games were generally those available in arcades.
History In 1980, Atari released its home version of the arcade video game Space Invaders as a cartridge for the Atari 2600 game console. The cartridge allowed fans of the game to play it for a one-time fee and in the privacy of their own homes. The same year, Mattel released its competing Intellivision game console. Consumers flocked to buy these and similar systems. Coin-operated arcade game technology did not cease in the meantime—Pac-Man, the most popular arcade game of all time, was released in 1980 as well. Advances in video game technology would continue at a rapid pace with the release of Donkey Kong and Tempest in 1981. In 1982, Coleco released the ColecoVision game console, which—with forty-eight kilobytes of random-access memory (RAM) and an eight-bit processor—was the most powerful home system available. Manufacturers flocked increasingly to the thriving home video game market, with predictable results. The American market for home game systems became oversupplied, and it crashed in 1983. The crash drove out of business many of the third-party game manufacturers (that is, independent manufacturers of game cartridges to be played on other companies’ consoles). The newly burgeoning video game industry collapsed and was stagnant for the next several years, with manufacturers afraid that any attempts to innovate would be met with further losses. In 1985, Nintendo gingerly test-marketed its Nintendo Entertainment System (NES) in the United States and found that its limited release was a rousing success. In the same year, Russian game writer Alex Pajitnov released one of the most popular puzzle games of all time, Tetris, leading more and more people to the world of casual video games. Within the next few years, demand for console systems grew, and newly emboldened Nintendo, Sega, and Atari emerged as video game industry leaders, vying for market position. Nintendo was responsible for such games as Super Mario Brothers and The Legend of Zelda, as well as the translation to its console platform of many coin-operated games by the popular and innovative company Namco, such as Dig Dug and Galaga. In 1989, Nintendo released the Game Boy, a handheld video game console.

Kids play coin-operated video games in an arcade in New York City in December, 1981. (AP/Wide World Photos)

Impact The video game industry began to come into its own during the 1980’s, first as a source of
coin-operated arcade games and then as a source of home computing technology—both general and dedicated solely to gaming. By the end of the decade, console games were a common platform, and almost every personal computer in the world had at least one game on it. Movies such as War Games and Tron reflected some of the fears and hopes for these games in the 1980’s, from nuclear war to the ability of an artificial intelligence to intervene in such a scenario. At the same time, games drove many advances in technology, particularly in the areas of video capability and processor speed. Much of the drive to improve home computers and bring them down in price was driven by the consumer demands of gamers and the possibilities glimpsed by game developers. Demand for Internet connectivity was also later shaped by its implications for gaming. As video games became more sophisticated, they began to participate in popular culture as a source of both entertainment and narrative equal to film and television. Indeed, by the early twenty-first century, the game industry would earn more revenue annually than the film industry in the United States.
Further Reading
Beck, John C. Got Game: How the Gamer Generation Is Reshaping Business Forever. Boston: Harvard Business School Press, 2004. Discusses video game-facilitated knowledge acquisition and argues that such knowledge exerts a transformative force on the workplace. Chaplin, Heather, and Aaron Ruby. Smartbomb: The Quest for Art, Entertainment, and Big Bucks in the Videogame Revolution. New York: Algonquin Books, 2005. Discusses the corporations behind the various games, as well as the more prominent players, looking at tournaments and expositions for much of the narrative. Gee, James Paul. What Video Games Have to Teach Us About Learning and Literacy. New York: Palgrave Macmillan, 2004. Demonstrates the intersections of video game theory and educational theory without oversimplifying either field. Johnson, Steve. Everything Bad Is Good for You. New York: Riverhead Books, 2005. Tests the theory that influences such as video games are bad for the mind and argues that such games require more cognitive work than watching television. Kent, Steven L. The Ultimate History of Video Games: From “Pong” to “Pokemon”—The Story Behind the Craze That Touched Our Lives and Changed the World. New York: Three Rivers Press, 2001. Provides a thorough and interesting overview of the history of video games, including interviews with many of the industry’s most important figures. Poole, Steven. Trigger Happy: Video Games and the Entertainment Revolution. New York: Arcade, 2004. Includes an exhaustive history of the game industry, along with an analysis of types of games and their varying appeals. Prensky, Mark. Don’t Bother Me Mom—I’m Learning! How Computer and Video Games Are Preparing Your Kids for Twenty-First Century Success and How You Can Help! St. Paul, Minn.: Paragon House, 2006. Discusses the multitude of skills that children can acquire through playing video games. Takahashi, Dean. Opening the Xbox. San Francisco: Prima Lifestyles, 2002. Provides insightful analysis of the video game industry, focusing on Microsoft and the company’s entrance into the industry with the Xbox console. Cat Rambo See also
Apple Computer; Computers; Hobbies and recreation; Information age; Inventions; Pac-Man; Science and technology; Toys and games; Virtual reality.
■ Vietnam Veterans Memorial
Identification U.S. war monument
Creators Envisioned by Jan Scruggs and designed by Maya Ying Lin
Date Built in 1982
Place National Mall, Washington, D.C.
Millions of people from around the globe would visit the black granite memorial etched with the names of more than fifty-eight thousand American military personnel who lost their lives in the Vietnam War. The painful wounds inflicted on the United States by its longest war were still fresh in 1979, when Vietnam veteran Jan Scruggs and his wife, after viewing the movie The Deer Hunter (1979), decided to launch an effort to honor Scruggs’s fallen comrades. Only four years had passed since the collapse of South Vietnam and the fall of Saigon. Scruggs once said of his own service in that divisive conflict:
The Vietnam Veterans Memorial in Washington, D.C.
The bitterness I feel when I remember carrying the lifeless bodies of close friends through the mire of Vietnam will probably never subside. I still wonder if anything can be found to bring any purpose to all the suffering and death. The Memorial The Vietnam Veterans Memorial was conceived by Scruggs and fellow veterans to serve as a permanent tribute to the U.S. dead and as a means for the country to reflect on the war in all its dimensions. Scruggs founded an organization, the Vietnam Veterans Memorial Fund, but it got off to a shaky start, initially raising only $144.50 and becoming a subject of ridicule, even from mainstream media. Undeterred, Scruggs enlisted the support of national leaders such as U.S. senator John Warner of Virginia, who donated $5,000 of his own money and helped raise $50,000 more. Donations, large and
small, began to pour in from 275,000 people, and the memorial fund ballooned to $8.4 million. Private money would fulfill Scruggs’s dream of explaining the conflict to many who were not personally involved in Southeast Asia. Scruggs lobbied Congress for a suitably prominent location on the National Mall for a memorial that would serve as a site of healing and reflection, as well as become a tangible tribute to all who were touched by the conflict. Two acres near the Lincoln Memorial were reserved for the monument, and on July 1, 1980, President Jimmy Carter signed legislation authorizing that location for the construction of the Vietnam Veterans Memorial. For the next two years, Scruggs and his organization monitored the design and construction of the memorial. A national design competition, judged by a panel of architects and artists, commenced after the proj-
ect received its presidential approval. Bob Doubek, Scruggs’s fellow Vietnam veteran and a member of the founding organization, explained, “The hope is that the creation of the memorial will begin a healing process.” Some 1,421 entries were submitted, and the competition had four criteria: The design must be reflective and contemplative, it must be harmonious with the site, it must be inscribed with the names of the dead and the missing, and it must make no political statement about war. The panel of experts reviewed the submissions and, after four days of careful deliberations, unanimously chose the design offered by a Chinese American Yale University undergraduate architecture student, Maya Ying Lin. Lin was only twenty-one years of age and had led a life untouched by death. Her entry had been submitted as a course requirement. She saw her challenge as enormous, but she methodically set out to create a memorial that was faithful to the competition’s original guidelines. Visiting the site, Lin commented, I thought about what death is, what a loss is. A sharp pain that lessens with time, but can never quite heal over. The idea occurred to me there on the site. I had an impulse to cut open the earth. The grass would grow back, but the cut would remain.
Inspired, Lin returned to Yale and placed the finishing touches on the design, completing it in only three weeks. Public reaction to Lin’s design was mixed. Race interjected itself into the discussion because of Lin’s ethnicity. Some veterans likened the black granite memorial to an ugly scar. Others, however, applauded the memorial, with its simple listing of the dead and missing, row after row. The wall’s construction phase continued from 1981 to 1982. The memorial was built into the earth, below ground level, with two panels arranged as giant arms pointing to either the Washington Monument or the Lincoln Memorial. On these black granite panels were etched the names of more than fifty-eight thousand men and women, some of whom remained missing. The first casualty had occurred in 1956 and the last had taken place in 1975. The names were ordered chronologically, so at first—in the section corresponding to the war’s early years—only a few appeared. As a visitor walked farther into the memorial, longer and longer lists of the dead would accumulate. Such visitors, as Lin envisioned, would walk toward the monument’s vertex, the center
where the two arms meet in a warm embrace. There, they would search for the names of friends, relatives, and unknown heroes of America’s longest war. From the first day, the memorial drew people bearing gifts for the dead and paper upon which to trace names of the fallen. Impact The memorial, dedicated on Veterans Day in 1982 by President Ronald Reagan, put a human face on a conflict that brought pain to so many people. The inscription on the memorial’s plaque proudly honors “the courage, sacrifice and devotion to duty and country of its Vietnam veterans.” Further Reading
Karnow, Stanley. Vietnam: A History. New York: Harper & Row, 1983. History of the war that includes mention of the memorial and its function in postwar healing. Lee, J. Edward, and H. C. “Toby” Haynsworth. Nixon, Ford, and the Abandonment of South Vietnam. Jefferson, N.C.: McFarland, 2002. History that focuses on the failure of civilian leadership to bring the war to a successful conclusion. Library of Congress, U.S. American Treasures of the Library of Congress: Vietnam Veterans Memorial. http://www.loc.gov/exhibits/treasures/trm022.html. Official government Web site that documents the monument’s construction and meaning. Palmer, Laura. Shrapnel in the Heart: Letters and Remembrances from the Vietnam Veterans Memorial. New York: Vintage Books, 1987. Combines transcripts of messages left at the site of the memorial with interviews with those who left them there. Vietnam Veterans Memorial: Official Park Guide. Washington, D.C.: National Park Service, U.S. Department of the Interior, 1995. The official guide of the National Park Service, whose job it is to oversee the Vietnam Veterans Memorial. Wagner-Pacifici, Robin, and Barry Schwartz. “The Vietnam Veterans Memorial: Commemorating a Difficult Past.” The American Journal of Sociology 97 (1991): 376-420. Examines the psychological and sociological effects and implications of the memorial. Joseph Edward Lee See also Architecture; Asian Americans; Boat people; Cold War; Foreign policy of the United States; Platoon; Reagan, Ronald.
■ Virtual reality
Definition Computer-generated simulation of experience designed to mimic actual experience
In the late 1980’s, the term “virtual reality” was popularized by Jaron Lanier, founder of VPL Research, which built many of the early virtual reality goggles and gloves. Virtual reality, also known as artificial reality, is a technology designed to enable users to interact with fabricated environments in ways that resemble the ways in which they interact with real environments. This interaction may be limited to one dimension, such as movement, or it may seek to replicate the entire experience of being in the world. In the 1980’s, nascent virtual reality technologies appeared, and users interacted with these three-dimensional, computer-simulated environments by means of special goggles or gloves that provided the experience of directly manipulating the computer world. While the worlds were usually depicted only visually, some environments went so far as to have auditory or tactile components. While virtual reality technology was fairly primitive and quite rare during the decade, fictional representations of virtual reality were more common. The concept of virtual reality captured the imaginations of many, as it seemed to be the ideal form of representation, a form that would eventually replace film, literature, and most other media. However, it was generally the dangers of simulated reality, rather than its benefits, that most fascinated science-fiction authors and filmmakers. Impact Virtual reality became a mainstay of science fiction, where it was quickly juxtaposed with the related concept of cyberspace. Cyberpunk authors thus led the vanguard of portrayals of the possibilities and hazards of the nascent technology, as did the television program Star Trek: The Next Generation, which featured a virtual reality system called a “holodeck” in several of its episodes. Later, as technology progressed, more real-world applications for virtual reality technology would emerge, including military, medical, therapeutic, and architectural uses. Further Reading
Burdea, Grigore C., and P. Coiffet. Virtual Reality Technology. Hoboken, N.J.: John Wiley & Sons, 2003. Kalawsky, R. S. The Science of Virtual Reality and Virtual
Environments: A Technical, Scientific, and Engineering Reference on Virtual Environments. New York: Addison-Wesley, 1993. Krueger, Myron W. Artificial Reality. New York: Addison-Wesley, 1991. Markley, Robert. Virtual Realities and Their Discontents. Baltimore: Johns Hopkins University Press, 1996. Sherman, William C., and Alan Craig. Understanding Virtual Reality: Interface, Application, and Design. San Francisco: Morgan Kaufmann, 2003. Teixeira, Kevin, and Ken Pimentel. Virtual Reality: Through the New Looking Glass. 2d ed. New York: Intel/McGraw-Hill, 1995. Cat Rambo See also CAD/CAM technology; Computers; Cyberpunk literature; Gibson, William; Inventions; Science and technology; Star Trek: The Next Generation; Tron; Video games and arcades.
■ Voicemail
Definition Centralized telecommunications technology in which spoken messages are recorded for later retrieval by the recipient
The popularity of voicemail in the 1980’s sped up business and family life. Prior to the invention of voicemail, phone users employed answering machines, which were cumbersome. Voicemail originated in 1975 with Steven J. Boies of International Business Machines (IBM), and the concept caught on in the early 1980’s when it was commercialized by Octel Communications. In the late 1980’s, after the American Telephone and Telegraph (AT&T) breakup, Scott Jones of Boston Technology found it possible to make the system more accessible to everyone. Voicemail added a large number of features that answering machine systems lacked. In the corporate realm, it allowed each member of a business to have a separate storage for incoming messages. Eventually, companies were able to centralize their voicemail work on one system. Voicemail was easy to use: Messages could be left even if the recipient was on another call, and users could hear instructions on the phone about how to use it. Each employee was assigned a mailbox, and a person could record a personal greeting for callers.
With the introduction of voicemail, business people could make many more calls without having to rely on other staff members, thereby saving companies money and cutting down on lost time and missed messages. Educators could reach students’ parents without getting a busy signal. The recipient could store messages, play them back remotely and at any time, or forward them to another location. The downside to voicemail was that the recipient might have to spend a lot of time listening to calls. At home, people received unwanted sales calls on their voicemail. For people who wanted to speak with each other live, a lot of time might be spent playing “phone tag” with each other before they actually connected. People who called businesses often were automatically redirected without being able to speak to a customer service representative. Impact By the early twenty-first century, voicemail had become a ubiquitous feature for cell phone users and many businesses. People found voicemail to be both a blessing and a hindrance. While it liberated them from many of the hassles associated with earlier phone systems, it required people to be always “on call”—always able to be contacted. Further Reading
Bates, Regis J., with Donald W. Gregory. Voice and Data Communications Handbook. New York: McGraw-Hill, 2006. LeBon, Paul. Escape from Voicemail Hell: Boost Your Productivity by Making Voicemail Work for You. Highland Village, Tex.: Parleau, 1999. Jan Hall See also AT&T breakup; Cell phones; Computers; Fax machines; Globalization; Inventions; Science and technology.
■ Voyager global flight
The Event An airplane circles the earth in nonstop flight
Date December 14-23, 1986
Place Left from and returned to Edwards Air Force Base, Mojave Desert, California

Two U.S. pilots circumnavigated the world in an innovative composite-material aircraft that transported sufficient fuel to enable them to complete the trip without stopping or
refueling in transit. Their achievement inspired aerospace design.

During the early twentieth century, several airplane pilots successfully circumnavigated the earth, setting various records, as advancements in aviation technology lengthened the possible distance and duration of flight. Aviators were inspired by a 1962 B-52 Stratofortress nonstop flight covering 12,519 miles between Okinawa, Japan, and Madrid, Spain, without refueling. Their next goal was a nonstop circumnavigation of the globe requiring no supplementary fueling. In 1981, brothers Burt Rutan, an aircraft designer, and Richard Rutan, a former U.S. Air Force pilot, and their colleague Jeana Yeager envisioned an airplane capable of transporting sufficient fuel to sustain an Earth-circling nonstop flight. The trio established Voyager Aircraft and utilized Burt Rutan’s experiences with composites and canard wings to design a light aircraft capable of lifting large amounts of fuel. For approximately eighteen months, they assembled the Voyager from layers of carbon fibers, polymers, and epoxy molded into various components and heated to strengthen the aircraft, which resembled the letter “W” with a 110-foot bendable wing bisecting it. The Voyager’s small cabin, seventeen fuel tanks, canard, and wing totaled 939 pounds. Two engines powered Voyager. A Federal Aviation Administration (FAA) inspector approved Voyager for flight. The Voyager team pushed to attain the record before rival aviators achieved that goal. Starting on June 22, 1984, Richard Rutan and Yeager began test flights in the Voyager, discovering technical issues needing repair, as well as optimal flying strategies to respond to various flight conditions. Inside the narrow Voyager’s cabin, one pilot sat, while the other reclined. During sixty-seven flight tests, they achieved records, including an 11,857-mile flight in July, 1986, between San Luis Obispo, California, and San Francisco, California, with no refueling. This flight resulted in the Voyager project receiving some funds, but corporate financial support—which they had hoped to secure—remained an elusive goal. The Rutans and Yeager invested an estimated two million dollars in Voyager. Around the World Rutan and Yeager prepared to fly in early December, 1986, but rain disrupted their plan. After skies cleared, they transferred the Voyager to Edwards Air Force Base on December 13.
The Voyager aircraft returns from its record-breaking nonstop trip around the world in 1986. (NASA)
The next morning, National Aeronautic Association representative Richard Hansen placed seals and other devices on the Voyager to detect any refueling or stops. Piloting the Voyager first, Rutan departed at 8:01 a.m. and flew west, attaining an altitude of fifty-eight hundred feet. Burt Rutan initially followed in another plane to monitor the Voyager, and ground support personnel maintained radio contact. Flying over the Pacific Ocean, Rutan and Yeager navigated with radar and a Global Positioning System (GPS) to maneuver between storms and turbulent areas. They passed Hawaii and reached the South Pacific on the second day of flight. Rutan avoided flying into Typhoon Marge but benefited from its winds to speed toward the Philippine Islands. On the third day, exhausted after flying since the beginning of the voyage, Rutan slept while Yeager piloted the Voyager through Southeast Asia. On the fourth day, the pilots stayed near the Inter-
tropical Convergence Zone (ITCZ), risking storms to benefit from winds to push Voyager along its flight path over the Indian Ocean. As it neared Africa the next day, the Voyager set the record for the greatest flight distance reached without refueling. The pilots crossed mountainous hazards by using the front engine to lift over tall peaks. During day six, the Voyager pilots turned off the front engine after clearing African mountains and reaching coastal landmarks. On the following day, they completed crossing the Atlantic Ocean to Brazil. Yeager resumed piloting while Rutan slept, guiding Voyager toward the Pacific Ocean. By day eight, the pilots headed north. They experienced fuel pump problems, and the rear engine stopped. As the Voyager began to descend over the ocean, the pilots attempted to turn the front engine on. It started when the aircraft was only thirty-five hundred feet above ground. The Voyager’s movement caused fuel
to reach the rear engine, which also started, and the pilots decided to continue toward Edwards Air Force Base with both engines operating. On December 23, 1986, the Voyager reached Edwards at around 7:30 a.m. A crowd of approximately twenty-three thousand people watched as Rutan flew around the airfield while Yeager lowered the landing gear. The Voyager landed at 8:05 a.m. Hansen verified that his seals were still in place and confirmed the world record flight, which had covered 25,012 miles during nine days, three minutes, and forty-four seconds aloft. Impact International news reporters covered the Voyager’s global flight, emphasizing the Rutans’ and Yeager’s achievement as an aviation milestone. On December 29, 1986, Yeager and both Rutan brothers accepted the Presidential Citizens Medal from President Ronald Reagan. They also received aviation’s prestigious Robert Collier Trophy. Pilots Rutan and Yeager discussed their flight at a February 3, 1987, hearing of the U.S. House of Representatives Committee on Science, Space, and Technology, encouraging expanded research and application of composites to military and commercial aircraft. During the summer of 1987, the Voyager was transported for inclusion in the National Air and Space Museum in Washington, D.C. The Voyager represented the emerging field of aerospace development without government support, foreshadowing later private space exploration that would not be controlled by governmental bureaucracy and restrictions. The Rutans and Yeager retained interest in designing experimental composite
posite aircraft for the remainder of the 1980’s. Their Voyager experiences influenced designs by Burt Rutan’s company, Scaled Composites. Further Reading
Fink, Donald E. “Salute to Voyager.” Aviation Week and Space Technology 126, no. 1 (January 5, 1987): 13. Editorial examining the Voyager’s possible influence on governmental and civilian aerospace; the issue as a whole provides thorough coverage of the flight.
Marbach, William D., and Peter McAlevey. “Up, Up, and Around.” Newsweek 108, no. 26 (December 29, 1986): 34-36, 41-44. Account supplemented with maps, diagrams, and information profiling significant aviation records.
Mordoff, Keith F. “Voyager Crew Faces Turbulence, Fatigue on World Flight Attempt.” Aviation Week and Space Technology 125, no. 25 (December 22, 1986): 18-21. Describes Voyager’s original flight plan, preparations, and conditions en route.
Schatzberg, Eric. Wings of Wood, Wings of Metal: Culture and Technical Choice in Airplane Materials, 1914-1945. Princeton, N.J.: Princeton University Press, 1999. Considers the Voyager’s composite materials in context with predecessors using similar strategies.
Yeager, Jeana, and Dick Rutan, with Phil Patton. Voyager. New York: Alfred A. Knopf, 1987. Comprehensive pilots’ account discussing all aspects of Voyager. Includes unique photographs.
Elizabeth D. Schafer

See also Gimli Glider; Inventions; Science and technology.
W

■ Wall Street
Identification American film
Director Oliver Stone (1946)
Date Released December 11, 1987
Considered by some to be a study, or revelation, of social Darwinism at its worst, Wall Street offered a compelling portrait of a man, Gordon Gekko, for whom financial profit is the ultimate aim of human endeavor and the pursuit of financial success the only religion worthy of the name. The film creates such a vivid portrait of a man for whom greed is both aphrodisiac and consummation that the movie’s moral seems to pale by comparison. Wall Street attracted attention from movie critics because of its character Gordon Gekko’s unapologetic defense of greed as a positive motivating force in American life. Michael Douglas (who won an Academy Award for his performance) created such a powerful portrait of the financial wizard that it was difficult for viewers not to be taken in by his argument that winning is the only goal worth pursuing and that obeying the law is only for the timid. Gekko is ultimately undone in the film by his protégé, Bud Fox, but the film provides no assurance that the free market system will change as a result. Other Gekkos may easily arise to fill the vacuum left by his downfall. Director Oliver Stone used Gekko to give voice to what he saw as the collective values corrupting American culture in the 1980’s. Gekko’s famous statement “Greed is good” would at many times in history have been taken as simply wrong if not insane. During the decade that gave rise to the nickname the Me generation, however, “Greed is good” resonated with a great many Americans, either as a slogan to embrace or as a distillation of the mistake at the heart of the nation’s values. Impact Wall Street paints a startling portrait of an unprincipled man who manipulates people and uses illegal means to gain money and power. One implication of the film was that a free economy is vulnerable to the abuses of people like Gordon Gekko and
that only the goodness of people like the ambitious Bud Fox and his father can protect it. This implication seemed to apply particularly well to the 1980’s, a decade during which extreme wealth was being created on Wall Street, but real wages did not increase. Wall Street was taken, both at the time and later, as distilling the economic culture of the decade and giving it a villainous but frighteningly compelling face. Further Reading
Boozer, Jack, Jr. “Wall Street: The Commodification of Perception.” The Journal of Popular Film and Television 17, no. 3 (Fall, 1989): 90-99.
Kunz, Don. The Films of Oliver Stone. Lanham, Md.: Scarecrow Press, 1997.
Simon, John. “Wall Street.” National Review 40, no. 1 (January 22, 1988): 65.
Bernard E. Morris

See also Academy Awards; Black Monday stock market crash; Business and the economy in the United States; Crime; Douglas, Michael; Film in the United States; Hannah, Daryl; Power dressing; Reaganomics; Stone, Oliver; Yuppies.
■ Washington, Harold
Identification Mayor of Chicago, 1983-1987
Born April 22, 1922; Chicago, Illinois
Died November 25, 1987; Chicago, Illinois
Washington made local and national history as Chicago’s first African American mayor. His victory demonstrated the power of unified African American voters and challenged decades of white domination of Democratic Party politics in Chicago. Harold Washington was a thirty-year veteran of Chicago politics when he entered the city’s 1983 mayoral race. An attorney by profession, Washington served in the Illinois statehouse from 1964 to 1976 and the state senate from 1976 to 1980. He ran un-
successfully for mayor of Chicago in 1977 following the death of Richard J. Daley, whose political machine controlled the city for decades. Washington was committed to Daley early in his career but later earned a reputation for independence. The socioeconomic interests of African Americans were central to Washington’s agenda, and he built a power base among voters neglected by Chicago’s political establishment. As a U.S. representative from the first congressional district from 1980 to 1983, Washington opposed President Ronald Reagan’s policies before leaving Congress reluctantly to mount a second mayoral bid. Despite his achievements and experience, Washington was an underdog in the Democratic primary against incumbent Jane Byrne and Richard M. Daley, the late mayor’s son. Washington’s opponents raised ethical questions about his law practice and a short jail term that he had served for income tax evasion in 1971. Washington was a charismatic campaigner, however, and benefited from high African American voter registration and dissatisfaction with Byrne. After a difficult primary victory, Washington endured a general election filled with attacks on him and his supporters. White voters who had been lifelong Democrats rejected Washington and bluntly expressed racial motivations for voting for Republican Bernard Epton. Washington took office in April, 1983, after receiving almost 100 percent of Chicago’s African American votes and barely 12 percent of its white votes. His agenda met fierce resistance from members of Washington’s own party. The “Council Wars” of 1983-1986 pitted Washington against a mostly white group of Democrats who held the majority of seats on Chicago’s city council and created a virtual stalemate until late in his first term. Washington’s progress in bringing diversity to his city’s government and improving public transportation and conditions in urban neighborhoods carried him to a second term in 1987, but he died after serving just seven months. Impact
Harold Washington’s struggles and successes in Chicago’s tough political climate made him the most important African American elected official of the 1980’s. The racial divisions of Washington’s campaigns and terms in office highlighted similar events in national politics as large numbers of white Democrats defected to the Republican Party in the 1980’s.
Further Reading
Rivlin, Gary. Fire on the Prairie: Chicago’s Harold Washington and the Politics of Race. New York: Henry Holt, 1992.
Young, Henry J. The Black Church and the Harold Washington Story: The Man, the Message, the Movement. Bristol, Ind.: Wyndham Hall, 1988.
Ray Pence

See also African Americans; Jackson, Jesse; Racial discrimination; Reagan, Ronald; Reagan Democrats.
■ Water pollution
Definition Decline in water quality resulting from biological, chemical, or thermal agents
Although the United States made progress in dealing with water pollution during the 1980’s, the issue was not a high priority for the Reagan administration. In addition to concern with surface water, some Americans began to register increasing concern for the quality and quantity of water available from underground aquifers during the decade. Congress had set standards for water quality in 1972 with amendments to the Federal Water Pollution Control Act (FWPCA), a 1948 law. This legislation set water quality standards for a variety of chemical, biological, and thermal pollutants and prescribed standards for the treatment and pretreatment of industrial and municipal wastes. The Federal Water Pollution Control Act Amendments (FWPCAA) also indicated that by 1985 the country should achieve a zero discharge standard of wastes into water sources. By 1977, however, it had already become evident that this standard was unachievable, and so it was modified to require the best level of control practically achievable. The construction during the 1970’s of new municipal wastewater treatment plants, often with federal assistance, led to improvements in water quality by the early 1980’s. Discharge from municipal sewage treatment plants or industrial sites is referred to as point pollution, as it is emitted at a specific point and the discharge can be monitored. Another form of point pollution occurs when water is drawn from a body of water, used for cooling, and then returned to the body of water. Thermal pollution, such as that occurring when water is used for cooling in nuclear reactors, can produce “hot” spots in rivers or lakes when the water is returned. The application of fed-
eral standards reduced the temperature of returned water, although they generally did not return it to its original level. During the 1980’s, several states developed water quality standards for point pollution to supplement the federal rules, and there continued to be improvements in stream quality, although at a slower rate than during the previous decade. Nonpoint Pollution
In spite of continuing gains in dealing with point pollution during the decade, a major source of water pollution remained more difficult to handle. Runoff from farms, parking lots, construction sites, or industrial sites is not concentrated in any one location and is more difficult to monitor than is point pollution. Nonetheless, such nonpoint pollution was a major contributor to the pollution of streams and lakes before, during, and after the 1980’s. Nitrogen and other fertilizers, used to enhance agricultural production, are a major source of the nutrient material that becomes available to organisms in water when some of the fertilizer runs off from agricultural land. The growth of the oxygen-requiring microorganisms that feed on these nutrients increases the biochemical oxygen demand (BOD) in the water, reducing the oxygen available to fish and other useful aquatic life and leading to fish kills. Agriculture became an increasingly important source of this form of pollution during the decade. For example, approximately 16 percent of all nitrogen fertilizer applied in the Mississippi River basin was washed into the Mississippi. Some advocates urged the Environmental Protection Agency (EPA) to act to regulate nonpoint pollution, but the Reagan administration was disinclined to regulate the environment further. Nonpoint pollution could be chemical as well as biological in nature. Runoff from roads or parking lots added minute amounts of chemicals or heavy metals of various sorts to nearby water sources. In this case, the remedy often caused additional problems. Many cities piped storm water into the sewer system. Large rainstorms at times caused sewer systems to bypass water treatment plants because the excess flow exceeded their capacity.

A polluted lakefront beach in Hammond, Indiana, poses a health hazard to swimmers. (Library of Congress)

Although road runoff was contained
in this case, the bypassing of waste treatment facilities led to raw sewage being dumped into streams. Some cities acknowledged the problem but did not have the money to construct larger treatment facilities to deal with the issue. One form of nonpoint pollution did begin to be addressed during the 1980’s. Several states and local governments required that construction sites control runoff from the sites. This simple measure reduced the flow of dirt, organics, and chemicals into water courses. Underground Water
Initial efforts at dealing with water pollution had focused on surface water, but attention increasingly turned to underground water as well. Underground aquifers provide sources of water for drinking, irrigation, and industrial use in many parts of the country. Tests of the wells of individuals and some cities began to reveal that pollution was also a problem for this water source. Chemicals from underground gasoline storage tanks, hazardous waste sites, or even municipal landfills were found to migrate gradually through the soil into nearby wells and aquifers. Awareness of this problem dated to the 1970’s, but it became an increasing concern in some regions during the 1980’s, coupled in some cases with fears about overuse of aquifers. The misuse and overuse of aquifers reduced water supplies that were not replaceable, leading to worries about the sustainability of water supplies in areas such as the Great Plains.
Impact The United States continued to make some improvements in achieving cleaner water during the 1980’s. A lax regulatory climate as well as increasingly costly methods for securing improvement slowed the progress in managing water pollution. People became more aware of the impact of nonpoint water pollution during the decade, although little was done to deal with the issue. In addition, little monitoring of water quality was done, so it was difficult to measure progress accurately. Water pollution remained an often ignored issue at the end of the decade. Further Reading
Freedman, Barry D. Regulation in the Reagan-Bush Era: The Eruption of Presidential Influence. Pittsburgh: University of Pittsburgh Press, 1995. An analysis of the Reagan-Bush environmental record.
Peirce, J. Jeffrey, Ruth F. Weiner, and P. Aarne Vesilind. Environmental Pollution and Control. 4th ed. Boston: Butterworth-Heinemann, 1998. Several chapters deal with the nature and treatment of water pollution.
Rogers, Peter. America’s Water: Federal Role and Responsibilities. Cambridge, Mass.: MIT Press, 1999. A good analysis of the role of the federal government in ensuring water quality.
Rosenbaum, Walter A. Environmental Politics and Policy. 7th ed. Washington, D.C.: CQ Press, 2008. Provides a broad political context for an analysis of water pollution issues.
John M. Theilmann

See also Air pollution; Environmental movement; Reagan, Ronald; Watt, James G.
■ Watson, Tom
Identification American professional golfer
Born September 4, 1949; Kansas City, Missouri
Watson was the best golf professional in the world from 1980 to 1983, winning five major championships. Tom Watson is recognized by golf historians as one of the great champions of the modern era, having won eight of golf’s major championships from 1975 to 1982. From 1980 through 1983, he won five major championships, including the British Open (1980, 1982, and 1983), the Masters (1981), and the United States Open (1982). Watson’s victory in the 1982 United States Open at Pebble Beach is symbolic of his golf legacy. He battled champion Jack Nicklaus throughout the final round and made an unexpected birdie, holing a chip shot from heavy rough just off the seventeenth green. He also defeated Nicklaus in one-on-one competition at the 1977 British Open and the 1977 Masters. Therefore, Watson became recognized both for his eight major championships and as the man who beat Jack Nicklaus more dramatically than anyone else. The year 1980 was notable for Watson, because he became the professional tour’s first player to win over one-half million dollars in one season, winning seven total tournaments. At the 1980 British Open, he finished four shots ahead of his nearest competi-
tor, Hall of Fame golfer Lee Trevino. His victory at the Masters in 1981 was by two shots over two great champions, Jack Nicklaus and Johnny Miller. In 1983, Watson almost won consecutive U.S. Open titles, finishing second by one shot to Larry Nelson at Oakmont. Watson again finished second at the U.S. Open in 1987, behind Scott Simpson. Also in 1987, Watson won the first-ever season-ending tour championship tournament, his first tournament win in three years.

Tom Watson. (Ralph W. Miller Golf Library)

Impact Tom Watson was voted six times as the United States Professional Golfers’ Association (PGA) Player of the Year, including in 1980, 1982, and 1984. He was a member of the United States’ Ryder Cup team in 1981, 1983, and 1989. In 1987, he received the Bob Jones Award, the United States Golf Association’s highest honor for distinguished sportsmanship in golf. In 1988, Watson was inducted into the World Golf Hall of Fame.

Further Reading
Campbell, Malcolm. The Encyclopedia of Golf. 3d ed. New York: DK, 2001.
Watson, Tom. “The Thinker Tom Watson.” Interview by Lisa Taddeo. Golf Magazine, October, 2006, 80-87.
_______. Tom Watson’s Strategic Golf. New York: Pocket Books, 1992.
Alan Prescott Peterson

See also Golf; Sports.

■ Watt, James G.
Identification Secretary of the interior from January, 1981, to November, 1983
Born January 31, 1938; Lusk, Wyoming

Watt’s business background and the perception that he was pro-development made him a lightning rod for criticism during his brief tenure as head of the Department of the Interior.

James G. Watt served as secretary of the interior under President Ronald Reagan from January, 1981, until November, 1983. Born in Lusk, Wyoming, he graduated from the University of Wyoming in 1960 and from that university’s law school in 1962. In 1962, he became the deputy assistant secretary of water and power in the Department of the Interior, and in 1975, he was made vice chairman of the Federal Power Commission. Before being appointed to Reagan’s cabinet, Watt was the founding president of the Mountain States Legal Foundation, a conservative organization that sought to protect and advance the interests of businesses involved in oil, timber, mining, and other natural resource development fields. With his pro-business and pro-development background, Watt was immediately controversial as the choice to head the Department of the Interior, although he was quickly confirmed by the U.S. Senate. As secretary, Watt led the way in implementing President Reagan’s environmental policies. Reagan sought to apply cost-benefit analysis to environmental regulations, to determine whether the cost and the impact on jobs and the economy outweighed the value of the regulations. Reagan also promoted “environmental federalism,” which involved transferring the responsibility for many decisions on environmental matters back to the states. Many of Watt’s appointees within the Department of the Interior were recruited from the business community, often
from the very industries that the department was charged with regulating. Watt cut funding and personnel for some regulatory programs and sought to open some coastal lands and wilderness areas to exploration for resource development. Impact Within four months of his appointment, activist environmental groups such as the Sierra Club were calling for Watt’s removal. By the summer of 1981, even the more moderate National Wildlife Federation was calling for Watt to step down. By October, 1981, the Sierra Club’s “Dump Watt” petition drive had delivered to Congress over one million signatures calling for Watt’s firing. Despite the heavy criticism of Watt’s policies, the immediate cause of his resignation was the furor over a remark he made in a speech in September, 1983, in which he described the personnel of a Senate oversight committee with which he worked. He referred to the gender, ethnic backgrounds, and physical disability of committee members in a way that was perceived as bigoted. The Senate began considering a resolution calling for Watt’s removal, but he resigned before being forced out. Watt announced his resignation on October 9, 1983, and left the Interior Department on November 8, 1983. Further Reading
Kraft, Michael E., and Norman J. Vig. “Environmental Policy in the Reagan Presidency.” Political Science Quarterly 99, no. 3 (Fall, 1984): 415-439.
Watt, James G., with Doug Wead. The Courage of a Conservative. New York: Simon & Schuster, 1985.
Mark S. Joy

See also Conservatism in U.S. politics; Environmental movement; Reagan, Ronald; Scandals.
■ Wave, the
Definition Mass gesture involving spontaneous coordinated movement by an audience, usually at a sporting event, mimicking the appearance of a large wave

The wave became popular at sporting events across North America and eventually across the world during the 1980’s.

“The wave,” also known as the “audience wave” or “Mexican wave,” involves large numbers of participants standing and raising their arms in succession to create a wave of movement through an assembled crowd. The origins of the wave are unclear, having been traced to various possible sources, including a hockey game in Alberta, Canada, in 1980; an American League baseball playoff game in October, 1981; and a football game at the University of Washington later that month. Having no apparent connection with a single sport or team, the wave was less a cheer than a mass communal gesture akin to the popular act of bouncing a beach ball through the crowd at a concert or sporting event. Although generally considered innocuous, the wave has been criticized for its meaningless nature and for causing food, beverages, and other objects to be thrown or spilled into the participating crowd. Although sometimes performed by specific groups of spectators, the wave was more often a nonpartisan action in which spectators were compelled by peer pressure to participate. Waves often traveled around a stadium or arena or back and forth across a section of grandstands numerous times before dying out as spontaneously as they had begun when the crowd became weary of them. Variations on the standard wave, including the simultaneous creation of two oppositely rotating waves and successive waves performed at various predetermined speeds, were sometimes performed in settings conducive to preplanning and crowd discipline, such as student sections at collegiate sporting events. The wave grew rapidly in popularity during the early 1980’s, partly as a result of mass media coverage of sporting events, and it was a standard feature of American and Canadian sporting events by the mid-1980’s. The 1984 Olympic Games in Los Angeles, California, exposed an international audience to the wave, which subsequently achieved global prominence during the 1986 World Cup soccer tournament in Monterrey, Mexico. As a result, the cheer became known in many parts of the world as the “Mexican wave.”
Impact Although a fixture of sporting events by the end of the 1980’s, the wave was essentially an act devoid of meaning or context, and as such exerted little discernible cultural influence. It became the subject of research by scholars studying crowd psychology and social phenomena and, despite its eventual international popularity, has been cited as an example of cultural conformity in 1980’s America. The wave waned in popularity after the 1980’s but
continued to appear sporadically at sporting events into the twenty-first century. Further Reading
Free, Marcus. The Uses of Sport: A Critical Study. London: Routledge, 2004.
Wann, Daniel L. Sport Fans: The Psychology and Social Impact of Spectators. London: Routledge, 2001.
Michael H. Burchett

See also Fads; Olympic Games of 1984; Sports.
■ Weaver, Sigourney
Identification American actor
Born October 8, 1949; New York, New York
During the 1980’s, Weaver established herself as an actor capable of playing strong, aggressive women willing to take on impossible odds.

At the beginning of the 1980’s, Sigourney Weaver had just played Ellen Ripley, an independent woman who takes on the alien in Alien (1979), a moderate box-office hit. This role set the tone for her film performances of the decade, especially when she reprised the role in the sequel, Aliens (1986), another successful film for which she received an Academy Award nomination. Weaver also starred as Dian Fossey in Gorillas in the Mist (1988), a film based on the life of a conservationist and activist who was eventually killed by poachers. Weaver was affected by the film’s material and became an ardent environmentalist, as well as the honorary chairperson of the Dian Fossey Gorilla Fund, which is devoted to the preservation of that endangered species. She received an Academy Award nomination for Best Actress for her role as Fossey and in the same year was also nominated as Best Supporting Actress for playing Katherine Parker, the conniving, ruthless career woman who finally gets her comeuppance in Working Girl (1988). She thus became one of very few actresses ever to receive two Oscar nominations in one year. Although she failed to win an Oscar, she did receive Golden Globe awards for both films. In addition to the films of the 1980’s for which she was best known, Weaver starred in other films in which she remained true to type. She was an intrepid reporter in the thriller Eyewitness (1981), and she
played opposite Mel Gibson in Peter Weir’s The Year of Living Dangerously (1982), an underappreciated Australian film in which she portrayed a British attaché in revolutionary Indonesia.

Sigourney Weaver as Ripley in Alien, the role for which she was best known at the beginning of the 1980’s. (Hulton Archive/Getty Images)

Although she is less known for her comedic talents, Weaver held her own in Ghostbusters (1984), a hugely popular comedy featuring Bill Murray and Dan Aykroyd; she repeated her role in the sequel, Ghostbusters II (1989).

Impact Standing almost six feet tall, Weaver embodied the strong, powerful, imposing woman of the 1980’s, a role model for women to emulate. Whether confronting aliens or poachers, she was a force to be reckoned with, but she also represented the aggressive woman many men were encountering at the office and at home. Thus, she came to stand for some men as a figure to be feared. The fine line in film portrayals between strong woman and threat
to men was approached and at times crossed by several of the most famous 1980’s actresses, including Weaver and Glenn Close, and it represented on screen the tensions being experienced in American society as more households began to require two incomes to remain financially secure.
Maguffe, T. D. Sigourney Weaver. New York: St. Martin’s Press, 1989.
Sellers, Robert. Sigourney Weaver. London: Robert Hale, 1992.
Thomas L. Erskine

See also Action films; Aliens; Business and the economy in the United States; Close, Glenn; Environmental movement; Feminism; Film in the United States; Ghostbusters; Gibson, Mel; Murray, Bill; Women in the workforce.
■ Webster v. Reproductive Health Services
Identification U.S. Supreme Court decision
Date Decided on July 3, 1989
In Webster v. Reproductive Health Services, the Court upheld a Missouri state law regulating abortion, thereby signaling to other states that abortion regulation was constitutionally permissible.
Norma McCorvey, left, better known as the Jane Roe of Roe v. Wade (1973) stands with attorney Gloria Allred outside the U.S. Supreme Court Building in April, 1989, after attending the oral arguments in Webster v. Reproductive Health Services. (AP/Wide World Photos)
Webster v. Reproductive Health Services began in 1986, when Missouri health care professionals involved in providing abortion services challenged a state law regulating abortion. The Missouri law barred the use of public funds or resources for the purposes of abortion counseling or to perform abortions except to save a mother’s life. It also required health care professionals to perform tests, such as assessments of fetal weight and lung maturity, to determine the viability of a fetus after twenty weeks gestational age. The law’s preamble declared that life begins at conception, so a fetus should enjoy constitutional rights and protections. The district court found the law’s restrictions on abortion unconstitutional and in violation of the precedents established in the Supreme Court’s decision in Roe v. Wade (1973) protecting women’s abortion rights. Missouri attorney general William Webster appealed the case to the Supreme Court.
Supreme Court Action
The Court upheld the Missouri law’s abortion provisions. The decision was complex, in that a portion of it was unanimous, while other portions were contested. Chief Justice William H. Rehnquist wrote the majority opinion, which stated that the Missouri law did not contradict Roe v. Wade, because it allowed pregnant women to terminate their pregnancies so long as neither public funds nor public facilities were used during such abortion procedures. The Court did not explicitly rule on the law’s preamble proclaiming life to begin at conception. Instead, the Court interpreted the preamble statement as a “value judgment” favoring childbirth over abortion. According to the Court,
Roe did not prohibit states from issuing such value judgments. Justice Antonin Scalia wrote a separate concurrence in favor of overturning Roe. Scalia argued that abortion was a political issue that should be under the domain of state legislatures. Justice Sandra Day O’Connor, also part of the majority, wrote a separate concurrence of her own. She agreed with the majority that the trimester system of Roe was problematic but indicated that there was no need to modify it in Webster. Instead, she argued that the performance of tests to determine the viability of a fetus after twenty weeks gestational age did not impose an “undue burden” on a pregnant woman’s abortion decision. Justice Harry A. Blackmun, who had written the Court’s decision in Roe, concurred in part and dissented in part. He was joined in his partial dissent by William Brennan and Thurgood Marshall, while John Paul Stevens wrote a separate opinion, also concurring in part and dissenting in part. Blackmun argued that the majority’s decision challenged Roe and other legal precedents that established the notion of an individual’s right to privacy. He indicated that the Missouri law and other state laws restricting abortion services would lead to an increase in unsafe, illegal abortions.
Impact Following the Supreme Court’s Webster decision, advocates on both sides of the abortion debate disputed state regulatory measures concerning when and under what conditions a woman could seek an abortion in the light of Roe. The Webster case signaled to states that abortion regulation was constitutionally permissible, laying the foundation for later decisions permitting further regulation.

Further Reading
Craig, Barbara Hinkson, and David M. O’Brien. Abortion and American Politics. Chatham, N.J.: Chatham House, 1993.
Kerber, Linda K., and Jane Sherron De Hart, eds. Women’s America: Refocusing the Past. New York: Oxford University Press, 2003.
O’Connor, Karen. No Neutral Ground? Abortion Politics in an Age of Absolutes. Boulder, Colo.: Westview Press, 1996.
Segers, Mary C., and Timothy A. Byrnes, eds. Abortion Politics in American States. New York: M. E. Sharpe, 1995.
Brooke Speer Orr

See also Abortion; Feminism; Supreme Court decisions; Women’s rights.

■ Weinberger, Caspar
Identification U.S. secretary of defense from 1981 to 1987
Born August 18, 1917; San Francisco, California
Died March 8, 2006; Bangor, Maine

As secretary of defense under President Ronald Reagan, Weinberger oversaw expenditures of more than $3 trillion to develop the U.S. military. This development of personnel and technology provided the basis for an aggressive foreign policy that was directed at victory in the Cold War; it also contributed to the collapse of the Soviet Union early in the next decade.

U.S. secretary of defense Caspar Weinberger. (U.S. Department of Defense)

After graduating from Harvard Law School (1941) and serving in the U.S. Army in the Pacific theater
during World War II, Caspar Weinberger worked in a San Francisco law firm before entering California Republican politics in 1952, when he won a seat in the California Assembly (1952-1958). While he did not gain any higher elected office, Weinberger became powerful in the California Republican Party. With the election of President Richard M. Nixon in 1968, Weinberger was appointed chair of the Federal Trade Commission (1969), then director of the Office of Management and Budget (1970); in both of these positions, he developed a reputation for being careful with the public’s funds. In 1973, Weinberger reached cabinet rank when he was appointed Secretary of Health, Education, and Welfare (1973-1975). Weinberger worked in the private sector between 1975 and 1980. During that time, he supported Ronald Reagan’s candidacy for president. With Reagan’s election, Weinberger became his secretary of defense. In that capacity, Weinberger was charged with restoring the U.S. armed forces both quantitatively and qualitatively. The impact of the Vietnam War and the administration of President Jimmy Carter had both contributed to a decline in the capacity, reputation, and morale of the U.S. military. Weinberger was provided a blank check by Reagan and moved rapidly to expand the size of the military, develop and deploy new weapons systems, support technical innovations, and rebuild the Air Force and the Navy. His tenure became particularly associated with the development of new military technologies, a more professional and highly paid cadre of soldiers, a six-hundred-ship navy, and the Strategic Defense Initiative (SDI). Weinberger also became involved in the case of Israeli spy Jonathan Pollard, when he argued that Pollard should be punished harshly for compromising American security. More seriously, Weinberger was associated with the Iran-Contra affair, which led to his resignation and indictment on charges that he deceived investigators looking into the sale of missiles to Iran and the use of those funds to support the pro-U.S. forces in Nicaragua. Weinberger was never tried on these charges; President George H. W. Bush pardoned Weinberger and others on December 24, 1992.

Impact Weinberger’s major achievement was overseeing the buildup of the U.S. military during the first six years of the Reagan administration. He also
became embroiled in a constitutional crisis, the Iran-Contra affair, which involved the illegal use of government funds to support the anticommunist forces in Nicaragua. Weinberger remained a power in American conservative circles until his death in 2006.

Further Reading
Baker, James A. The Politics of Diplomacy. New York: Putnam, 1995.
Weinberger, Caspar W., with Gretchen Roberts. In the Arena: A Memoir of the Twentieth Century. Washington, D.C.: Regnery, 2003.
_______. The Next War. Washington, D.C.: Regnery, 1996.
William T. Walker

See also Cold War; Conservatism in U.S. politics; Elections in the United States, 1980; Europe and North America; Foreign policy of the United States; Grenada invasion; Iran-Contra affair; Israel and the United States; Reagan, Ronald; Reagan Doctrine; Reagan’s “Evil Empire” speech; Stealth fighter; Strategic Defense Initiative (SDI).
■ Welfare
Definition Public provision of cash, goods, or services to those in need
During the 1980’s, the link between welfare and work was strengthened, culminating in passage of the Family Support Act of 1988. Welfare-to-work demonstration programs were encouraged, and the nation’s child support enforcement system was also strengthened. The election of Ronald Reagan in 1980 provided a political wedge for antiwelfare punditry about the values and behavior of poor persons, particularly unmarried mothers and noncustodial fathers who failed to pay child support. Martin Anderson’s Welfare (1978), Irwin Garfinkel and Sara McLanahan’s Single Mothers and Their Children (1986), George Gilder’s Wealth and Poverty (1981), Charles Murray’s Losing Ground (1984), Lawrence Mead’s Beyond Entitlement (1986), and David Ellwood’s Poor Support (1988) provided much of the theoretical and empirical underpinnings of the welfare debates throughout the 1980’s. Antiwelfare scholars such as Murray and Mead, for example, respectively argued that welfare was a moral hazard, encouraging sloth and ille-
gitimacy, and that welfare programs, to the extent they were to be retained, should have stronger work requirements. Even sympathetic welfare reformers such as Ellwood sought to turn the Aid to Families with Dependent Children program (AFDC, a federal financial assistance program started in 1935 to provide cash assistance to those whose household income fell below official federal poverty thresholds, depending on family size) into a transitional support program designed to promote short-term financial, educational, and social support, such that AFDC would be more like a stepping-stone into the labor market. Workfare over Welfare
The Omnibus Budget Reconciliation Act of 1981 provided for community work experience programs (CWEP), making it possible for the first time for states to choose to make workfare, or job-training and community-service activities, mandatory for AFDC recipients. It authorized states to fund on-the-job training programs by using (diverting) a recipient’s welfare grant as a wage subsidy for private employers. States were also permitted to develop their own work incentive (WIN) demonstration programs. According to the U.S. General Accounting Office’s 1987 study Work and Welfare, roughly 22 percent (714,448) of all AFDC recipients participated in these programs nationwide. The 1980’s also witnessed increased federal and state efforts to obtain child support payments from noncustodial parents, particularly from the fathers of children receiving AFDC, in the light of the enactment of the Child Support Enforcement (CSE) program in 1975 and the high profile the Reagan administration gave to promoting family values. In 1980, only 5.2 percent of AFDC payments were recovered through child support collections; this percentage had increased to 8.6 by 1986, when Reagan called for more extensive welfare reform in his State of the Union address. On September 2, 1987, Reagan issued Executive Order 12606, requiring government agencies to assess all measures that might have a significant impact on family formation, maintenance, and general well-being in the light of whether an action by government strengthens or erodes the stability of the family and particularly the marital commitment. Between 1986 and 1988, a congressional consensus emerged regarding welfare reform. The One Hundredth Congress focused on three major bills: the Family Security Act of 1987, introduced
by Democratic senator Daniel Patrick Moynihan; the Family Welfare Reform Act of 1987, introduced by Democratic representative Harold Ford; and the Welfare Independence Act of 1987, the Republican alternative for welfare reform, introduced by Republican senator Bob Dole and by Republican representative Robert Michel. Despite differences in some specifics, each bill linked welfare reform to work. Although there was political consensus over workfare, there was some public opposition. Writing in the November, 1987, issue of Ms., political essayist and social critic Barbara Ehrenreich, for example, viewed the consensus for workfare as a throwback to the seventeenth-century workhouse or, worse, slavery. Writing in the September 26, 1988, issue of The Nation, social welfare policy scholar and activist Mimi Abramovitz called such welfare reform efforts a sham, whose work requirements would cheapen the costs of women’s labor force participation and weaken the basic principles on which modern welfare states rested.

The Family Support Act of 1988 As welfare expenses approached $16.7 billion in 1988, Reagan signed the Family Support Act (FSA) on October 13. Titles I and II of the FSA specifically addressed child support enforcement and welfare-to-work programs. Title I amended part D of Title IV of the Social Security Act of 1935 to require withholding of child support payments from noncustodial parents’ wages upon issuance or modification of a child support order for families receiving part D services. It required immediate wage withholding for all new child support orders issued on or after January 1, 1994. Parties in a contested paternity case had to submit to genetic tests upon the request of a party in such cases. States that did not have automated data processing and information retrieval systems in effect had to have such systems operational by October 1, 1995. Title II required states to establish a job opportunities and basic skills training program (JOBS). It also authorized states to institute a work supplementation program under which states could reserve sums that would otherwise be payable to JOBS participants as AFDC benefits and use them instead to subsidize jobs for those participants. Title II also authorized any state to establish CWEPs. Other provisions of the FSA directed states to guarantee child-care services to AFDC families to
the extent that such services were necessary for a family member’s employment or participation in an education and training activity of which the state approved. The law also required a state to continue a family’s Medicaid eligibility for six months after the family loses AFDC eligibility because of specified circumstances; retained the entitlement nature of AFDC; and authorized appropriations for fiscal years 1990 through 1992 for grants to states. These grants were to fund demonstration projects testing the efficacy of early childhood development programs on families receiving AFDC benefits and participating in JOBS and of JOBS on reducing school dropouts, encouraging skill development, and avoiding sole reliance on AFDC payments.

Impact One impact of welfare activities over the decade was to reduce the rate of increase in expenditures and the numbers of beneficiaries. According to the House Committee on Ways and Means’ 1994 Green Book, total costs for AFDC in 1980 had nearly tripled in a decade, reaching $11.5 billion from $4.1 billion in 1970 after adjusting for inflation. The number of families receiving AFDC benefits had increased from 1.9 million to 3.6 million; the number of recipients had grown from 7.4 million to 10.6 million. By 1989, total AFDC expenditures had reached $17.2 billion, a 48 percent increase, while the average number of families increased by 3.5 percent to 3.8 million and the total number of recipients lingered around 11 million annually throughout the decade. The Manpower Demonstration Research Corporation (MDRC) was hired to evaluate these programs using experimental and control groups at various sites throughout the United States. Summarizing the results, Judith Gueron concluded that the programs did lead to consistent and measurable increases in employment and earnings and also led to some welfare savings. Women with no work experience showed the most significant gains, while long-term welfare recipients with no recent employment did not show consistent gains. However, Gueron cautioned that work programs did not offer an immediate cure for poverty or dependence on government for cash assistance. The impact of the programs that MDRC evaluated was modest, with many participants remaining dependent and many of those who moved off welfare remaining poor. In addition, the rates of individual and family poverty during
the Reagan administration were several percentage points higher than those on average throughout the 1970’s. Poor women also benefited from the increased efforts to obtain child support throughout the 1980’s. Of the 2.6 million women below the poverty line in 1981 with children twenty-one years of age or younger whose fathers were absent, 39.7 percent were awarded child support, but only 19.3 percent actually received any payments. In 1987, of 3.2 million such mothers, 27.7 percent received payments. Although the immediate impact of Title I of the FSA on poor persons was deemed negligible, with 25.4 percent of poor eligible mothers receiving payment in 1989, the child support provisions applied to everyone, regardless of income level. These provisions enhanced the role of the federal government in family matters traditionally left to the states. In particular, by requiring states to establish automated information systems, the federal government increased the capacity of government in general to identify and track noncustodial parents who change jobs, cross state lines, and the like, for purposes of garnishing wages if necessary to secure child support payments due custodial parents.

Further Reading
Caputo, Richard K. “The Limits of Welfare Reform.” Social Casework: The Journal of Contemporary Social Work 70 (February, 1989): 85-95. Argues for shifting public debate and intervention programs for poor persons from welfare reform to poverty reduction.
_______. “Presidents, Profits, Productivity, and Poverty: A Great Divide Between the Pre- and Post-Reagan U.S. Economy.” Journal of Sociology and Social Welfare 31 (September, 2004): 5-30. Shows the persistence of high rates of poverty despite an improved economy and welfare reform efforts in the decade.
Ellwood, David T. Poor Support: Poverty and the American Family. New York: Basic Books, 1988. Documents the nature and effects of income support programs for poor persons on families in the United States.
Garfinkel, Irwin, and Sara S. McLanahan. Single Mothers and Their Children: A New American Dilemma. Washington, D.C.: Urban Institute Press, 1986. Describes the nature and extent of single motherhood in the United States.
Gueron, Judith M. “Work and Welfare: Lessons on Employment Programs.” Journal of Economic Perspectives 4 (Winter, 1990): 79-98. Analyzes results of work-related welfare programs at eight sites and is cautious about the effectiveness of such programs on poverty reduction.
Handler, Joel F., and Yeheskel Hasenfeld. The Moral Construction of Poverty: Welfare Reform in America. Newbury Park, Calif.: Sage, 1991. Traces the history of welfare policy through the Family Support Act. Highlights the importance of symbols in defining and redefining different moral categories of poor persons and in framing responses to them.
_______. We the Poor People: Work, Poverty, and Welfare. New Haven, Conn.: Yale University Press, 1997. Argues that welfare reform efforts signify symbolic politics rather than address the spread of poverty among working persons.
Mink, Gwendolyn, and Rickie Solinger, eds. Welfare: A Documentary History of U.S. Policy and Politics. New York: New York University Press, 2003. A collection of excerpts from key documents marking the development of U.S. welfare policy in the twentieth century. Uses original sources that provide a historical record of how our understanding of poverty and interventions to deal with it has or has not changed over time.
Richard K. Caputo

See also African Americans; Conservatism in U.S. politics; Economic Recovery Tax Act of 1981; Elections in the United States, 1980; Elections in the United States, 1984; Elections in the United States, 1988; Homelessness; Income and wages in the United States; Liberalism in U.S. politics; Marriage and divorce; Moral Majority; Reagan, Ronald; Reagan Revolution; Reaganomics; Social Security reform; Unemployment in the United States; Women in the workforce.

■ West Berlin discotheque bombing
The Event A terrorist bomb explodes in a nightclub frequented by American servicemen
Date April 5, 1986
Place La Belle nightclub, West Berlin

Media-orchestrated outrage at this event generated support for the Reagan administration’s increasingly aggressive stance toward alleged foci of terrorism in the Middle East.

At 1:40 a.m. on April 5, 1986, a terrorist bomb exploded in the crowded La Belle nightclub in West Berlin, an establishment frequented by U.S. servicemen. One American soldier and a Turkish woman were killed outright, and more than two hundred people were injured, some seriously. A second American later died of injuries.

Firefighters search the debris after the bombing of the La Belle discotheque in West Berlin on April 5, 1986. (AP/Wide World Photos)
No organization claimed responsibility for the bombing. German investigators pursued several leads, including neo-Nazi German nationalists, the Palestine Liberation Organization (PLO), and Libyan agents seeking revenge for an American naval attack in the Gulf of Sidra in late March. Seizing upon the Libyan connection and citing two coded messages emanating from the Libyan embassy in Berlin, the United States used the La Belle nightclub bombing as justification for an aerial attack on Tripoli and Benghazi on April 15. That attack occurred as the Cold War threat of the Soviet Union was on the wane, and the Ronald Reagan administration sought to replace it with the threat of international terrorism. Libyan leader Muammar al-Qaddafi, who actively supported the PLO and the Irish Republican Army (IRA), was an obvious target. A majority of Americans approved of the action, but international reaction was mainly negative. In 1990, after the fall of the Berlin Wall, examination of East German Stasi (secret police) files led German and American investigators to Libyan Musbar Eter, who implicated Palestinian Yasser Chraidi, a driver at the Libyan embassy in East Berlin in 1986; Ali Chanaa, a Lebanese-born German citizen who worked for the Stasi; and Verena Chanaa, his German wife. Arraigned in 1996, the four were convicted in November, 2001, of murder and attempted murder after a lengthy trial described by commentators as murky. A 1998 documentary for German ZDF television claimed that Eter worked for both the Central Intelligence Agency (CIA) and Israeli intelligence, that Chraidi was not the mastermind and possibly was innocent, and that evidence pointed to several people who were never prosecuted. Following the verdict, Qaddafi agreed to pay compensation to German victims of the bombing as part of a German-Libyan commercial treaty, but he denied direct Libyan responsibility for the attack. American victims have yet to be compensated. Impact The bombing provided justification for a military attack on Libya. After having faded from the public consciousness, the incident assumed fresh immediacy following the 1998 bombings of American embassies in Kenya and Tanzania, which prompted American missile strikes against alleged terrorist facilities in Afghanistan and the Sudan. The La Belle bombing and its aftermath established a
The Eighties in America
pattern of immediate and massive American retaliation against targets whose connection to the terrorist attack was never subsequently proven. Further Reading
Chomsky, Noam. Pirates and Emperors: International Terrorism in the Real World. New York: Black Rose Books, 1987.
Davis, Brian. Qaddafi, Terrorism, and the Origins of the U.S. Attack on Libya. New York: Praeger, 1996.
Kaldor, Mary, and Paul Anderson. Mad Dogs: The U.S. Raids on Libya. London: Pluto Press, 1986.
St. John, Ronald Bruce. Libya and the United States: Two Centuries of Strife. Philadelphia: University of Pennsylvania Press, 2002.
Martha A. Sherwood

See also Foreign policy of the United States; Libya bombing; Middle East and North America; Pan Am Flight 103 bombing; Reagan, Ronald; Terrorism; USS Vincennes incident.
■ When Harry Met Sally . . .
Identification Romantic comedy film
Director Rob Reiner (1945)
Date Released July 12, 1989
When Harry Met Sally . . . was one of the most critically and popularly successful romantic comedies of the 1980’s. Clever scripting, memorable performances, and a sound track made up of vocal standards drove the film’s popularity, and many of its lines and ideas entered the cultural vocabulary of 1980’s America. Nora Ephron, the accomplished screenwriter and director, wrote the Oscar-nominated script for When Harry Met Sally . . ., directed by Rob Reiner. It tells a story about college classmates Harry Burns and Sally Albright, who after graduation periodically run into each other during their early adulthood. The neurotic, depressed, Jewish Burns, played by comic actor Billy Crystal, and the optimistic, sweet, Protestant Albright, played by Meg Ryan, first meet through a friend on a shared drive from college to New York, where both are moving. They dislike each other and decide not to keep in touch once they arrive at their destination. They meet again five years later on an airplane and then again five years after that, this time in a bookstore. After this meeting they become
friends. Harry is going through a divorce and Sally through a breakup, and they find comfort in what is, for both of them, their first honest friendship with a member of the opposite sex. Ultimately, however, they sleep together, after which they fight bitterly. It is not until some months pass that Harry realizes that he is in love with Sally, and they become a couple. When Harry Met Sally . . . raises the question of whether men and women can ever really be platonic friends—a question that particularly preoccupied young adults during the 1980’s. Harry argues “no” from the beginning, then agrees with Sally that it is possible, only to discover that sex and love replace the friendship, proving his original theory correct: Men and women cannot be friends. The supporting actors, Bruno Kirby and Carrie Fisher, also play important roles both in creating the comedy of the film and in providing an example of a successful marriage.
Impact Many of the scenes in the picture have become iconic, including one in which Sally pretends to be having an orgasm loudly in a diner, only to have a nearby patron say to her waiter, “I’ll have what she’s having.” The film featured vocal standards, such as “It Had to Be You” and “Our Love Is Here to Stay,” performed by various artists, but the sound track album was recorded entirely by Harry Connick, Jr., and rose high on the Billboard charts.

Further Reading
Krutnik, Frank. “Love Lies: Romantic Fabrication in Contemporary Romantic Comedy.” In Fatal Attractions: Rescripting Romance in Contemporary Literature and Film, edited by Lynne Pearse and Gina Wisker. London: Pluto Press, 1998.
Pio, Ramón. “Gender and Genre Conventions in When Harry Met Sally . . .” In Gender, I-Deology: Essays on Theory, Fiction, and Film, edited by Chantal Cornut-Gentille D’Arcy and José Angel García Landa. Atlanta: Rodopi, 1996.
Lily Neilan Corwin

See also Academy Awards; Comedians; Film in the United States; Jewish Americans; Music.

■ White, Ryan
Identification AIDS patient and activist
Born December 6, 1971; Kokomo, Indiana
Died April 8, 1990; Indianapolis, Indiana

White, a teenage hemophiliac infected with HIV through a tainted blood transfusion, drew international attention to the treatment of AIDS patients at the height of widespread alarm over the new disease.

Ryan White prepares for a televised interview in Rome, Italy, in February, 1986. (AP/Wide World Photos)

On December 17, 1984, Ryan White, then thirteen, was notified by doctors that he had contracted human immunodeficiency virus (HIV) through transfusion of a contaminated blood-clotting agent, Factor VIII, administered during a partial lung removal procedure as part of a treatment for his pneumonia. He was told that he had six months to live. At the time, acquired immunodeficiency syndrome (AIDS) was widely associated with careless habits of so-called
alternative lifestyles, including intravenous drug use, promiscuous sex, and homosexuality. Indeed, misconceptions about the disease's transmission stirred community resistance to White's continued attendance at school in his rural Indiana hometown. When the family resisted the school's initial decision essentially to quarantine him (providing him with separate bathroom facilities and disposable silverware) and its subsequent decision that White be homeschooled, White was expelled. White's case became the center of a national outcry, led by AIDS activists who saw in it manifest evidence of public ignorance. White himself became a leading advocate, appearing before congressional panels, in national magazines, and on network television, tirelessly explaining that casual contact did not transmit the disease and that its patients should be treated with compassion rather than ostracism. When a district court ordered White reinstated, fear of violence against the boy led the family to relocate to nearby Cicero, Indiana, where White attended public school without incident. Although he often asserted that he wanted only to be healthy and go to school, he accepted the importance of his fame in educating people about the disease and the dangers of stigmatizing AIDS patients. Celebrities such as Michael Jackson (who bought the Whites their home in Cicero) and Elton John (who was at the hospital bedside when White died) and politicians such as President Ronald Reagan, who had consistently resisted AIDS funding, all rallied around the boy's quiet determination and easy charisma. White's celebrity, however, was not without controversy, as gay activists pointed out that opprobrium was still accorded those patients whose lifestyle suggested that they somehow "deserved" the virus.
Impact In 1990, White, at age eighteen, died from complications of pneumonia. He had changed perceptions about AIDS by arguing that with commonsense precautions, patients could be treated with respect. The year he died, Congress passed the Ryan White Comprehensive AIDS Resources Emergency (CARE) Act, an unprecedented federal commitment to the care of AIDS patients, after stalling for years in the face of public unease fueled by ultraconservative activists. White's heroic poise in the face of community prejudice and then ultimately in the face of death at a young age raised awareness about the disease at a critical moment in the epidemic, giving health agencies an unparalleled example of grace under pressure.
Further Reading
Berridge, Virginia, and Philip Strong, eds. AIDS and Contemporary History. New York: Cambridge University Press, 2002.
Cochrane, Michel. When AIDS Began. London: Routledge, 2003.
Shilts, Randy. And the Band Played On: Politics, People, and the AIDS Epidemic. New York: St. Martin's Press, 1987.
White, Ryan, with Ann Marie Cunningham and Jeanne White. My Own Story. New York: Signet, 1997.
Joseph Dewey
See also ACT UP; AIDS epidemic; AIDS Memorial Quilt; Homosexuality and gay rights; Hudson, Rock; Jackson, Michael; Johnson, Magic; Louganis, Greg; Medicine; Reagan, Ronald.
■ White Noise Identification Postmodern novel Author Don DeLillo (1936-    ) Date Published in 1985
The novel brought DeLillo’s works to a wider audience and defined the postmodern experience in America. White Noise (1985) opens in a mildly comic fashion as professor Jack Gladney, chair of the Hitler Studies department, looks out his office window and watches families arrive in their vans and unpack hordes of possessions for arriving students. Gladney is obsessed with death, his own and his wife’s, and his life is constructed around attempts to evade the inevitable. His obsession reaches its apex when he and his family attempt to escape after a chemical spill labeled as an “Airborne Toxic Event.” Gladney learns that he has been contaminated and that the dosage is likely fatal, but the doctors cannot predict when his death will occur. The incident exacerbates Gladney’s rampant insecurities, which he masks with repeated spending sprees, believing that possessions will confer security and fulfillment. Thus he and his family are the ultimate consumers—of food, clothes, and TV news and shows. The irony, of course, is that goods and a large
physical stature (Gladney admires heavy people, believing that bulk staves off death) cannot insulate him from the inevitable. Television is yet another of Gladney's evasions; the set is constantly on, and the house is awash in commercial jingles, lines from comedies and various talking heads, and volumes of misinformation. Much of the novel's abundant comedy emerges from family debates in which one erroneous "fact" is traded for another with smug assurance by each of the conversants. The family especially enjoys watching news coverage of catastrophes, gaining a false sense of power because of their seeming immunity from such perils. However, once Gladney is exposed to toxins, the sense of dread has a definable identity. The novel's title emphasizes that the characters are surrounded by unseen or unrecognized forces, the most obvious of which are the waves of radio transmission and the radiation from television and other sources. Even the toxic event is "airborne," a cloud that is perceptible but the contamination and effects of which are hidden. Just as the characters are surrounded by noise and one another, they are surrounded by the inevitability of death, which Gladney grudgingly comes to terms with at the novel's close when he witnesses his infant son's miraculous escape from an auto accident. Gladney lacks the comfort of religion but struggles to find some replacement for faith in order to face his mortality.
Don DeLillo. (Thomas Victor)
Impact White Noise, Don DeLillo's eighth novel, became an instant popular and critical success and won the National Book Award in 1985. Since its publication, the novel has been a mainstay in university literature courses and the subject of considerable scholarly research. Reassessments of DeLillo's oeuvre now rank him as one of America's foremost novelists.
Further Reading
Bloom, Harold, ed. Don DeLillo. Philadelphia: Chelsea House, 2003.
Kavadlo, Jesse. Don DeLillo: Balance at the Edge of Belief. New York: Peter Lang, 2004.
Lentricchia, Frank, ed. Introducing Don DeLillo. Durham, N.C.: Duke University Press, 1991.
David W. Madden
See also Air pollution; Book publishing; Consumerism; Literature in the United States.
■ Who Framed Roger Rabbit Identification American film Director Robert Zemeckis (1952-    ) Date Released June 24, 1988
The first full-length movie to feature live actors and animated characters throughout, this Disney production also featured cartoon characters from a variety of competing studios. Who Framed Roger Rabbit was the most expensive motion picture ever made when it was released. It also was the first full-length film effectively to combine animation and live action for its entire length, the first partnership of Disney and Warner Bros., and the first teaming of familiar cartoon characters from different studios. One example was the first and only teaming of Donald Duck and Daffy Duck, seen performing a wild piano duet. The 103-minute film earned considerable critical praise, and a sizable box office, in its original theatrical release, more than doubling its reported cost of $70 million. “Where else in the Eighties can you do this?” asked director Robert Zemeckis in Rolling
Stone magazine. Hollywood had attempted similar combinations of live and cartoon characters, such as a dance scene featuring Gene Kelly and Jerry the Mouse (of the Tom and Jerry animated series) in Anchors Aweigh (1945), as well as a nine-minute Looney Tunes cartoon, 1940’s black-and-white You Ought to Be in Pictures, starring Porky Pig, Daffy Duck, and real-life Warner Bros. producer Leon Schlesinger. To create believable visuals for Who Framed Roger Rabbit, more than eighty-five thousand hand-painted cels were created after plotting each shot. Legendary animator Chuck Jones himself storyboarded the Daffy-Donald scene (although he later criticized the film for giving live actors more sympathy than cartoon characters). Besides a fine display of new technology, Who Framed Roger Rabbit was also a good movie. Coproduced by Steven Spielberg and Disney, it was written by Peter S. Seaman and Jeffrey Price, who based their screenplay on Gary K. Wolf’s 1981 novel Who Censored Roger Rabbit? Inspired by the film Chinatown (1974) and the actual conspiracy to destroy California’s streetcar systems to sell more cars, tires, and gasoline, Who Framed Roger Rabbit was an allegory for capitalism run amok versus an ideal, pastoral, Jeffersonian innocence, and it targeted adults as well as younger audiences. Set in 1947 in a world inhabited by both humans and cartoon characters (“Toons”), the movie is a strange, funny blend of cartoon high jinks and film noir. Roger is the nephew of Bambi (1942) costar Thumper and is distracted from his acting jobs by jealousy over his wife Jessica. Roger’s boss hires hardboiled (and Toon-hating) detective Eddie Valiant (Bob Hoskins) to look into it, but things get complicated when Roger is suspected of murdering Jessica’s possible patty-cake partner. Helping Hoskins juggle all kinds of detective and cartoon devices—plus crime-drama starkness and cartoon sunniness—are Christopher Lloyd (as Judge Doom), Kathleen Turner (as Jessica Rabbit), Stubby Kaye (as Marvin Acme), Joanna Cassidy (as Dolores), and Charles Fleischer (as Roger). Before casting Hoskins, filmmakers reportedly considered approaching several high-profile actors for the detective role, including Jack Nicholson, Eddie Murphy, and Bill Murray. Distributed by Disney’s Touchstone subsidiary, the film was codirected by Richard Williams, who handled the animated segments. Those moments
featured many other famous cartoon characters from several studios, including Goofy, Porky Pig, Woody Woodpecker, Betty Boop, Droopy, and both Bugs Bunny and Mickey Mouse. Behind the scenes, the film featured notable voice actors, including Mel Blanc (Daffy Duck, Bugs Bunny, and others), Wayne Allwine (Mickey Mouse), Tony Anselmo (Donald Duck), and Mae Questel (Betty Boop).
Impact Who Framed Roger Rabbit earned $150 million in its original theatrical release, won three Academy Awards, and was nominated for four others. It is credited with reviving Hollywood animation, paving the way for DreamWorks, Pixar, Fox, and other companies producing animated features.
Further Reading
Corliss, Richard. "Creatures of a Subhuman Species." Time, June 27, 1988, 52.
Powers, John. "Tooned Out." Rolling Stone, August 11, 1988, 37-38.
Wolf, Gary. Who Censored Roger Rabbit? New York: St. Martin's Press, 1981.
Bill Knight
See also Academy Awards; Computers; Film in the United States; Special effects; Turner, Kathleen.
■ Williams, Robin Identification American actor and comedian Born July 21, 1951; Chicago, Illinois
Williams is best known for the unique, high-intensity, stream-of-consciousness comedy that has earned him a reputation as one of the best improvisational comedians of all time. Robin Williams overcame his childhood shyness by becoming involved in drama during high school and pursued this interest at Claremont Men's College. In 1973, he was accepted into the highly selective advanced program at Juilliard, along with Christopher Reeve, where the two classmates studied under John Houseman and established a lifelong friendship. On Houseman's advice, Williams returned to San Francisco to pursue a career in stand-up comedy. In February, 1978, he was cast in a guest role as the space alien Mork on the Happy Days television series, which led to a starring role in the spin-off series Mork and
Mindy, which ran from 1978 to 1982. The American public was captivated by the manic, free-associating character of Mork, who was featured on lunchboxes and posters and added several catchphrases to the lexicon, including his trademark greeting, "nanoo nanoo." Williams had become an overnight sensation. In 1980, Williams graduated from television to film with his debut in Robert Altman's version of Popeye, a critical and box-office disappointment. His next movie, The World According to Garp (1982), was a critical success, and Moscow on the Hudson (1984) was also well received, but it was Good Morning, Vietnam (1987) that not only secured Williams's reputation as a serious actor but also garnered him a Best Actor Oscar nomination. In 1989, he received a second Best Actor nomination for his work in Dead Poets Society. During the 1980's, Williams continued his stand-up career with Home Box Office (HBO) comedy specials in 1982 and 1986 and was named number thirteen on Comedy Central's list of 100 Greatest Stand-Ups of All Time.
Robin Williams around 1987. (Hulton Archive/Getty Images)
Impact In addition to his reputation as a comedian, Robin Williams established himself as a well-regarded and versatile actor after the mid-1980's, able to move effortlessly from comedy to serious drama with The Fisher King (1991), Good Will Hunting (1997), and One Hour Photo (2002). He also became well known for his charitable work, particularly the Comic Relief specials on HBO with Whoopi Goldberg and Billy Crystal, which raised funds for the homeless. Williams, a tireless supporter of U.S. troops in Iraq and Afghanistan during the early twenty-first century, appeared in several United Service Organizations shows over the course of the Iraq War.
Further Reading
Dougan, Andy. Robin Williams. New York: Thunder's Mouth Press, 1999.
Givens, Ron. Robin Williams. New York: Time, 1998.
Jay, David. The Life and Humor of Robin Williams: A Biography. New York: HarperPerennial, 1999.
Spignesi, Stephen J. The Robin Williams Scrapbook. New York: Citadel Press, 1997.
Mary Virginia Davis
See also Academy Awards; Comedians; Comic Relief; Film in the United States; Television.
■ Williams, Vanessa Identification Miss America, singer, and actor Born March 18, 1963; Tarrytown, New York
Williams was the first African American to be crowned Miss America. Until 1984, the Miss America pageant had not been noted for cultural or racial diversity. Though a Jewish woman, Bess Myerson, had won the Miss America crown, that victory had occurred decades earlier and not without substantial controversy. Vanessa Williams attended the School of Performing Arts in New York City (made famous by the movie and television series Fame). She also began winning beauty contests in the early 1980's, including the Miss New York contest, thereby qualifying to enter the Miss America contest. Williams won the crown in 1984. Thrilled to win and performing admirably in her new role, Williams seemed to be an ideal choice. Halfway through her tenure,
however, a scandal erupted when it was revealed that she had posed for nude photos a few years before entering the contest. As a result of the highly publicized scandal, Williams received death threats and hateful letters from disgruntled devotees of the pageant. She tried to redeem her reputation, insisting that the photos were artistic rather than pornographic, but finally resigned the crown amid growing controversy and opposition to her reign as Miss America. Williams’s nude photos were published in an issue of Penthouse magazine that grossed fourteen million dollars. After Williams resigned the Miss America crown, her runner-up, Suzette Charles, became the new Miss America. Though Charles was also African American, there were still claims that the forced resignation of Williams was due to racism. Williams went on to establish a singing career, performing backup vocals for George Clinton on a 1986 record album. Her first solo album, The Right Stuff, was released in 1988. A ballad from the album eventually
reached number one on the Billboard Hot Black Singles chart. The album was subsequently certified gold, launching a successful singing career for Williams. Williams was also interested in acting. Her first film was Under the Gun (1986). She developed successful singing and acting careers, garnering public praise as well as financial freedom.
Vanessa Williams announces her resignation as Miss America at a press conference on July 23, 1984. (AP/Wide World Photos)
Impact Williams was the first woman to overcome the race barrier that had seemed to prevent African American women from winning the Miss America pageant. Although she was devastated by her forced resignation of the crown, she went on to achieve great success as a singer and actress, becoming a role model for young African American girls in the process.
Further Reading
Boulais, Sue. Vanessa Williams: Real Life Reader Biography. Childs, Md.: Mitchell Lane, 1998.
Freedman, Suzanne. Vanessa Williams. New York: Chelsea House, 1999.
Twyla R. Wells
See also African Americans; Feminism; Film in the United States; Music; Pop music; Pornography; Racial discrimination; Scandals.
■ Wilson, August Identification African American playwright Born April 27, 1945; Pittsburgh, Pennsylvania Died October 2, 2005; Seattle, Washington
Wilson gained prominence as an American playwright in the 1980’s. It was during this decade that Wilson’s vision of writing a cycle of ten plays, each set in a specific decade of the twentieth century, began to take shape.
Born Frederick August Kittel and having grown up in the predominantly African American Hill District of Pittsburgh, August Wilson often returned in his imagination to his native soil, mining its richness for artistic purposes. His plays explore themes ranging from African American identity and the impact of history on his many memorable characters to the challenges confronting family, especially fathers and sons. Of the five plays Wilson wrote and produced during the 1980’s, four received multiple awards. Ma
Rainey's Black Bottom (pr. 1984) was nominated for a Tony Award and received the New York Drama Critics Circle Award for the best new play of 1984-1985. Fences (pr. 1985), which grossed over $11 million in its first year on Broadway, won a handful of awards, including the Tony Award for Best Play and the Pulitzer Prize in drama in 1987. Joe Turner's Come and Gone (pr. 1986) was a Tony nominee and garnered the New York Drama Critics Circle Best Play Award for 1988. Wilson received a second Pulitzer Prize in drama in 1990 for The Piano Lesson (pr. 1987). These plays served as testimony to Wilson's versatility and success as a playwright. Set in the musical world of 1920's Chicago, Ma Rainey's Black Bottom utilizes the blues as a means of exploring this important decade in African American history. Fences brings to life the trials and tribulations of a family in an unnamed northern industrial city during the 1950's. Joe Turner's Come and Gone revisits Pittsburgh during the first decade of the twentieth century and dramatizes the legacy of slavery on characters living in a boardinghouse after coming north during the Great Migration. The Piano Lesson, set in Pittsburgh during the 1930's, again bridges characters' lives in the South and in the North. Similar to Wilson's literal and figurative use of fences in Fences, a piano is used in The Piano Lesson to show characters in conflict over the past and the present in relation to what the future holds.
Impact Wilson's plays place him securely within both mainstream and African American literary traditions. Fences has been favorably compared to Arthur Miller's Death of a Salesman (1949) in its pathos and moving depiction of protagonist Troy Maxson, feared, loved, and ultimately forgiven by a family he berates and betrays. Wilson enriched the way American theater depicted race, while he encouraged younger African American artists to write plays. Not since Lorraine Hansberry in the 1950's and Amiri Baraka in the 1960's had an African American artist had such an impact on American drama.
Further Reading
Elkins, Marilyn, ed. August Wilson: A Casebook. New York: Garland, 1994.
Shannon, Sandra G. The Dramatic Vision of August Wilson. Washington, D.C.: Howard University Press, 1995.
Wolfe, Peter. August Wilson. New York: Twayne, 1999.
Kevin Eyster
See also African Americans; Broadway musicals; Literature in the United States; Mamet, David; Racial discrimination; Shepard, Sam; Theater.
■ Winfrey, Oprah Identification Talk-show host Born January 29, 1954; Kosciusko, Mississippi
Winfrey’s direct approach to interviewing her guests, coupled with her ability to make them feel at ease, made her show the nation’s top-rated talk show in the 1980’s. On January 2, 1984, Oprah Winfrey hosted her first talk show on AM Chicago. She chose discussion topics to which everyone in her audience could relate. In many cases, she had experienced the same pain or circumstances as her guests. Winfrey’s popularity grew as she dealt with issues honestly and enthusiastically. By 1985, the show was changed from a half hour to an hour-long show and was renamed The Oprah Winfrey Show. Winfrey experienced a life-changing event when director Steven Spielberg cast her as Sofia in the film The Color Purple (1985). Many of the struggles of African Americans in the Deep South represented in the film were struggles that Winfrey had faced during her own life. Her role in the film revealed her talent as an actor and earned her an Academy Award nomination and a Golden Globe nomination for Best Supporting Actress. On September 8, 1986, The Oprah Winfrey Show became nationally syndicated. Winfrey formed her own production company, Harpo Productions, and served as its chief executive officer. The program increased in popularity and continued to get top ratings, garnering both Winfrey and the show several awards. She eventually bought the rights to The Oprah Winfrey Show and built Harpo Studios, making her the first African American woman to own a studio and production company. Winfrey was then able to produce the show in her own way. One change she made was to air taped shows as opposed to live broadcasts. Another change was no longer to use cue cards. Because she believed that prepared questions would compromise the show’s and her own authenticity, she conducted her show using only a few note cards and her own instincts. She asked her guests about issues that interested her audience, and her easy interaction with ordinary people softened
the gap between societal norms for public and private expression. While talking to her guests was important, listening to them was crucial, and Winfrey demonstrated through her questions, exchanges, tone of voice, and body language that she was clearly able to do that.
Oprah Winfrey relaxes in her office in December, 1985, after a morning broadcast of her local Chicago, Illinois, show. The show was syndicated nationally the following year. (AP/Wide World Photos)
Impact Viewers became like Winfrey's extended family. Her influence—not only on women and African Americans but on Americans of all other races and cultures as well—was profound. Her strong message that the power to change comes from within was reflected in her phenomenal success.
Further Reading
Adler, Bill, ed. The Uncommon Wisdom of Oprah Winfrey. Secaucus, N.J.: Carol, 1997.
Bly, Nellie. Oprah! Up Close and Down Home. New York: Kensington, 1993.
Garson, Helen S. Oprah Winfrey: A Biography. Westport, Conn.: Greenwood Press, 2004.
King, Norman. Everybody Loves Oprah! New York: William Morrow, 1987.
Krohn, Catherine. Oprah Winfrey. Minneapolis: Lerner, 2002.
Mair, George. Oprah Winfrey. New York: Carol, 1994.
Waldron, Robert. Oprah! New York: St. Martin's Press, 1988.
Elizabeth B. Graham
See also Academy Awards; African Americans; Color Purple, The; Film in the United States; Spielberg, Steven; Talk shows; Television; Women in the workforce.
■ Women in rock music Definition
Female singers and musicians in several genres of popular music
In the 1980’s, the most obvious accomplishment made by women in rock music pertained to sheer quantity, with female singers and instrumentalists making their presence known in greater numbers than ever before. More subtly, the
emergence of women in rock during the 1980's represented an age-old pattern of how women often achieve gender equity in popular culture. In doing so, they also influenced the demographics of rock in regard to age. Two women from the 1960's, one a holdover and one a crossover, provided two of the biggest surprises of the decade. Grace Slick, vocalist for the band Jefferson Airplane, continued to write songs and sing with that group's descendant, Jefferson Starship, through most of the 1980's; in 1985, for example, she sang on the massive Starship hit "We Built This City," a paean to San Francisco pop music. Tina Turner, a veteran of her husband Ike's rhythm-and-blues revues for twenty years, embarked on a solo career by reinventing herself as a full-throated rock "belter," beginning with her 1984 hit, "What's Love Got to Do with It." Three newcomers during the 1980's who reflected the back-to-basics approach to rock music
led by Turner were Laura Branigan, Pat Benatar, and Chrissie Hynde.
Madonna performs at an AIDS benefit concert in New York's Madison Square Garden in 1987. (AP/Wide World Photos)
Expanding the Audience of Rock
Some of the new women in rock expedited their success by appealing to an audience hitherto unexploited by music promoters: girls in their preteens and early teens. The early 1980's saw the simultaneous appearance of two of the most talented performers of the decade, Cyndi Lauper and Madonna. Although they projected highly sexualized images, both women's playful, extravagant fashion styles caught on with young girls across North America, inspiring millions of them to parade through schoolyards and shopping malls costumed as their idols. Almost overnight, rock, which for thirty years had been largely the province of high school and college students, enlisted legions of fans of elementary and middle-school age. This trend was cannily imitated by two other young women toward the end of the decade, Debbie Gibson and Tiffany, who formed fan bases by performing in malls frequented by young girls.
More Women, More Rock The sheer number of bands that included female members or were all-female in the 1980's is staggering, as is the breadth of styles these women reflected. Two California bands solely comprising women, the Bangles and the Go-Go's, specialized in cheery pop music suggestive of the Beach Boys and the early Beatles. The Pixies, from Boston, with bassist Kim Deal, performed serious, sophisticated music of the sort that came to be called college rock. Throwing Muses and 'Til Tuesday, led by Kristin Hersh and Aimee Mann, respectively, were characterized by straightforward, "no-frills" rock, while Siouxsie and the Banshees, led by Susan Ballion, helped pioneer gothic rock, a somber subgenre employing gothic imagery. Edie Brickell and the New Bohemians, from Texas, performed gentle folk/jazz tunes, while the Plasmatics was a violent punk band whose lead singer, Wendy O. Williams, sometimes demolished cars on stage with a blowtorch.
Selected Women in Rock Music in the 1980's (Artist/Group: Notable 1980's Songs)
Paula Abdul: "Forever Your Girl," "Opposites Attract," "Straight Up"
Bananarama: "Cruel Summer," "I Heard a Rumor," "Venus"
The Bangles: "If She Knew What She Wants," "Manic Monday," "Walk Like an Egyptian"
Pat Benatar: "Hit Me with Your Best Shot," "Love Is a Battlefield," "We Belong"
Laura Branigan: "Gloria," "Imagination," "Self Control"
Edie Brickell and the New Bohemians: "Little Miss S," "What I Am"
Belinda Carlisle: "Heaven Is a Place on Earth," "I Get Weak," "Mad About You"
Cher: "If I Could Turn Back Time," "Just Like Jesse James," "We All Sleep Alone"
Gloria Estefan & Miami Sound Machine: "Anything for You," "1-2-3," "Falling in Love (Uh-Oh)"
Aretha Franklin: "Freeway of Love," "I Knew You Were Waiting (for Me)," "Sisters Are Doin' It for Themselves"
Debbie Gibson: "Foolish Beat," "Lost in Your Eyes," "Only in My Dreams"
The Go-Go's: "Our Lips Are Sealed," "Vacation," "We Got the Beat"
Heart: "Alone," "Never," "These Dreams"
Whitney Houston: "How Will I Know," "I Wanna Dance with Somebody (Who Loves Me)," "Saving All My Love for You"
Janet Jackson: "Control," "Miss You Much," "Nasty"
Joan Jett and the Blackhearts: "Crimson and Clover," "I Hate Myself for Loving You," "I Love Rock 'n Roll"
Cyndi Lauper: "Girls Just Want to Have Fun," "Time After Time," "True Colors"
Madonna: "Borderline," "Everybody," "Holiday," "Into the Groove," "Like a Prayer," "Live to Tell," "Papa Don't Preach"
Olivia Newton-John: "Heart Attack," "Physical," "Suddenly"
Stevie Nicks: "Edge of Seventeen (Just Like a White Winged Dove)," "Leather and Lace," "Stop Draggin' My Heart Around"
The Pointer Sisters: "I'm So Excited," "Jump," "Neutron Dance"
Sade: "Never as Good as the First Time," "Smooth Operator," "The Sweetest Taboo"
Siouxsie and the Banshees: "Belladonna," "Cities in Dust," "Dazzle"
Donna Summer: "Cold Love," "She Works Hard for the Money," "The Wanderer"
Tiffany: "Should've Been Me," "I Think We're Alone Now," "Could've Been"
Tina Turner: "Break Every Rule," "Private Dancer," "What's Love Got to Do with It"
What did female artists do with their expanding presence in rock? Benatar's work exemplifies the themes that women explored in the 1980's. Like Turner, she established that women could have a tough-minded approach to romance (as expressed in songs such as "Love Is a Battlefield" and "Sex as a Weapon"), and she also expressed female assertiveness in relationships ("Treat Me Right"). More important, she dealt with subjects usually neglected by male rock singers, such as child abuse ("Hell Is for Children"). However, two songs by Lauper and Madonna best illustrate how female artists of the decade achieved thematic redress by tackling subjects that males ignored or by taking a feminist tack on those issues. In "Papa Don't Preach," Madonna sings of teenage pregnancy without bathos or moralizing: An unwed mother tells her father that she is pregnant, takes responsibility for her situation, and asserts her right to make her own decision on the matter. In the most famous song of the decade on gender issues, Lauper's "Girls Just Want to Have Fun," the singer equates "fun" not with giddiness and good times alone but with freedom and self-expression. When she insists that, rather than be isolated by male possessiveness, she wants "to be the one to walk in the sun," this anthemic song accomplishes what good pop lyrics of any genre do—it states with simple eloquence a truth or feeling often obfuscated by verbosity and cant in more sophisticated texts.
Impact The strides made by female rock musicians in the 1980's illustrate a technique women in Western culture have often used to gain access to previously male-dominated venues. For example, in the late 1700's women in large numbers began to write novels, a new genre lacking a tradition of sexism. Likewise, in the 1800's many women embraced Spiritualism as a means of claiming leading roles denied them in mainstream religion, as the new movement had no patriarchal tradition. In the 1980's, women similarly used two trends in rock that emerged in the late 1970's and early 1980's: punk and synthpop (a New Wave form of soft rock depending primarily on electronic synthesizers for instrumentation). The punk ethos questioned everything remotely traditional, even the young traditions of rock itself, including sexism. Therefore, a woman becoming a punk rock musician was not contradicting that ethos; she was embodying it. Furthermore, both punk and synthpop downplayed guitar pyrotechnics of the sort practiced by Jimi Hendrix and instead foregrounded lyrics, melody, and vocals. This is not to say that women were poor guitarists: Many of the female rock musicians of the 1980's were excellent guitarists or bassists. However, synthpop provided an avenue into rock for women not interested in emulating male idols of the past.
Further Reading
DeCurtis, Anthony, et al., eds. The Rolling Stone Album Guide. New York: Random House, 1992. Of the various editions of this very useful guide, this one from the early 1990's is best for research on artists of the 1980's.
Gaar, Gillian G. She's a Rebel: The History of Women in Rock and Roll. Seattle: Seal Press, 1992. Excellent history of women in pop music, with good coverage of the 1980's.
Lewis, Lisa. Gender Politics and MTV: Voicing the Difference. Philadelphia: Temple University Press, 1990. Thoughtful examination of the role of music videos and their effect on women in rock.
O'Dair, Barbara, ed. Trouble Girls: The Rolling Stone Book of Women in Rock. New York: Random House, 1997. Another good review of the role of women in rock.
Reddington, Helen. The Lost Women of Rock Music: Female Musicians of the Punk Era. Brookfield, Vt.: Ashgate, 2007. Though focusing on British rockers, this insightful account also reveals how women found ingress to the rock world through punk.
Thomas Du Bose
See also Blondie; Cher; Go-Go's, The; Heavy metal; Lauper, Cyndi; Madonna; MTV; Music; Music videos; New Wave music; Pop music; Turner, Tina.
■ Women in the workforce Definition
Women employed or seeking employment during a specific period
Changes in the gender composition of the workforce had a significant impact on the culture of the later twentieth century, creating new work ethics and new demands for goods and services. During the 1980's, women left their traditional roles as full-time housewives and mothers to join the workforce in unprecedented numbers. The 1980's began with the inauguration of newly elected president Ronald Reagan and his lavish display of wealth in the attendant ceremonies. College graduates were entering the workplace to fill prestigious office professions that offered high salaries. These factors led to a demand for trendy and luxurious goods and a more affluent lifestyle, which in a single household often depended on the earning
power of both the man and the woman. Fresh from the feminist movement of the 1970's, which emphasized gender equality, women sought both more education and more jobs in the 1980's. The demographics of gender in the workforce shifted for reasons that went beyond the women's movement, however: As a result of the spread of no-fault divorce laws in many states, divorce rates rose, and divorce became widely acceptable in the United States. Growing numbers of illegitimate children and single-parent families also played a role in women entering the workforce, as well as a trend toward delayed marriage. Many of the women who entered the workforce in the 1980's did so because they had to work to support themselves and their children. The policies of the Reagan administration increased women's need to work. The Reagan administration curtailed the growth of social welfare programs and limited benefits to those whom Reagan called the "truly needy." Between 1981 and 1984, spending on the federal financial assistance program Aid to Families with Dependent Children (AFDC) was cut by 13 percent, food stamps by 13 percent, and federal support for child nutrition in the schools by 28 percent. The changing nature of work itself also increased the number of working women. The United States was in transition from an industrial age to an information age, and the 1980's saw an expansion of technology that eventually defined the modern world of work. Not only did more jobs require technological knowledge rather than physical strength, but also some new technologies, such as commercially available handheld mobile phones and personal computers, made it easier for women to supervise their families from the workplace or even to telecommute from home. Child Care Reagan's Republican presidential campaign emphasized "family values," a strategy that contributed significantly to his election in 1980. Subsequently, the Democrats began to emphasize family issues, calling for concrete programs to help working mothers, such as flexible work hours, maternity and paternity leave, uniform standards for child care, and federal enforcement of child-support payments. In the 1984 elections, the Democrats unsuccessfully tried to take the family issue away from the Republicans, leading to angry controversies that surrounded child care and family concerns in the 1980's. The result was a stalemate on family issues.
While the government was deadlocked, child-care problems continued to grow and cause anxiety for both parents and society as a whole. Child care was the most difficult issue for young families to face. By the end of the 1980's, nearly half of all marriages were ending in divorce, and the plight of single-parent families (usually headed by the mother) was often one of severe economic hardship. Single parents had to make the difficult choice between caring for children at home and finding employment to help support their families. By the end of the 1980's, the number of children five years old or younger whose mothers worked had increased to ten million. The number of affordable child-care programs was never enough to meet the demand. Research has typically found few significant differences between children whose mothers are employed and those whose mothers are not. However, the effects of a mother's employment on her child often differ depending on the characteristics of the child, conditions at the mother's workplace, and the type and quality of child care the mother is able to secure. The social class, age, and gender of children also may influence the effects of child care. Group care of children is improved when the ratio of children to caregivers is low, allowing the caregiver to be responsive to each child; when caregivers are trained; and when the caregiving environment is stimulating. Studies in the 1980's found that infants and preschool children with employed mothers were not significantly different from those with nonemployed mothers in respect to language development, motor development, and intelligence. These studies showed that, as role models, working mothers had a positive effect on adolescent daughters; these daughters tended to be outgoing, active, and high achievers with educational aspirations. Children with mothers in the workforce viewed women and women's employment more positively than children of full-time homemakers, and they had less traditional views of marriage and gender roles. Professional Work Prior to the 1970's, the professions of medicine and law were generally closed to women, and the (male) practitioners of these professions controlled the membership. However, as the feminist movement provided women with a vision of equality, women advocated for new laws to ensure their access to education, jobs, and equal pay.
Women in the Workforce
During the 1980's, the number of women workers increased, and each year women accounted for a larger percentage of the total workforce, rising from 41.8 percent in 1980 to 44.9 percent in 1989, according to the Bureau of Labor Statistics.
Year | Civilian Labor Force (in thousands) | Female Employees (in thousands) | % of Total | Male Employees (in thousands) | % of Total
1980 | 78,010 | 32,593 | 41.8 | 45,417 | 58.2
1981 | 80,273 | 33,910 | 42.2 | 46,363 | 57.7
1982 | 82,014 | 34,870 | 42.6 | 47,144 | 57.4
1983 | 83,615 | 35,712 | 42.7 | 47,903 | 57.2
1984 | 86,001 | 37,234 | 43.3 | 48,767 | 56.7
1985 | 88,426 | 38,779 | 43.9 | 49,647 | 56.1
1986 | 90,500 | 39,767 | 44.0 | 50,733 | 56.0
1987 | 92,965 | 41,105 | 44.2 | 51,860 | 55.8
1988 | 94,870 | 42,254 | 44.6 | 52,616 | 55.4
1989 | 97,318 | 43,650 | 44.9 | 53,668 | 55.1
Source: Department of Labor, Bureau of Labor Statistics.
Once these laws were in place, women entered professional schools in vast numbers, and opportunities began to open in typically male professions. Between 1970 and 1985, the proportion of female physicians almost doubled, and between 1983 and 1984 half of all applications to medical schools were from women. Women earned one-third of all U.S. law degrees being granted by the mid-1980's, and their status in the legal profession improved. Former Supreme Court justice Sandra Day O'Connor's experience is typical of the obstacles professional women faced. When she graduated from Stanford Law School in the top 10 percent of her class in 1952, she received only one job offer—for the position of legal secretary for a San Francisco law firm. On July 7, 1981, Reagan, who had pledged in his presidential campaign to appoint the first woman to the Supreme Court, nominated O'Connor to be an associate justice of that prestigious body. She served on the Supreme Court from 1981 until her retirement in 2006. The formerly male-dominated field in which women found the greatest opportunities during the 1980's was academia. Although women tended to teach in less prestigious institutions than men, to teach more courses, to have less access to research
resources, and to be concentrated in the lower-status disciplines of the humanities, education, and social work, they nevertheless grew to occupy a quarter of the full-time college and university faculty membership. Until the 1980's, few women were able to pursue careers in science. If a woman did manage to acquire the necessary education, subtle barriers excluded her from scientific networks and encouraged her to remain in a marginal position, with little or no support from colleagues. Because women were discouraged from studying mathematics and because various specialties were still considered "male," women were underrepresented in engineering, physics, chemistry, mathematics, and the earth sciences. In the traditionally female profession of nursing, many nurses concerned themselves with providing psychosocial services to patients, while others specialized in technical services, such as radiology. New knowledge and technologies in the 1980's demanded that almost all nurses acquire sophisticated technological expertise for their practices. Nurses also sought more responsibilities as nurse practitioners, midwives, and anesthetists. Several problems were common to nursing, in addition to restricted opportunities for advancement. Lack of autonomy and respect, low pay, often difficult working conditions, and lack of unity among nurses led to strategies to improve their status. In the late 1980's, it was found that nurses' working conditions presented serious physical, chemical, and biological hazards. Nurses were often exposed to infectious diseases and suffered high rates of hepatitis B and staphylococcus infections. In addition, nurses were exposed to toxic chemical agents and to carcinogens and radiation hazards. Teaching, another traditionally female profession, was, unlike nursing, highly organized. In 1985-1986, 630,000 teachers were members of the American Federation of Teachers, and 1,537,967 (about 70 percent of public school teachers) were members of the National Education Association (NEA). During the 1980's, teacher shortages began to emerge in such fields as mathematics, the sciences, data processing, and computer programming. Teachers began to lose their enthusiasm for the profession for some of the same reasons nurses had lost enthusiasm for theirs: lack of autonomy and respect, low pay, and often difficult working conditions.
Women in Blue-Collar Occupations
The term "blue collar" denotes men and women who work at manual labor: semiskilled operatives, skilled craft workers, and unskilled laborers. Blue-collar workers are generally involved in some type of production process: making and repairing goods and equipment. Traditionally, women were excluded from such jobs or were limited to a small number of low-wage, semiskilled jobs in the textile, apparel, and electronics industries. During the 1980's, some changes occurred, and the work of women became divided into traditional and nontraditional categories. Patience, dexterity, and speed are the main requirements for the operative jobs within the traditional textile and apparel industries. These jobs do not require formal experience or training and can be learned within a few days or weeks, making them good opportunities for women with little education, little training, or limited English fluency. Nontraditional occupations for women as carpenters, machinists, coal miners, and transportation operatives proved more satisfying and better-paying than traditional jobs, as well as more challenging and rewarding. When women started to enter these jobs in significant numbers in the 1980's, studies showed that
these women had secured these jobs through great initiative. On the job, they were not averse to filing union grievances and reporting sex discrimination to state and federal agencies when the occasion demanded.
Impact The great influx of women into the workforce during the 1970's and 1980's changed the ways people think about work. Young women began to realize that any occupation they dreamed of was possible for them to achieve. Girls began to think about future careers in a more challenging way, often dreaming of a career in law or medicine instead of a more traditional field. Men, especially married men, slowly revised their views of women in the workplace. Husbands and wives grew to respect each other's achievements. In response to the rising numbers of employed and employable women, new employment opportunities arose. While society debated the role of government in providing child care, private and corporate facilities for preschools and infant care arose. Demand for nannies continued to grow. The food industry also felt the impact, as the need for take-out food and home delivery grew. Women's apparel also changed during the decade, as designers competed to provide suitable office attire for women. These changes had lasting effects on society in the United States, the repercussions of which are still being felt and studied.
Further Reading
Acker, Joan. Doing Comparable Worth: Gender, Class, and Pay Equity. Philadelphia: Temple University Press, 1989. Reveals inconsistencies and inadequacies in pay structures of the 1970's and 1980's.
Ferber, Marianne, Brigid O'Farrell, and La Rue Allen, eds. Work and Family: Policies for a Changing Work Force. Washington, D.C.: National Academy Press, 1991. Describes and advocates many policy changes to help double-income families.
Hochschild, Arlie Russell. The Time Bind: When Work Becomes Home and Home Becomes Work. New York: Henry Holt, 1997. A fascinating study of a husband and wife employed by a family-friendly company and the challenges they meet as they try to achieve a balanced life.
Stromberg, Ann Helton, and Shirley Harkness, eds. Women Working: Theories and Facts in Perspective. 2d ed. Palo Alto, Calif.: Mayfield, 1988. An important, comprehensive collection of essays that covers such topics as "Contributions of Marxism and Feminism to the Sociology of Women and Work" and "The Data on Women's Labor Force Participation."
Sheila Golburgh Johnson
See also Affirmative action; Consumerism; Feminism; Marriage and divorce; O'Connor, Sandra Day; Power dressing; Unemployment in Canada; Unemployment in the United States; Unions; Women's rights.
■ Women’s rights Definition
The movement to attain equal rights for women in Canada and the United States
During the 1980's, women's rights in the United States and Canada changed significantly, primarily in terms of reproductive and workplace rights. Changes in the rights of women affected both genders throughout the 1980's and helped to shape gender attitudes as well as produce a backlash against these changes that would shape American politics and facilitate the rise of the conservative right.
Third-Wave Feminism
In both the United States and Canada, the 1980’s saw the advent of third-wave feminism, which would shape feminist approaches to politics and rights. Whereas first-wave feminism of the early twentieth century focused on correcting legal inequalities, such as gaining the right to vote, and second-wave feminism of the 1960’s and 1970’s focused on unofficial inequalities, such as discrimination, third-wave feminism focused on production and work, reproduction and sexuality, and gender and the state. This third wave was heavily influenced by postmodern discourse and its evaluation of the politics of representation. Rather than viewing all women as a homogeneous group, as second-wave feminists had, third-wave feminists looked at other important characteristics that complicated attempts to categorize women, such as race, ethnicity, socioeconomic status, religion, and class. The underlying motive of this philosophy was an acknowledgment that women did not universally share their needs and experiences, which differed according to other characteristics. The needs and experiences of
a middle-class, well-educated black woman, for example, differed from those of her white, lower-class counterpart. These questions of race, class, and sexuality were central to third-wave feminism, as were a multitude of issues concerning women in the workplace, such as the glass ceiling, inequities in pay, sexual harassment, and maternity leave. Other areas of concern for third-wave feminists included representations of women in the media that may lead to eating disorders or unrealistic body standards, the lack of role models for girls, the presentation of women solely as sexualized objects, and antifeminism. The third wave relied on the second wave to produce women accustomed to thinking about feminist issues and questioning gender norms, and women for whom the workplace had been made accessible. Throughout the 1980's, these attitudes would shape feminist political thought and expectations—as well as fuel conservative backlash that would end up negating many of the advances made by women in earlier decades by removing legal acts and agencies designed to promote equity in education and the workplace. One extreme and tragic example of the hatred inspired by changes in the rights of women would take place in an engineering classroom at the École Polytechnique, University of Montreal, on December 6, 1989, when fourteen female students were gunned down by Marc Lépine, a man who claimed that feminists had ruined his life.
The Equal Rights Amendment
The Equal Rights Amendment (ERA) proved to be a major legal battleground for proponents of women's rights throughout the early 1980's. Originally proposed in 1923 as the "Lucretia Mott Amendment" by suffragist leader Alice Paul, the ERA was intended to bar legal discrimination against women. It had been introduced in every session of Congress from 1923 to 1970, and it finally was presented for ratification in 1972, with a seven-year deadline, which was later extended by another thirty-nine months. By 1979, thirty-five of the required thirty-eight states had ratified the amendment, but five of the states had withdrawn their ratification. In 1981, a federal court ruled that the extension had been unconstitutional and that the rescindment of ratification had been valid (Idaho v. Freeman, 1981). The National Organization for Women (NOW), formed by Betty Friedan in 1966,
Significant Events Affecting Women in the 1980’s 1980 Medicaid cannot be used to pay for abortions, according to the U.S. Supreme Court decision Harris v. McRae. The United Nations Second World Conference on Women is held in Copenhagen, Denmark.
1981 In County of Washington v. Gunther, the U.S. Supreme Court rules that women can seek remedies for sex-based wage discrimination under the provisions of Title VII of the Civil Rights Act of 1964. Sandra Day O’Connor becomes the first woman appointed to the U.S. Supreme Court. The Equal Rights Amendment (ERA) fails to meet the deadline for state ratification. The Congresswomen’s Caucus reorganizes as the Congressional Caucus for Women’s Issues and admits male members. The Reagan administration closes down the Office of Domestic Violence.
1982 Hysterectomy Educational Resources and Services (HERS) is founded to provide information about alternatives to this frequently performed surgical procedure. The ERA dies after failing to attain the necessary thirty-eighth state ratification; the amendment is reintroduced annually in Congress thereafter.
1983 In City of Akron v. Akron Center for Reproductive Health, the U.S. Supreme Court strikes down a state law requiring a twenty-four-hour waiting period before abortion and mandating that the physician must tell the patient that a fetus is a “human life from moment of conception.” The Coalition Against Media Pornography is founded in Canada to protest the airing of soft-core pornography on cable television.
1984 Congress passes the Child Support Enforcement Amendments to give women means of collecting late child support payments. Jeanne Sauvé becomes the first female governor-general of Canada; she serves until 1990. Antiabortion women in Canada create Real, Equal, Active for Life (REAL) and claim to speak for “real” women of the nation. Congress passes the Retirement Equality Act. In Grove City v. Bell, the U.S. Supreme Court rules that Title IX of the Education Amendments of 1972 applies only to college programs receiving direct federal support; nonfederally funded programs, such as women’s athletics, are required to comply. The Democratic Party nominates Geraldine Ferraro for vice president, making her the first female candidate of a major party in the United States. Despite the appeal of her candidacy and the gender gap in voting, President Ronald Reagan defeats her running mate, Walter Mondale. The Reagan administration ends U.S. financial contributions to international birth control programs.
1985 EMILY’s List is established to raise funds for Democratic women’s campaigns; the acronym EMILY stands for Early Money Is Like Yeast—it makes the dough rise.
1985 (continued) Feminists Andrea Dworkin and Catharine A. MacKinnon draft an antipornography ordinance for Indianapolis, Indiana, that calls works of pornography an infringement on women’s rights; the Feminist Anti-Censorship Task Force (FACT) is founded to oppose the ordinance.
1986 In Meritor Savings Bank v. Vinson, the U.S. Supreme Court unanimously recognizes that sexual harassment in the workplace represents a violation of Title VII of the Civil Rights Act of 1964. The U.S. Supreme Court strikes down an antipornography ordinance as a violation of the First Amendment in American Booksellers Association v. Hudnut. Randall Terry creates Operation Rescue to close down abortion clinics.
1987 Canada passes the Pay Equity Act. Fund for the Feminist Majority (FFM) is launched to place women in positions of leadership in business, education, government, law, and other fields.
1988 Congress passes the Civil Rights Restoration Act, restoring the ability to enforce provisions of Title IX of the Education Amendments of 1972. The Canadian Supreme Court strikes down a federal law regulating abortion as unconstitutional. Congress passes the Women’s Business Ownership Act. Congress passes the Family Support Act, which is designed to enforce child support orders and to promote self-sufficiency among large numbers of female welfare recipients.
1989 In Wards Cove Packing Company v. Atonio, the U.S. Supreme Court shifts the burden of proof in Title VII employment discrimination cases to the plaintiffs, requiring them to show that the employment practices they challenge actually cause the discrimination they claim to have suffered. The U.S. Supreme Court in Lorance v. AT&T Technologies rules that an employee filing a complaint about unfair employment practices must do so within 180 days of the alleged violation. The opinion is regarded as a setback for female employees who cannot anticipate the deadlines that they confront in filing charges. Antonia Novello is appointed U.S. surgeon general, the first woman and first Latino to fill that position. The U.S. Supreme Court rules in Webster v. Reproductive Health Services that states have the authority to limit a woman’s ability to obtain an abortion.
attempted to appeal the U.S. district court decision, but in 1982 the Supreme Court declared the ERA dead and any attempts to resuscitate it invalid. Opposition to the ERA was at times an exercise in irony. For example, one of the ERA’s most vocal opponents, conservative spokeswoman Phyllis Schlafly, declared that the ERA would “take away the marvelous legal rights of a woman to be a full-time wife and mother in the house supported by her husband.”
Schlafly was herself a Harvard-educated lawyer and a two-time congressional candidate. The fate of the ERA reflected a shift in political attitudes and a rise in the power of conservative Republicans such as Schlafly and Paul Weyrich. Critics of the ERA declared that the amendment would have granted too much power to Congress and the federal courts. Campaigns played on public fears, declaring that the ERA would mean that women
would be drafted into military service or have to perform heavy labors or that the government would be forced to recognize same-sex marriage. One highly successful campaign posited the rise of mandatory mixed-sex sports teams and bathrooms. The possible impact of the ERA on abortion laws, however, was the fear that opponents of the ERA repeatedly played on the most. On November 15, 1983, the Democratic majority in the House of Representatives tried to pass the ERA again, starting the ratification process from scratch, under a procedure that prevented the consideration of any amendments. Fourteen cosponsors proceeded to vote against the bill, insisting on an amendment proposed by Congressman James Sensenbrenner that read, “Nothing in this Article shall be construed to grant, secure, or deny any right relating to abortion or the funding thereof.” Reproductive Rights The focus on reproductive rights by ERA opponents highlighted many fears about the changing status of women and the ERA’s potential impact on antiabortion laws. This focus was in part fueled by the practice of states using ERAs to challenge antiabortion policies, efforts that succeeded in Connecticut and New Mexico. Reproductive rights continued to be a major area of contention throughout the 1980’s. In 1986, antiabortion activist Randall Terry founded Operation Rescue, whose mission was to block access to familyplanning clinics. Antiabortion leaders not only pointed to the rights of the fetus but also asserted that legal abortion challenged the potential father’s right to control the family. That same year, George Gilder wrote in his book Men and Marriage that making abortion and birth control available to women reduced the penis to “an empty plaything.” Multiple cases were brought to court defending the rights of fathers, usually against women who would not comply with male demands or who had recently filed for divorce. Many doctors stopped offering the abortion procedure; by 1987, 85 percent of the counties in the United States had no abortion services. The 1980’s also saw the implementation of fetal protection policies by numerous companies, including fifteen major corporations, such as Dow and General Motors. These policies barred women from jobs that had been traditionally male and that paid high wages but that involved exposure to harmful chemicals or radiation that might harm a fetus.
Around that time, the Reagan administration barred investigation into the harmful effects of video display terminals (VDTs), machines employed in traditionally female occupations. When the National Institute for Occupational Safety and Health attempted to investigate higher rates of reproductive problems among women working with VDTs, the Office of Management and Budget demanded that questions dealing with fertility and stress be dropped from a survey administered to the workers, saying that such questions had no practical utility. Although evidence existed that industrial toxins caused birth defects through both male and female exposure, no laws were passed to protect workers of either sex, and reducing the level of toxins in the workplace was not considered. In 1984, the case Oil, Chemical, and Atomic Workers International Union v. American Cyanamid Co., brought by a group of women who had undergone sterilization in order to keep their jobs, reached a federal court. Federal appellate judge Robert H. Bork ruled in favor of the company, saying that the fetal protection policy was valid and that the company had been sued only because “it offered the women a choice.” The company’s settlement of $200,000 was divided among the eleven plaintiffs.

The Conservative Backlash to Feminism
The few victories for women’s rights made in earlier decades did not go unchallenged in the 1980’s. Conservative leaders such as Weyrich and Howard Phillips proclaimed that women’s equality led only to unhappiness for women. Conservatives criticized the women’s movement for destroying moral values and dismantling the traditional family, while evangelist and Moral Majority cofounder Jerry Falwell declared that “the Equal Rights Amendment strikes at the foundation of our very social structure” in his 1980 book Listen, America! In 1981, the Heritage Foundation, a conservative think tank, produced Mandate for Leadership, which warned against “the increasing leverage of feminist interests” and claimed that feminists had infiltrated government agencies. That same year, the Heritage Foundation drafted its first legislative effort: the Family Protection Bill, which sought to dismantle legal achievements of the women’s movement. Its proposals included the elimination of federal laws supporting equal education, the forbidding of “intermingling of the sexes in any sport or other school-related activities,” the requirement that marriage and motherhood be
emphasized as a career for girls, the revocation of federal funding from any school using textbooks that portrayed women in nontraditional roles, the repeal of all federal laws regarding domestic violence, and the banning of any federally funded legal aid for women seeking abortion counseling or a divorce. The bill provided tax incentives designed to discourage women from working, such as allowing a husband to establish a tax-deductible retirement fund only if his wife had earned no money the previous year. Perhaps as a result of such efforts, new female judicial appointments fell from 15 percent of appointments to 8 percent during Reagan’s first term, and the proportion fell even lower during the second term. Despite federal regulations requiring the Justice Department to set hiring goals aimed at increasing the number of women within its ranks, by 1986 Attorney General Edwin Meese III had yet to hire a woman as a senior policy maker. The Federal Women’s Program, which had been established in 1967 to increase the number of women in government agencies, was dismantled by removing its recruitment coordinators and its budget. As a result of the Paperwork Reduction Act, the federal government ceased collecting recruitment statistics on women. The highest female post on the Reagan staff was held by Faith Whittlesey, assistant to the president for public liaison, covering women’s and children’s issues. A central target for the Heritage Foundation was the Women’s Educational Equity Act (WEEA) program and its director, Leslie Wolfe. The only federal program to promote equal education for girls, the WEEA program had been called one of the most cost-effective programs in the government. After taking office in 1981, Reagan removed 25 percent of the program’s already-approved budget and declared his intention to completely remove its funding the following year. The program’s supporters succeeded in winning a reprieve for the program, although not without some casualties: 40 percent of the program’s budget was cut. Its field reader staff, which evaluated grant proposals, was replaced with women from Schlafly’s Eagle Forum who were unfamiliar with the program’s policies and methods and who did such things as repeatedly reject proposals to study sexual discrimination on the basis that such discrimination did not exist. A year later, Wolfe was fired along with every other woman on staff, and the
office was demoted to the lowest level of bureaucracy.

Women and the Workplace
Unlike their American counterparts, Canadian women were achieving some significant legal victories. As a result of the Royal Commission on Equality in Employment, led by Judge Rosalie Abella in 1984, the Employment Equity Act was passed in 1986, requiring employers to identify and remove unnecessary barriers to the employment of women and minorities. The same year, the Federal Contractors Employment Equity Program was implemented, requiring contractors with at least one hundred employees who were providing goods or services for the Canadian government to implement employee equity. In 1988, the Family and Medical Leave Act was introduced in the U.S. Congress and failed to pass, having been tied to the ABC Childcare bill and an antipornography bill that proved too contentious. It entitled employees to family leave in certain cases involving birth, adoption, or serious medical conditions and protected the employment and benefit rights of employees taking such leave. Such a bill would not pass until 1993.

In 1983, a thirty-nine-year-old newscaster, Christine Craft, filed suit against her former employer, Metromedia, owner of a Kansas City affiliate of the American Broadcasting Company (ABC), on charges of sex discrimination related to her dismissal from her position. Among the reasons Metromedia allegedly cited for dismissing her were that she was “too old, too unattractive, and not deferential to men.” Other women working for the company confirmed that she was not alone in her treatment; they cited Metromedia’s “fanatical obsession” with their appearance and had also felt pressured to quit as a result of failing to meet the company’s appearance standards for women. No men reported similar treatment. In August, 1983, the case was tried at the federal district court in Kansas City, Missouri. The jury returned a unanimous verdict in Craft’s favor, awarding her $500,000 in damages. U.S. district court judge Joseph E. Stevens threw out the verdict and called for a second trial in Joplin, Missouri. Stevens justified Craft’s dismissal on the grounds that it was not based on sex discrimination but on an application of market logic. In the second trial, which took place in 1984, the jury again decided in Craft’s favor. Metromedia appealed the decision, and this time
the verdict was overturned by the U.S. Court of Appeals for the Eighth Circuit. The Supreme Court refused to hear the case, ending the legal struggle in Metromedia’s favor.

While the concept of sexual harassment became more visible in the 1980’s, such cases were usually dismissed or determined in the defendant’s favor in the United States. In Meritor Savings Bank v. Vinson, which reached the U.S. Supreme Court in 1986, Mechelle Vinson sued her employer, the Meritor Savings Bank, on the grounds of sexual harassment, arguing that her employer had subjected her to fondling, exposure, and rape. The district court had ruled that her attractive appearance could be held against her and that testimony regarding whether or not her clothing could be considered provocative could determine whether or not she had encouraged the rape on her employer’s part. Across the border in Canada, some legal victories occurred. In 1987, systemic discrimination in the hiring of women was found to be illegal in the case C.N.R. v. Canada, and in 1989 the Canadian Supreme Court ruled that sexual harassment was a form of discrimination in Janzen v. Platy Enterprises.

In academia, the case of Nancy Shaw led to increased awareness of women’s studies. Shaw, a lesbian and women’s studies professor at the University of California, Santa Cruz, brought suit against the university when she was denied tenure in 1982 on the basis that her work on women’s health in prison was not sufficiently scholarly. Shaw’s legal battle ran for five years; in 1987, she was granted tenure and allowed to return to teaching.

Impact
Events of the 1980’s would shape women’s political experience throughout the following decades and force politicians to consider their positions on women’s rights and to make a conscious effort to include or exclude those issues from their campaigns. Workplace legislation passed in the 1980’s would shape the experience of both Canadian and American women in the labor force and eventually open up new professions while removing some barriers and implementing others.

Further Reading
Brownmiller, Susan. In Our Time: Memoir of a Revolution. Boston: Dial Press, 1999. Provides an insider’s view of the Meese Commission’s investigation of pornography and the feminists on either side of the pornography debate.
Craft, Christine. Too Old, Too Ugly, Not Deferential to Men. Rocklin, Calif.: Prima, 1986. Craft’s book details her legal struggles with Metromedia and media coverage of the case, which included accusations that she was lesbian as well as charges that she had been fired because she “wasn’t worth the money.”
Critchlow, Donald T. Phyllis Schlafly and Grassroots Conservatism: A Woman’s Crusade. Princeton, N.J.: Princeton University Press, 2005. Describes Schlafly’s political career, including her battle against the ERA and her far-right stance on reproductive issues and feminism.
Faludi, Susan. Backlash: The Undeclared War Against America’s Women. New York: Crown, 1991. Describes the conservative backlash evoked by legal victories for women in the 1980’s and points to media representations of women showcasing this reaction.
Kauffman, Linda S. American Feminist Thought at Century’s End: A Reader. Cambridge, Mass.: Blackwell, 1993. This anthology contains a diverse selection of essays showcasing some of the major strands of feminist thought in the 1980’s and 1990’s.
MacKinnon, Catharine A. Feminism Unmodified: Discourses on Life and Law. Cambridge, Mass.: Harvard University Press, 1988. Collection of insightful lectures given by feminist legal scholar MacKinnon that discusses the legal and social subjugation of women.
Wolf, Naomi. The Beauty Myth: How Images of Beauty Are Used Against Women. New York: William Morrow, 1991. Describes media backlash against gains in women’s rights and how it shapes women’s work experience.
Cat Rambo
See also Abortion; Affirmative action; Bork, Robert H.; Conservatism in U.S. politics; Craft, Christine; Domestic violence; Dworkin, Andrea; École Polytechnique massacre; Environmental movement; Feminism; Ferraro, Geraldine; Gender gap in voting; Liberalism in U.S. politics; Marriage and divorce; Meritor Savings Bank v. Vinson; Pornography; Reagan, Ronald; Sexual harassment; Supreme Court decisions; Webster v. Reproductive Health Services; Women in the workforce.
■ Wonder Years, The
Identification: Television series
Date: Aired from March 15, 1988, to May 12, 1993

Unlike other family sitcoms of the 1980’s, The Wonder Years was a coming-of-age mixture of comedy and drama. The show’s complex tone came in part from its setting in the turbulent 1960’s.

The Wonder Years was primarily a show about the anxieties of teenage life, but because of its context, it uniquely appealed to both children growing up in the suburbs in the 1980’s and their parents, who had grown up twenty years earlier, during the era portrayed by the show. Although sitcoms like Cheers and The Cosby Show dominated the ratings in the 1980’s, The Wonder Years earned a place in the Nielsen top ten for two of its seasons, and it won the Emmy Award for best comedy in 1988. The series followed the daily life of adolescent Kevin Arnold (played by Fred Savage) in American
suburbia during the 1960’s, a time of extreme turmoil and change in the United States. Kevin struggled with many of the typical complications of teenage life: acne, dating, conflicts with authority, fighting with siblings, and trying to negotiate the difficult transition from boyhood to manhood. Kevin represented the average American kid growing up in an average suburb, and the show illuminated his relationships with his mother (played by Alley Mills), father (played by Dan Lauria), brother Wayne (played by Jason Hervey), hippie sister Karen (played by Olivia d’Abo), best friend Paul (played by Josh Saviano), and heartthrob Winnie (played by Danica McKellar). Episodes were narrated in voice-over by an adult Kevin (voiced by Daniel Stern), who reflected in the present about the past events portrayed in each episode.

The Wonder Years differed from such other 1980’s sitcoms as The Cosby Show, Married . . . with Children, and Family Ties in that Kevin struggled through his adolescence against the backdrop of large-scale social events such as the Vietnam War, the counterculture, the early stages of space exploration, and changes in gender roles in American society. The show appealed to and navigated the nostalgia for earlier decades so central to 1980’s culture, as well as a lingering cultural need to understand the social upheaval of the 1960’s from the vantage point of the 1980’s.

Cast members of The Wonder Years celebrate their Emmy Award for Best Comedy Series at the 1988 ceremony. From left: Alley Mills, Jason Hervey, producer Jeff Silver, Josh Saviano, Olivia D’Abo, and Dan Lauria. (AP/Wide World Photos)

Impact
The Wonder Years had a widely varied viewership. As a result, the show sold air time to a broader than average range of advertisers, seeking to reach the show’s wider than average demographic. This marketing strategy was partially responsible for the show’s success. The series also offered a retrospective look at the turmoil of the 1960’s and allowed viewers to gain some historical, as well as personal, understanding from each episode.
Further Reading
Gross, Edward A. The Wonder Years. Las Vegas: Pioneer Books, 1990.
Lasswell, Mark. TV Guide: Fifty Years of Television. New York: Crown, 2002.
Roman, James. Love, Light, and a Dream: Television’s Past, Present, and Future. Westport, Conn.: Praeger, 1996.
Jennifer L. Amel
See also Back to the Future; Cheers; Cosby Show, The; Family Ties; Married . . . with Children; Sitcoms; Television.
■ World music
Definition: A marketing category for music originating from, influenced by, or blending or incorporating elements of non-Western musical traditions

Large concerts and recordings with famous popular musicians in the 1980’s gave traditional musicians a chance to be heard by a greatly expanded audience. The music industry began to market traditional musicians by using a more blended or universalist approach.

Until the 1980’s, most published recordings of traditional music (with the notable exception of productions controlled by academics) were marketed as a kind of tourist experience, with appeal to the exoticism and adventure of hearing something “different.” In contrast to this, scholars in the field of ethnomusicology had developed methods for the formal study of music on a global scale, and a few Western classical composers had already utilized non-Western rhythmic and melodic elements in their works. Starting in the 1950’s, jazz musicians embraced Latin music, followed by pianist Dave Brubeck with meters from Eastern Europe, saxophonist John Coltrane with African and East Indian musical concepts, and other musicians who were interested in expanding their horizons. The term “world music” was used in academic circles during the 1960’s as a way of identifying the breadth of global traditions and to express cross-cultural identification. Interest in global traditions accelerated in the late 1960’s when members of the popular British rock group the Beatles began using Indian music and instruments in some of their pieces.
New Age and World Music
Later, some musicians began exploring the possibilities of music, especially world music, for holistic therapy and meditation, in a related movement known as New Age music. This term is credited to guitarist William Ackerman, founder of the Windham Hill recording label. Jazz flutist Paul Horn became known for incorporating the actual acoustic environments of various famous sacred spaces around the world. Similarly, jazz saxophonist Paul Winter, who also tended to blur the distinction between New Age and world music, became interested in environmental sounds, especially finding affinities with animal sounds. Another important exponent of New Age music was R. Carlos Nakai, a flutist and composer of Native American (Navajo and Ute) ancestry who began using a traditional cedar flute in compositions that blended native melodies with ideas inspired by his own cultural experiences, including a deep reverence for nature.

By the 1980’s, music fans had become accustomed to hearing elements of global traditions blended with jazz and popular music. Indian tabla virtuoso Zakir Hussain, who started the Diga Rhythm Band with drummer Mickey Hart of the Grateful Dead in the 1970’s, was active in experimenting with combining world percussion instruments and styles. Hart also collaborated with Nigerian drummer Babatunde Olatunji during the 1980’s. Jazz guitarist John McLaughlin, who played in Shakti (a quartet with Hussain and two South Indian musicians) during the 1970’s, continued to include Indian elements in his music during the 1980’s, when he formed a trio
that included Indian percussionist Trilok Gurtu, who played with Oregon, a world music fusion group that started in the 1970’s. A new interest in European folk traditions, including Celtic and klezmer music among others, also emerged in the 1980’s. The Klezmer Conservatory Band was established in the early 1980’s, and McLaughlin supplemented his Indian music and jazz activities by collaborating with flamenco guitarist Paco De Lucía.

Global Influences on WOMAD and Minimalism
One of the most noticeable world music phenomena that emerged during the 1980’s was the World of Music, Arts and Dance festival (WOMAD), initially sponsored by Peter Gabriel, a famous rock musician whose political identification with the struggle against racial segregation in South Africa had inspired him to explore that country’s music. The first festival—which included Gabriel, drummers from Burundi, and other musicians—was held in 1982 and attracted more than fifteen thousand people. The festival expanded in subsequent years, being held in many different countries, as often as ten times per year, and has featured thousands of musicians. In 1986, world music got another important boost when American musician Paul Simon recorded Graceland, which included the South African choral group Ladysmith Black Mambazo. Although more publicity was generated by the activities of popular musicians, intercultural influence was expanding in elite circles as well. A movement known as minimalism, which also represented a break with the extreme complexity of many twentieth century compositions, incorporated aesthetic concepts as well as musical elements from Asian, African, and other sources. Philip Glass, who had supplemented his graduate work in composition with lessons from Indian sitar virtuoso Ravi Shankar; Terry Riley, who was also influenced by Indian music; and Steve Reich, who was interested in rhythms from Indonesia and Africa, were just a few of the composers who successfully incorporated global elements.
Impact
Criticisms have been raised that the world music trend aspires to breadth but sacrifices depth. Even the breadth has been called into question, because the less easily understood, less danceable genres have been underrepresented. The technological ability to “sample” or electronically capture fragments of sound and then use them in other contexts
is also problematic. There is concern that globalization might obscure or even destroy some unique aspects of musical traditions and the identities they reflect. On the other hand, many musicians benefited artistically from the stimulating and challenging collaborations and benefited financially from the increased exposure resulting from the explosion of interest in world music during the 1980’s. Most of the musicians involved in these activities have continued them into the twenty-first century, and their audiences are still listening.

Further Reading
Feld, Steven. “A Sweet Lullaby for World Music.” Public Culture 12, no. 1 (Winter, 2000): 145-171. This critical essay provides an overview of the history of the term “world music,” with attention to the 1980’s, and explores the moral, technical, and legal implications of musical appropriation.
Fletcher, Peter. World Musics in Context: A Comprehensive Survey of the World’s Major Musical Cultures. New York: Oxford University Press, 2004. A comprehensive study that includes a historical overview, focusing on connections among the world’s peoples.
Nidel, Richard. World Music: The Basics. New York: Routledge, 2004. This accessible book is primarily descriptive of the most popular genres, but still fairly broad, with 130 countries represented.
Witzleben, Lawrence. “Whose Ethnomusicology? Western Ethnomusicology and the Study of Asian Music.” Ethnomusicology 41, no. 2 (1997): 220-242. Explores fundamental issues and possibilities in the cross-cultural academic study of the musics of the world.
John Myers
See also Classical music; Glass, Philip; Jazz; Music; Native Americans; Pop music.
■ World Wrestling Federation
Identification: Professional sports entertainment organization

The forerunner of the World Wrestling Entertainment empire, the World Wrestling Federation in the 1980’s changed the face of professional wrestling. The company transformed the industry from a fractured regional structure with loose alliances into a cohesive national organization
and emphasized professional wrestling as a form of entertainment rather than an authentically competitive sport.

During the early years of professional wrestling, a variety of regional organizations dominated the industry, and despite monikers that denoted global competition, most activities focused on personalities and events in the northeastern United States. In 1980, however, a young Vincent McMahon founded Titan Sports, parent company to the World Wrestling Federation (WWF). McMahon, a third-generation wrestling entrepreneur, set out to build a national organization; throughout the 1980’s, he purchased regional organizations and developed national promotional strategies that ran counter to the industry’s traditional territory system. The WWF’s biggest rivals during this period were the National Wrestling Alliance (NWA), a group of northeastern independent wrestling promotions, and the American Wrestling Association (AWA), a Minneapolis-based territorial organization that held to the tenet that wrestling was to be presented as a traditional sport. The AWA’s matches were aired weekly on the Entertainment and Sports Programming Network (ESPN) on cable television.

WWF Goes Nationwide
In the early 1980’s, McMahon infuriated rival promoters by syndicating his wrestling events to television stations nationwide and by selling videotapes of matches via his Coliseum Video distribution company. He used the revenues generated by televised and videotaped matches, as well as by advertising, to lure major-name wrestlers from other organizations. The WWF’s most significant talent acquisition of the 1980’s was wrestling superstar Hulk Hogan (Terry Gene Bollea), who had gained national recognition with his appearance in the film Rocky III (1982). Hogan was frequently pitted against another WWF employee poached from a rival promoter, the Scottish kilt-wearing bodybuilder Roddy Piper (Roderick George Toombs). This pairing created a sense of ongoing, bitter rivalry that rapidly became a mainstay of professional wrestling. A host of memorable names and personalities followed, including ultrapatriot Sgt. Slaughter (Robert Remus) and his Iranian nemesis the Iron Sheik (Hossein Khosrow Ali Vaziri). Additional WWF acquisitions during the 1980’s included future Minnesota governor Jesse “The Body” Ventura, typically acting as a commentator rather than competitor
because of health problems; Don Muraco (Don Morrow), a huge Hawaiian wrestler with an arrogant and intimidating persona; and the nearly seven-foot-tall André the Giant (André René Roussimoff). Roussimoff, the product of a rare pituitary disorder, also appeared both on television series and in film; he may be best known for his role in the 1987 classic The Princess Bride.

Among other significant changes during this period was the introduction in 1985 of a nationwide pay-per-view championship event, WrestleMania, billed by WWF promoter McMahon as the Super Bowl of professional wrestling. Unlike other national wrestling events, which generally attracted only dedicated wrestling fans, WrestleMania targeted a wider, more mainstream audience by involving celebrities outside wrestling, such as Mr. T and Cyndi Lauper. McMahon later identified the introduction of WrestleMania as a major turning point in the identification of professional wrestling as “sports entertainment.” MTV also helped promote professional wrestling during the 1980’s, in what was termed the “Rock ’n’ Wrestling Connection,” by featuring significant WWF coverage and programming. Throughout the remainder of the 1980’s, the WWF’s business continued to boom, thanks to its blossoming empire and the popularity of Hulk Hogan, who remained the federation’s golden boy through the early 1990’s. However, toward the end of the 1980’s, it appeared that Hogan’s popularity had begun to decline, in part because it seemed that he was virtually unbeatable.

Impact
The 1980’s became known in the industry as the Second Golden Age of Wrestling for revitalizing the sport by wedding it with showmanship. Wrestlers wore their hair long, reminiscent of the biblical strongman Samson, and donned elaborate, glittering costumes. Intricate, soap-opera-like plotlines enhanced the wrestling matches, luring viewers from all over the world by the millions. This focus on entertainment also de-emphasized fair play in favor of dramatic elements such as cheating, extremely violent acts both inside and outside the ring, shouting matches, and sexual, financial, and relational intrigue, raising the ire of many social critics. Equally important, this period marked the establishment of the WWF as the primary player in the professional wrestling industry, bringing the sport and its questionable social effects to a truly global audience.
Further Reading
Ball, Michael R. Professional Wrestling as Ritual Drama in American Popular Culture. Lewiston, N.Y.: Edwin Mellen Press, 1990. Investigates professional wrestling from a sociological perspective as a reflection of working-class values.
Beekman, Scott M. Ringside: A History of Professional Wrestling in America. Westport, Conn.: Praeger, 2006. Examines the disreputable reputation of professional wrestling compared to other sports.
Guttman, James. World Wrestling Insanity: The Decline and Fall of a Family Empire. Toronto: ECW Press, 2006. Provides an exposé of the McMahon monopoly on professional wrestling and examines issues such as racism, creativity, and manipulation of the industry.
Hackett, Thomas. Slaphappy: Pride, Prejudice, and Professional Wrestling. New York: HarperCollins, 2006. Uses interviews with wrestlers, promoters, and fans to investigate a range of issues surrounding professional wrestling, including fame, masculinity, violence, performance, and play.
Mazer, Sharon. Professional Wrestling: Sport and Spectacle. Jackson: University Press of Mississippi, 1998. Examines how professional wrestling performances are constructed and promoted and how fans deal with the artificial nature of the sport.
Soulliere, Dannelle M. “Wrestling with Masculinity: Wrestling with Images of Manhood in the WWE.” Sex Roles 55 (July, 2006): 1-11. Study examining messages about manhood presented in professional wrestling.
Tamborini, Ron, et al. “The Raw Nature of Televised Professional Wrestling: Is the Violence a Cause for Concern?” Journal of Broadcasting and Electronic Media 49, no. 2 (2005): 202-220. Study linking physical violence portrayed in professional wrestling with harm to viewers.
Cheryl Pawlowski
See also Action films; Advertising; Cable television; Children’s television; Lauper, Cyndi; Martial arts; Mr. T; MTV; Sports; Television.
■ Wright, Jim
Identification: Speaker of the U.S. House of Representatives, 1987-1989
Born: December 22, 1922; Fort Worth, Texas

Wright’s rise and fall from power reflected the battles taking place between liberals and conservatives in American politics during the Reagan years.

As the 1980’s began, Democrat Jim Wright of Texas was one of the most powerful members of the U.S. House of Representatives. First elected in 1954, he had attained the position of House majority leader by 1976. His ascent to power continued as the decade progressed, and in January of 1987 he was elected House Speaker following the retirement of Democrat Tip O’Neill. Wright’s political philosophy was shaped by the New Deal and Great Society eras and thus stood in direct opposition to President Ronald Reagan’s philosophy of lower taxes and smaller government. As Speaker, Wright sought to expand the position’s role and to give it a stronger voice in the creation of national policy as a means of offering opposition to the president. When Reagan blocked Wright’s domestic policy efforts by refusing to raise taxes and by blaming soaring budget deficits on the Democrats, Wright attempted to challenge the president in the area of foreign policy, notably the Iran-Contra affair. Wright justified this foray into foreign policy on the grounds that he, as Speaker of the House, represented the American people as much as the president did.

In addition to his battles with the president and House Republicans, Wright ruled his own party with an iron hand, insisting on absolute loyalty and exercising strict discipline among his fellow Democrats. To counter Wright’s growing power, and reflecting the growing conflict between liberal and conservative forces within U.S. politics at the time, Republican congressman Newt Gingrich of Georgia set out on a personal mission to remove Wright from office. Following in the pattern of Democratic attacks on prominent Republican appointees and officeholders such as Robert H. Bork and Attorney General Edwin Meese III, Gingrich sought to expose questionable financial dealings and ethics violations by Wright. The most prominent of these violations involved sales to lobbyists of a self-published autobiography, Reflections of a Public Man, and a job and other perquisites received by his wife. In the end, none of
these activities proved to be technically illegal, being based on various loopholes in the ethics rules, and they were not inconsistent with the practices of other House members at the time. Still, the appearance of misconduct, combined with the persistence of Gingrich’s attacks, eventually forced Wright to resign as Speaker on May 31, 1989, and to give up his House seat shortly thereafter.

House Speaker Jim Wright, left, talks with Senator Alan Cranston and Representative Nancy Pelosi on Capitol Hill in June, 1987. (AP/Wide World Photos)

Impact
Wright’s political success during the early 1980’s and his fall from power at the end of the decade were key events in the battle taking shape between liberal and conservative elements in government during the Reagan years. His story is also a study in the personal quest for power and the hubris that can be associated with it.
Further Reading
Barry, John M. The Ambition and the Power. New York: Viking Press, 1989.
Taylor, Stuart, Jr. “Wright’s Deeds Pale Next to Systemic Corruption.” The New Jersey Law Journal 123, no. 23 (June 8, 1989): 12.
Wright, Jim. Reflections of a Public Man. Fort Worth, Tex.: Madison, 1984.
Scott Wright
See also Bork, Robert H.; Congress, U.S.; Conservatism in U.S. politics; Iran-Contra affair; Liberalism in U.S. politics; Meese, Edwin, III; O’Neill, Tip; Reagan, Ronald; Reagan Revolution; Reaganomics; Scandals.
X

■ Xanadu Houses
Definition: Experimental homes designed to showcase new architectural methods and home technology
Place: Wisconsin Dells, Wisconsin; Kissimmee, Florida; and Gatlinburg, Tennessee

The three Xanadu Houses built in the United States in the 1980’s were meant to showcase and promote new architectural methods and home automation systems, though in reality the houses were merely tourist attractions, and their methods and ideologies were never widely adopted.

The Xanadu House project was born in 1979, the brainchild of Bob Masters, who envisioned a future of ergonomically designed houses built with novel materials and featuring advanced computer technology. The first Xanadu House was designed by architect Stewart Gordon and built in Wisconsin Dells, Wisconsin. The second, and by far the best known of the houses, was designed by Roy Mason and built in 1983 in Kissimmee, Florida, to take advantage of the tourist population drawn to the area by Disney’s Experimental Prototype Community of Tomorrow (EPCOT) Center. The final house was located in Gatlinburg, Tennessee.

The houses were designed to be energy efficient and very quickly built by spraying polyurethane insulating foam over inflatable balloon forms. The resulting structures were bright white and had gently curving lines, both inside and out, which, visitors often thought, resembled something from a science-fiction film. An integrated computer system controlled virtually every aspect of the homes’ functioning, from watering plants in the greenhouses to suggesting nutritious menus and helping to prepare meals. The designers intended for such labor-saving devices to leave more time for families to come together around the “electronic hearth”—a high-technology entertainment center featuring multiple televisions, video games, stereo equipment, and, in at least one of the homes, a video screen showing an image of a cozy fire.

Impact
Though they were meant to showcase serious architectural possibilities for the future and to change the way people interacted with their shelters, in truth the Xanadu Houses never quite rose above the status of curiosity or tourist attraction. The Kissimmee Xanadu House, by far the most popular of the three, attracted more than one thousand visitors per day during its peak of popularity in the mid-1980’s. Despite their grand vision, however, the architects never really reckoned with the tastes and preferences of ordinary home buyers. No one, in fact, ever lived in any of the Xanadu Houses. The rooms in the houses were small, and the curved walls could make them feel cramped and cavelike; the building materials were not well suited to stand up over the long term to the ravages of weather; and many people found the designs, reminiscent of science fiction, strange and even ugly. Perhaps most important, rapid developments in technology made many of the homes’ “futuristic” features quickly obsolete. The Wisconsin and Tennessee houses were demolished in the 1990’s, and even the once-popular Florida house closed in 1996 and was demolished in 2005.

Further Reading
Mason, Roy, et al. “A Day at Xanadu.” Futurist 18 (February, 1984): 17-24.
Mason, Roy, Lane Jennings, and Robert Evans. Xanadu: The Computerized Home of Tomorrow and How It Can Be Yours Today! New York: Acropolis Books, 1983.
Janet E. Gardner
See also Architecture; CAD/CAM technology; Computers; Deconstructivist architecture.
Y

■ Yankovic, Weird Al
Identification: American comedy songwriter and performer
Born: October 23, 1959; Downey, California

Yankovic’s humorous songs portrayed and sometimes parodied popular culture and music of the 1980’s.

Born Alfred Matthew Yankovic, “Weird Al” received his nickname as a deejay at his university’s radio station. After some minor successes, the singer-songwriter became known through the help of deejay Dr. Demento, whose syndicated weekly radio show popularized novelty songs of the past and showcased new talent such as Yankovic. Yankovic’s first album, “Weird Al” Yankovic, appeared in 1983, followed by “Weird Al” Yankovic in 3-D (1984), Dare to Be Stupid (1985), Polka Party! (1986), and Even Worse (1988).

Weird Al Yankovic poses for the press at the 1987 MTV Video Music Awards in Universal City, California. (Hulton Archive/Getty Images)

Most of his work uses the music of popular songs whose lyrics he playfully alters, sometimes putting an ironic twist on the original. “Fat,” on the 1988 album, for example, mimics Michael Jackson’s 1987 “Bad” but substitutes bragging about girth. Many of Yankovic’s songs praise food, including “I Love Rocky Road,” based on the 1982 cover version of “I Love Rock ’n Roll” by Joan Jett and the Blackhearts, and “Addicted to Spuds,” based on Robert Palmer’s 1985 song “Addicted to Love.” Other songs comment on television and film, especially science fiction: “Yoda,” to the tune of the Kinks’ 1970 hit “Lola,” was also popular with fans. His lyrics also refer to mundane aspects of life, from sales jobs to having a hernia to paying alimony. Yankovic generally avoids political commentary, although his original song “Christmas at Ground Zero” (1986) reflects Cold War fears of nuclear war. Some of his best songs satirize the originals: “Dare to Be Stupid” does not parody any particular single song by New Wave band Devo but captures and exaggerates the group’s tone, while “(This Song’s Just) Six Words Long” summarizes “Got My Mind Set on You,” recorded by George Harrison in 1987. Many of Yankovic’s songs became excellent music videos, including “Eat It,” which parodied the video for Jackson’s 1983 hit single “Beat It,” as well as “Like a Surgeon,” based on “Like a Virgin” and featuring Madonna-like gyrations, and “I Lost on Jeopardy,” based on the 1983 song “Jeopardy” by the Greg Kihn Band. In 1989, Yankovic cowrote and starred in the film UHF, which satirized television and movies. Home Box Office (HBO) aired a “mockumentary” of Weird Al’s life, issued in 1991 as The Compleat Al.
Impact
A number of Yankovic’s albums reached gold or platinum status in the United States and Canada in the 1980’s and later decades. Yankovic continued to produce successful albums, including 1992’s Off the Deep End, which parodied Nirvana’s Nevermind (1991) album cover and the rock band’s hit “Smells Like Teen Spirit,” and 2003’s Poodle Hat, which went gold and earned the Grammy Award for Best Comedy Album. Yankovic’s satire is accurate but good-humored; he does not anticipate trends but identifies current, major aspects of culture.

Further Reading
http://www.weirdal.com
Insana, Tino, and Weird Al Yankovic. The Authorized Al. Chicago: Contemporary Books, 1985.
Bernadette Lynn Bosky
See also Comedians; Dance, popular; Devo; Jackson, Michael; Madonna; Music; Music videos; Pop music; This Is Spinal Tap.
■ Yellowstone National Park fires
The Event: A devastating series of fires driven by drought and high winds burns thousands of acres
Date: June 22-September 11, 1988
Place: Yellowstone National Park in Wyoming, Montana, and Idaho

Causing $120 million in damages, the Yellowstone fires were the most costly in U.S. history. Media attention from the fires and a policy allowing natural fires to burn sparked intense public debate.

A bomber drops liquid fire retardant in Yellowstone National Park to combat the 1988 forest fires. (NPS photo by Jeff Henry)

In 1972, the U.S. National Park Service adopted the “natural-burn” policy allowing lightning-ignited fires to burn when there was no threat to human life or property. Between 1972 and 1987, 235 naturally caused fires burned 33,759 acres within Yellowstone National Park. These fires were credited with reducing surplus fuel that accumulated following
years of fire suppression and with restoring the natural role of fire in improving forest growth and wildlife habitat. In a normal year, rainfall helps contain fires. However, in 1988 Yellowstone experienced its driest season on record, with 32 percent of normal annual precipitation. The park’s fire season began with a lightning strike on June 22 that ignited a stand of lodgepole pine. Buildups of dry fuel combined with high winds spread the flames rapidly, and by the end of July almost 99,000 acres had burned. Reacting to increased media attention, park managers elected to suppress all fires in the park. The single worst day of the 1988 fire season was August 20, called “Black Saturday,” when 40-mile-per-hour winds pushed a firestorm across 150,000 acres. Flames reached two hundred feet in the air. As a result of danger associated with the fires, many of the park’s roads and facilities were closed to visitors. On September 6, fire swept through the Old Faithful area, destroying sixteen cabins but sparing the Old Faithful Inn. The first snowfall on September 11 helped contain the fires.

Of fifty fires that burned within Yellowstone in 1988, forty-one were caused by lightning and nine by human activities. The total area burned within the park was 793,000 acres, amounting to about 36 percent of the park’s 2,221,800 acres. More than twenty-five thousand firefighters participated in efforts to save human life and property. Destruction was limited to sixty-seven structures worth more than $3 million. Remarkably, none of Yellowstone’s famous attractions or historic lodges was damaged by fire. Impacts on wildlife were also relatively low, given the magnitude of acreage burned. Field surveys revealed that the number of animals killed included 9 bison, 12 moose, 6 black bears, and 345 elk (out of an estimated elk population of 40,000). Ample precipitation during the years immediately following the fires led to rapid regeneration of trees in most burned areas.

Impact
Nearly twenty years after the 1988 fires in Yellowstone, many burned areas remained visible. The Yellowstone fires of that year created a national debate concerning the natural-burn policy. In the years immediately following the fires, public land managers across the United States revised fire management plans with strict guidelines for circumstances under which naturally occurring fires would be allowed to burn.
Further Reading
Patent, Dorothy H. Yellowstone Fires: Flames and Rebirth. New York: Holiday House, 1990.
Wallace, Linda. After the Fires: The Ecology of Change in Yellowstone National Park. New Haven, Conn.: Yale University Press, 2004.
Thomas A. Wikle
See also Environmental movement; Natural disasters.
■ Yuppies
Definition: Young, well-educated, well-paid urban professionals who live an affluent lifestyle

This group emerged as a growing middle class in the United States during the 1980’s. Yuppies became a dominant political and cultural force in society, focusing on successful careers, economic privilege, and materialism.

American journalist Bob Greene of the Chicago Tribune is recognized as the first individual to use the term “yuppies” (coined from “young urban professionals” and later associated with “young upwardly mobile professionals” as well) in his syndicated column in March, 1983. Yuppies were an ambitious, competitive, self-reliant, and upwardly mobile class between the ages of twenty-five and thirty-nine that earned salaries of more than $40,000 per year. Newsweek declared 1984 the “Year of the Yuppie,” especially after Democratic senator Gary Hart ran his presidential campaign espousing yuppie values.

The Yuppie Lifestyle
Throughout the decade, an economic boom occurred in the United States. Careers in business administration, law, and medicine became the fastest ways to achieve a good salary and advancement. Universities and colleges that offered these programs experienced a dramatic increase in enrollment. Yuppies held high-paying white-collar jobs in metropolitan areas. As overachievers, they brought work home at night and on the weekends if necessary, living by schedules and appointment books. Because they spent so much time working, they needed to live in close proximity to their jobs. New housing markets sprang up in the inner cities, and developers began to renovate buildings, turning them into sleek condominiums or studio apartments designed with postmodern elements.
Yuppies were part of the “new rich” generation that reveled in extravagance by purchasing luxury items. Conspicuous consumption was typical for this segment of American society. Marketing campaigns and advertisements targeted this demographic group, raised on popular culture and rock music. Yuppies “dressed for success” by wearing the latest designer fashions; men wore suits from Brooks Brothers and shirts by Perry Ellis, purchased expensive Rolex watches, and drove BMW cars. Casual style was achieved by wearing clothes from Banana Republic and L.L.Bean, while women would often wear Nike running shoes while scurrying from one place to another, even if they were dressed in a tweed skirt and jacket. Yuppies, because of their hectic schedules, ate out at trendy ethnic restaurants while sipping the best house wine. However, despite their expensive tastes, these professionals remained health-conscious with low-fat diets, all-natural fruit drinks, and bottled water. For exercise, they were often found jogging. Self-help books became best sellers, and a New Age guru, the Reverend Terry Cole-Whittaker, spread the yuppie-inspired message, “You can have it all—now!”

Yuppies were obsessed with technological gadgets that would make their hectic lives more efficient. They were voracious consumers of videocassette recorders (VCRs), personal computers, cordless phones, answering machines, microwave ovens, food processors, and fax machines. Television shows such as Dynasty and Dallas, which depicted powerful families who continued to amass wealth by any means, appealed to the yuppie generation. On the other hand, Hill Street Blues was popular among yuppies because the show centered on their liberal political ideology concerning social justice. Movies portrayed yuppies and their relationships in The Big Chill (1983), Baby Boom (1987), and When Harry Met Sally . . . (1989). Personal relationships were often secondary to career goals, and this dichotomy was portrayed as “Yuppie angst” in the television program thirtysomething. Yuppies had decided to defer marriage and children until they were firmly established in their professional positions. Those who married but decided not to have children were referred to as “dinks” (double income, no kids), while couples who did have children often hired nannies to care for them.

Impact
Radical antiwar activists who epitomized the counterculture movement in previous decades
started to work for corporate America as the 1970’s drew to an end. The idealism of the late 1960’s began to be replaced as the baby-boom generation became older. The yuppies agreed with President Ronald Reagan’s supply-side economics and its promotion of free market capitalism, rejecting the socioeconomic liberalism of the New Deal. They favored cuts in social spending and rejected high taxes and government regulation, but, despite their fiscal conservatism, yuppies remained liberal on positions that involved personal freedom and lifestyle choices. Many yuppies were in favor of the Equal Rights Amendment (ERA), were pro-choice regarding abortion, and opposed discrimination in the workplace. The superficial and selfish nature of yuppiedom created fodder for parody by journalists and comedians. However, economic prosperity came to an abrupt halt when a stock market crash (Black Monday) occurred on October 19, 1987. The fast money that yuppies had accumulated in Wall Street investments suddenly disappeared, and by the early 1990’s businesses began to suffer financially as globalization, massive layoffs, and downsizing in the marketplace occurred. In 1991, Time magazine officially proclaimed the death of the yuppie.

Further Reading
Adler, Jerry, et al. “The Year of the Yuppies.” Newsweek 104, no. 31 (December, 1984): 14-24. The popular magazine proclaimed 1984 as the year in which the yuppie generation dominated in politics, advertising, and business.
Bondi, Victor, ed. “Baby Boomers Become Yuppies.” In American Decades: 1980-1989. Detroit: Gale Research, 1995. The entry provides good background information and a general overview of yuppies.
Burnett, John, and Alan Bush. “Profiling the Yuppies.” Journal of Advertising Research 26 (April/May, 1986): 27-35. Authors study the lifestyle differences, purchasing behavior, and media habits of yuppies in order to formulate strategies for advertising to this segment of the American population.
Ehrenreich, Barbara. Fear of Falling: The Inner Life of the Middle Class. New York: HarperPerennial, 1989. Ehrenreich provides a social analysis of the insecurities and anxieties that plagued the middle class from 1970 to 1990.
Hammond, John L. “Yuppies.” The Public Opinion
Quarterly 50 (Winter, 1986): 487-501. Hammond analyzes the political persuasion of the yuppie population, noting the group’s liberal stance on personal life choices, but he also contends that the group is not always as conservative on social welfare issues as portrayed by the media.
Hertzberg, Hendrik. “The Short Happy Life of the American Yuppie.” In Culture in an Age of Money: The Legacy of the 1980’s in America, edited by Nicolaus Mills. Chicago: Ivan R. Dee, 1990.
A satirical essay about the moral and political decline of yuppiedom.
Gayla Koerting
See also Advertising; Big Chill, The; Business and the economy in the United States; Consumerism; Demographics of the United States; Dynasty; Fads; Food trends; Hill Street Blues; L.A. Law; Power dressing; Reaganomics; thirtysomething; When Harry Met Sally . . .
■ Entertainment: Major Films of the 1980’s

The one hundred titles listed here are a representative sampling of 1980’s films that are regarded as significant because of their box-office success, their Academy Award honors, or their critical reputations. Entries that include “See also main entry” have a full essay in The Eighties in America. All references to awards refer to the Academy Awards given by the Academy of Motion Picture Arts and Sciences.
1980

Airplane! (Howard W. Koch/Paramount; dir. Jim Abrahams, David Zucker, Jerry Zucker) This hugely popular satire of disaster films inspired numerous other spoofs of film genres. Features deadpan performances by veteran actors Lloyd Bridges, Peter Graves, and Robert Stack, sparking a second career for Leslie Nielsen as a comic actor. See also main entry.

Caddyshack (Orion/Warner Bros.; dir. Harold Ramis) Slapstick golf farce pokes fun at rich people played by Rodney Dangerfield and Chevy Chase. Bill Murray dominates the hilarity as psychotic groundskeeper trying, at all costs, to rid his course of gophers.

Coal Miner’s Daughter (Universal; dir. Michael Apted) Sissy Spacek won an Oscar for her portrayal of country singer Loretta Lynn. The film traces Lynn’s rise from Kentucky poverty to fame and focuses on her happy marriage to Mooney Lynn (Tommy Lee Jones) and her friendship with ill-fated singer Patsy Cline (Beverly D’Angelo).

The Elephant Man (Brooks Films/Paramount; dir. David Lynch) The true story of the hideously deformed John Merrick (John Hurt), a sideshow freak in Victorian London until rescued by Dr. Frederick Treves (Anthony Hopkins).

The Empire Strikes Back (Lucasfilm/Twentieth Century-Fox; dir. Irvin Kershner) Considered by many fans to be the best of the Star Wars series, this sequel to the original finds Luke Skywalker (Mark Hamill) absorbing wisdom from Yoda (voice of Frank Oz) and discovering a secret about Darth Vader (voice of James Earl Jones, body of David Prowse). The film won Oscars for sound and special effects. See also main entry.

Fame (MGM/United Artists; dir. Alan Parker) A singer (Irene Cara), a dancer (Gene Anthony Ray), and an actor (Paul McCrane) are among the students at New York’s High School for the Performing Arts. The film won Oscars for Michael Gore’s score and the title song by Gore and Dean Pitchford. Some of the actors continued their roles in the 1982-1987 television series.

Friday the 13th (Sean S. Cunningham/Paramount; dir. Sean S. Cunningham) A summer camp cook (Betsy Palmer) seeks revenge for the accidental death of her son, Jason (Ari Lehman), twenty-five years earlier. Followed by several sequels, this grisly horror film helped popularize the decapitation-of-vacuous-teenagers genre. A young Kevin Bacon plays one victim.

Ordinary People (Wildwood/Paramount; dir. Robert Redford) A teenager (Timothy Hutton) feels guilty for the drowning death of his brother and is not helped by his harsh parents (Mary Tyler Moore and Donald Sutherland). The adaptation of Judith Guest’s novel won Oscars for Best Picture, Best Director, and Best Supporting Actor (Hutton) and for Alvin Sargent’s screenplay. See also main entry.

Raging Bull (United Artists; dir. Martin Scorsese) Scorsese’s portrait of brutal middleweight boxer Jake La Motta (Robert De Niro) is one of the director’s most acclaimed films. Some critics’ polls have named it the best film of the decade. De Niro and editor Thelma Schoonmaker won Oscars. See also main entry.

The Shining (Stanley Kubrick/Warner Bros.; dir. Stanley Kubrick) Accompanied by his wife (Shelley Duvall) and young son (Danny Lloyd), a writer (Jack Nicholson) becomes winter caretaker of a remote mountain hotel and slowly sinks into madness. Kubrick’s adaptation of a Stephen King novel was critically lambasted but has slowly attained cult status.

1981

Arthur (Orion/Warner Bros.; dir. Steve Gordon) Arthur (Dudley Moore), an alcoholic millionaire, resists an arranged marriage to a socialite (Jill Eikenberry) and falls for a shoplifter (Liza Minnelli). The film was a hit despite its anachronistic 1930’s premise. As Arthur’s butler and father figure, John Gielgud won an Oscar, as did the theme song.

Atlantic City (Cine Neighbor/France 3 Cinema/Planfilm/SDICC/Selta Films/Paramount; dir. Louis Malle) An aging mobster (Burt Lancaster) who fears he has lost his touch falls for a casino croupier (Susan Sarandon). Inspired by Malle’s direction and John Guare’s script, Lancaster gives one of his best performances.

Body Heat (Ladd Company/Warner Bros.; dir. Lawrence Kasdan) A mediocre lawyer (William Hurt) in a small Florida town is manipulated by a femme fatale (Kathleen Turner) into murdering her rich, older husband (Richard Crenna). Screenwriter Kasdan’s first directorial effort was one of the most satisfying tributes to film noir, made a star of Turner in her first film, and featured a moody score by John Barry.

Chariots of Fire (Allied Stars/Enigma Productions/Twentieth Century-Fox; dir. Hugh Hudson) The conflicting personalities of a devout Christian runner (Ian Charleson) and a Jewish sprinter (Ben Cross) are examined at the 1924 Olympics in Paris. With an outstanding performance by Ian Holm as a coach, the film won four Oscars, including Best Picture and Colin Welland’s original screenplay.

Heaven’s Gate (United Artists; dir. Michael Cimino) The decade’s most notorious box-office failure depicts a range war between immigrant settlers and cattle barons. The film was blamed for ending the creative freedom given directors during the previous decade and for crippling United Artists. See also main entry.

Modern Romance (Columbia; dir. Albert Brooks) A neurotic film editor (Brooks) breaks up with his girlfriend (Kathryn Harrold) and does whatever he can to try to forget his troubles. An additional problem is his attempt to salvage a weakly conceived science-fiction film. One of Brooks’s most satisfying comedies, Modern Romance shows the influence of Woody Allen.

On Golden Pond (Associated Film Distribution/IPC Films/ITC Films/Universal; dir. Mark Rydell) A daughter (Jane Fonda) and her crusty father (Henry Fonda) come to an understanding during a summer at their New England cottage. Henry Fonda won an Oscar for his final film, and Katharine Hepburn, as the mother, won her fourth and last. Ernest Thompson also won for his adaptation of his play. See also main entry.
Prince of the City (Orion/Warner Bros.; dir. Sidney Lumet) As with Serpico (1973), Lumet bases his examination of corruption in the New York Police Department on an actual case. A Manhattan detective (Treat Williams) becomes an outcast for breaking his department’s so-called code of silence. Raiders of the Lost Ark (Lucasfilm/Paramount; dir. Steven Spielberg) The first film about archaeologist/adventurer Indiana Jones (Harrison Ford) is a large-scale version of the low-budget movie serials of the 1930’s and 1940’s. It won four Oscars. See also main entry. Reds (Paramount; dir. Warren Beatty) Beatty’s longtime dream project about John Reed, the only American buried in the Kremlin, focuses equally on his radical political activism during the Russian Revolution and his turbulent romance with fellow journalist Louise Bryant (Diane Keaton), involving a love triangle with playwright Eugene O’Neill (Jack Nicholson). Interspersed throughout the film are interviews with those who knew the couple and the period. The epic film won Oscars for Best Director, Best Supporting Actress (Maureen Stapleton as Emma Goldman), and Best Cinematography (Vittorio Storaro).
1982

Blade Runner (Ladd Company/Warner Bros.; dir. Ridley Scott) A cop (Harrison Ford) in 2019 Los Angeles deals with runaway androids. Scott's visionary blend of science fiction and film noir has been highly influential, gaining considerably in reputation since its release. See also main entry.

Diner (MGM/United Artists; dir. Barry Levinson) Levinson's first nostalgic look back at the Baltimore of his youth finds six young men in 1959 taking their first awkward steps into adulthood. The impressive cast includes Mickey Rourke, Daniel Stern, Kevin Bacon, Steve Guttenberg, Tim Daly, Paul Reiser, and Ellen Barkin, in her first film.

E.T.: The Extra-Terrestrial (Universal; dir. Steven Spielberg) Spielberg's distinctive blend of science fiction, fairy tale, family drama, and coming-of-age tale was one of the most popular and beloved films of the decade and won four Oscars. See also main entry.

Fast Times at Ridgemont High (Universal; dir. Amy Heckerling) The decade's most popular and influential teen sex comedy is notable for its cast: Jennifer Jason Leigh, Phoebe Cates, Judge Reinhold, and Sean Penn as the stoned surfer. Nicolas Cage, Eric Stoltz, and Forest Whitaker also have small roles. See also main entry.

Gandhi (Goldcrest Films International/Indo-British Films/International Film Investors/National Film Development Corporation of India/Columbia; dir. Richard Attenborough) Attenborough's biography of the Indian independence leader (Ben Kingsley) won eight Oscars, including Best Picture, Best Director, and Best Actor.

Shoot the Moon (MGM/United Artists; dir. Alan Parker) Influenced by the considerations of marriage and adultery in the films of Ingmar Bergman, Bo Goldman's perceptive screenplay examines middle-class mores. Albert Finney and Diane Keaton give outstanding performances as the unhappy Marin County, California, couple.

Sophie's Choice (Associated Film Distribution/ITC Entertainment/Keith Barish Productions/Universal; dir. Alan J. Pakula) Meryl Streep won her second Oscar as a Polish refugee with a shocking secret in 1947 Brooklyn. The moody cinematography of Néstor Almendros highlights Pakula's adaptation of William Styron's 1979 novel.

Tootsie (Mirage/Punch Productions/Columbia; dir. Sydney Pollack) Dustin Hoffman plays a character loosely based on his experiences as a hard-luck New York actor. Unlike the real Hoffman, his character dresses as a woman to win a role on a television soap opera. One of the decade's most popular comedies features several outstanding performances, including Bill Murray as the actor's playwright roommate and director Pollack as his frustrated agent. The film was nominated for ten Oscars but won only one, for Jessica Lange's supporting performance as the confused object of the actor's affections.

Tron (Buena Vista/Walt Disney; dir. Steven Lisberger) A computer programmer (Jeff Bridges) becomes trapped in a video game in the first major film to deal with this new medium. See also main entry.

The Verdict (Zanuck Company/Twentieth Century-Fox; dir. Sidney Lumet) Paul Newman gives one of his finest performances as an aging, failed Boston lawyer given a chance to redeem himself in a complicated malpractice case. James Mason is his adversary and Charlotte Rampling a woman with a secret. David Mamet adapted Barry Reed's novel.
1983

The Big Chill (Carson Production Group/Delphi Productions/Columbia; dir. Lawrence Kasdan) Kasdan followed Body Heat with an even bigger commercial success about a group of baby boomers, including Glenn Close, William Hurt, and Kevin Kline, who examine their dissatisfaction with their lives. See also main entry.

Flashdance (Polygram/Paramount; dir. Adrian Lyne) The story of a Pittsburgh welder (Jennifer Beals) who performs erotic dances in a bar after work won an Oscar for the title song. See also main entry.

Local Hero (Enigma Productions/Goldcrest Films International/Warner Bros.; dir. Bill Forsyth) Houston oilmen Peter Riegert and Burt Lancaster visit a small Scottish fishing village planning to exploit it and find themselves unexpectedly changed by their experiences. Forsyth's delightful film may be the decade's most charming.

El Norte (Independent Productions/Island Alive/Cinecom Pictures; dir. Gregory Nava) Two Guatemalans (Zaide Silvia Gutiérrez and David Villalpando) flee the oppression of their government and travel through Mexico into California to begin new lives. Nava's film, cowritten with Anna Thomas, is one of the most notable treatments of the hardships faced by illegal immigrants.

Return of the Jedi (Lucasfilm/Twentieth Century-Fox; dir. Richard Marquand) Jabba the Hutt and the child-friendly Ewoks make their first appearances as Luke Skywalker (Mark Hamill) and friends continue the Star Wars saga and their battle against the Empire. Its special effects received an Oscar.

The Right Stuff (Ladd Company/Warner Bros.; dir. Philip Kaufman) Kaufman's adaptation of Tom Wolfe's book about the first astronauts may be the decade's best fact-based film. Stealing the film from Scott Glenn as Alan Shepard, Ed Harris as John Glenn, Fred Ward as Gus Grissom, and Dennis Quaid as Gordon Cooper is Sam Shepard as Chuck Yeager, the pilot considered to be too reckless for space travel. It won four Oscars.

Risky Business (Geffen Pictures/Warner Bros.; dir. Paul Brickman) Tom Cruise became a star playing a Chicago high school senior involved with a prostitute (Rebecca De Mornay) while his parents are out of town. The decade's best look at sexual awakening and the mores of upper-middle-class teenagers is also memorable for its use of one of the most popular songs of the 1980's, Phil Collins's "In the Air Tonight," in an unusual romantic scene.

Scarface (Universal; dir. Brian De Palma) Director Howard Hawks's groundbreaking 1932 gangster drama is updated for the 1980's by screenwriter Oliver Stone as the tale of a ruthless Cuban immigrant (Al Pacino) who becomes a Miami crime lord. Featuring graphic violence, including a famous buzz saw scene, the film has become one of the most quoted of all time for such lines as "Say hello to my little friend." Its reputation has increased since its release, and the film has become a particular favorite in the hip-hop culture.

Silkwood (ABC/Twentieth Century-Fox; dir. Mike Nichols) An Oklahoma factory worker (Meryl Streep) becomes outraged at her employer's indifference to radiation contamination. She dies in a mysterious accident en route to give evidence to a New York Times reporter.

Terms of Endearment (Paramount; dir. James L. Brooks) This look at the relationship between a mother (Shirley MacLaine) and daughter (Debra Winger) won Oscars for Best Picture, Best Director, Best Actress (MacLaine), and Best Supporting Actor (Jack Nicholson as one of MacLaine's suitors) and for Brooks's adaptation of the 1975 Larry McMurtry novel. See also main entry.
1984

Amadeus (Saul Zaentz Company/Orion; dir. Milos Forman) Based on Peter Shaffer's play, Amadeus examines the jealousy of composer Antonio Salieri (F. Murray Abraham) over the success of the young Wolfgang Amadeus Mozart (Tom Hulce). It won eight Oscars, including Best Picture, Best Director, and Best Actor (Abraham).

Beverly Hills Cop (Paramount; dir. Martin Brest) Eddie Murphy became a superstar as a Detroit cop on the trail of a gangster (Steven Berkoff) in Beverly Hills. Murphy is at his most self-assured in one of the decade's biggest commercial hits.

Blood Simple (River Road Productions/Circle Releasing; dir. Joel Coen) Joel and Ethan Coen launched their careers with this black comedy. An unfaithful wife (Frances McDormand) and her lover (John Getz) plot to murder her bar-owner husband (Dan Hedaya), who has hired a private eye (M. Emmett Walsh) to kill her in this affectionate send-up of film noir conventions.

Ghostbusters (Black Rhino/Delphi Productions/Columbia; dir. Ivan Reitman) Bill Murray and Dan Aykroyd look for spirits in Manhattan. This special-effects-laden hit was the most expensive comedy made to this point. See also main entry.

A Nightmare on Elm Street (Smart Egg Pictures/Media Home Entertainment/New Line Cinema; dir. Wes Craven) A horror franchise was launched as murder victim Freddy Krueger (Robert Englund) seeks revenge through dreams.

Once upon a Time in America (Ladd Company/PSO International/Warner Bros.; dir. Sergio Leone) Leone's epic gangster yarn was drastically reedited by its producers, yet the story of betrayal starring Robert De Niro and James Woods still retained considerable power in its truncated form.

Places in the Heart (TriStar; dir. Robert Benton) A Texas woman (Sally Field) struggles to maintain her family farm during the Depression. It won Oscars for Best Actress and Benton's original screenplay and introduced actor John Malkovich.

Stranger than Paradise (Samuel Goldwyn; dir. Jim Jarmusch) In this deadpan comedy, one of the decade's most significant independent films, three aimless friends (John Lurie, Richard Edson, and Eszter Balint) drift from New York to Cleveland to Miami. A comic highlight is Lurie's explanation of the significance of TV dinners.

The Terminator (Hemdale/Pacific Western/Orion; dir. James Cameron) Cameron established himself as a major director and made Arnold Schwarzenegger a superstar in this science-fiction thriller. See also main entry.

This Is Spinal Tap (Embassy; dir. Rob Reiner) The mockumentary genre began with this account of the American tour of a has-been hard-rock group (Christopher Guest, Michael McKean, and Harry Shearer). See also main entry.
1985

Back to the Future (Amblin Entertainment/Universal; dir. Robert Zemeckis) A high school student (Michael J. Fox) is accidentally transported, in the time machine of an eccentric scientist (Christopher Lloyd), back to the 1950's and into the romance of his parents (Lea Thompson and Crispin Glover). See also main entry.

The Breakfast Club (A&M Films/Universal; dir. John Hughes) The most enduringly popular of the decade's many films about teenagers finds five students (Emilio Estevez, Anthony Michael Hall, Judd Nelson, Molly Ringwald, and Ally Sheedy) suffering Saturday detention in their school library. See also main entry.

Dreamchild (PHF Limited/Thorn EMI/Universal; dir. Gavin Millar) Dennis Potter's imaginative screenplay has the inspiration for Alice's Adventures in Wonderland (1865) traveling to 1932 New York to celebrate the centenary of the birth of Lewis Carroll (Ian Holm). The film cuts between the present, the past, and events from the novel. Coral Browne plays the older Alice and Amelia Shankley the younger, and Jim Henson designed the Wonderland creatures.

Kiss of the Spider Woman (HB Filmes/Island Alive/Sugarloaf Films; dir. Hector Babenco) A political prisoner (Raul Julia) and a homosexual (William Hurt) share a prison cell in this adaptation of Manuel Puig's 1976 novel. Hurt won an Oscar for his sensitive performance. See also main entry.

Lost in America (Geffen Pictures/Warner Bros.; dir. Albert Brooks) Los Angeles yuppies David (Brooks) and Linda (Julie Hagerty) sell everything they own and hit the road to see America, only to lose all their savings and find themselves stranded in a small Arizona town. Brooks satirizes the decade's obsession with money and the continuing adolescence of baby boomers.

Out of Africa (Universal; dir. Sydney Pollack) Karen Blixen (Meryl Streep) marries Baron Bror Blixen (Klaus Maria Brandauer) for convenience in 1914 and moves to Kenya, only to fall in love with hunter Denys Finch Hatton (Robert Redford). This lush romantic drama, inspired by the writings of Isak Dinesen (Karen's pseudonym), won seven Oscars, including Best Picture, Best Director, and Best Score (John Barry).

Prizzi's Honor (ABC/Twentieth Century-Fox; dir. John Huston) Veteran director Huston rebounded from a lengthy slump to make one of his best and most entertaining films. Mafia hit man Charlie Partanna (Jack Nicholson) falls for Irene Walker (Kathleen Turner), not knowing she is also an assassin. Nominated for eight Oscars, it won only for the star-making turn of the director's daughter, Anjelica Huston, as Partanna's spurned lover.

The Purple Rose of Cairo (Orion; dir. Woody Allen) Allen's tender fantasy finds a frustrated woman (Mia Farrow) escaping from her dismal 1930's life by going to films. One day, a character in a film she has seen several times leaves the screen to romance her.

Rambo: First Blood Part II (TriStar; dir. George Pan Cosmatos) The pulpy sequel to First Blood (1982) was a much bigger commercial success than the more realistic original. John Rambo (Sylvester Stallone) is released from prison and dispatched to Vietnam to rescue American prisoners of war.

Witness (Paramount; dir. Peter Weir) An Amish boy (Lukas Haas) witnesses a murder in the restroom of a Philadelphia train station, and police detective John Book (Harrison Ford) links the crime to a conspiracy within his department. Fleeing to the Amish community, Book falls in love with the boy's widowed mother (Kelly McGillis). One of the decade's best thrillers won Oscars for the original screenplay by William Kelley, Earl W. Wallace, and Pamela Wallace and for Thom Noble's editing.
1986

Aliens (Brandywine/Twentieth Century-Fox; dir. James Cameron) The first sequel to Ridley Scott's Alien (1979) was an even bigger hit, with Ripley (Sigourney Weaver) becoming a surrogate mother to an orphan (Carrie Henn). Its special effects won an Oscar. See also main entry.

Blue Velvet (De Laurentiis; dir. David Lynch) The corrupt underbelly of suburbia is exposed as all-American boy Kyle MacLachlan tries to protect innocent Laura Dern and unstable Isabella Rossellini from an especially vicious Dennis Hopper, in the best performance of his long career. See also main entry.

Children of a Lesser God (Paramount; dir. Randa Haines) A speech teacher (William Hurt) falls for a difficult student (Marlee Matlin) in this adaptation of Mark Medoff's play. Matlin won an Oscar as Best Actress.

The Color of Money (Buena Vista/Touchstone; dir. Martin Scorsese) Paul Newman finally won an Oscar in this sequel to The Hustler (1961), which recounts how Fast Eddie Felson becomes a reluctant mentor to an arrogant young pool shark (Tom Cruise).

Crocodile Dundee (Rimfire Productions; dir. Peter Faiman) Australian television personality Paul Hogan, who also cowrote the screenplay, became a star as the legendary crocodile hunter who finds a different set of dangers while visiting New York City.

Ferris Bueller's Day Off (Paramount; dir. John Hughes) A teenager (Matthew Broderick), famous for cutting class, stages one final, elaborate day off before graduation. One of Hughes's biggest hits blends slapstick with social commentary.

Hannah and Her Sisters (Orion; dir. Woody Allen) The prolific Allen's best film of the decade presents the tangled personal lives of three quite different sisters (Mia Farrow, Dianne Wiest, and Barbara Hershey). Wiest and Michael Caine, as Farrow's husband in love with Hershey, won their first Oscars for their supporting roles, and Allen won for his original screenplay.

Platoon (Hemdale/Orion; dir. Oliver Stone) Stone's first film about his Vietnam War experiences finds innocent Charlie Sheen affected by the contrasting personalities of his sergeants: the corrupt Tom Berenger and the saintly Willem Dafoe. It won four Oscars, including Best Picture and Best Director. See also main entry.

She's Gotta Have It (Forty Acres and a Mule Filmworks/Island Pictures; dir. Spike Lee) Groundbreaking for both independent and African American filmmakers, Lee's comedy finds a young Brooklyn woman (Tracy Camila Johns) trying to maintain her personal freedom while juggling relationships with three men (Tommy Redmond Hicks, John Canada Terrell, and Lee himself).

Top Gun (Paramount; dir. Tony Scott) Military clichés were reborn in the year's box-office champion as navy pilot Tom Cruise carries on a romance with civilian consultant Kelly McGillis while conducting a rivalry with fellow pilot Val Kilmer. "Take My Breath Away" won the best-song Oscar.
1987

Broadcast News (Twentieth Century-Fox; dir. James L. Brooks) Television news producer Jane Craig (Holly Hunter) is good at her job but not her personal life. Reporter Aaron Altman (Albert Brooks) secretly yearns for her, while Jane falls for a dumb anchorman (William Hurt) against her better judgment. Jack Nicholson offers a delightful cameo as the smarmy lead anchor. The film was nominated for seven Oscars but did not win any.

Dirty Dancing (Vestron; dir. Emile Ardolino) One of the most popular teen films in a decade dominated by the genre finds Baby Houseman (Jennifer Grey) spending the summer of 1963 in the Catskills and learning sexy dance moves from Patrick Swayze. "The Time of My Life" won the best-song Oscar.

Fatal Attraction (Jaffe-Lansing Productions/Paramount; dir. Adrian Lyne) The one-night stand of a married man (Michael Douglas) and an unbalanced woman (Glenn Close) leads to terror. See also main entry.

Full Metal Jacket (Warner Bros.; dir. Stanley Kubrick) Kubrick's Vietnam drama focuses first on the training of Marine recruits and then on their combat experiences. Vincent D'Onofrio famously gained seventy pounds to play a Marine who cracks under pressure. See also main entry.

Lethal Weapon (Warner Bros.; dir. Richard Donner) A reckless Los Angeles police detective (Mel Gibson) becomes partners with a family man (Danny Glover) whose goal is to stay alive. The film was one of the first to be even more popular on video than in theaters, leading to sequels.

Moonstruck (MGM/United Artists; dir. Norman Jewison) Loretta (Cher), a widowed Brooklyn bookkeeper on the verge of marrying a man (Danny Aiello) she does not love, finds herself falling for his younger brother (Nicolas Cage). It won Oscars for Best Actress, Best Supporting Actress (Olympia Dukakis as Loretta's mother), and Best Original Screenplay by John Patrick Shanley.

The Princess Bride (Act III/Twentieth Century-Fox; dir. Rob Reiner) A pirate (Cary Elwes) strives to rescue his true love (Robin Wright) from an evil prince (Chris Sarandon). William Goldman's adaptation of his tongue-in-cheek fairy tale did modest business in 1987 but has developed cult status.

RoboCop (Orion; dir. Paul Verhoeven) A dead policeman (Peter Weller) is resurrected as a half-human, half-robot fighting force. See also main entry.

The Untouchables (Paramount; dir. Brian De Palma) De Palma and screenwriter David Mamet transformed the popular 1959-1963 television series into a crime epic as Treasury agent Eliot Ness (Kevin Costner) goes up against the powerful gangster Al Capone (Robert De Niro). The film's set piece is a stairway shootout patterned after a scene in Sergei Eisenstein's The Battleship Potemkin (1925). Sean Connery's role as an honest cop earned him an Oscar.

Wall Street (American Entertainment Partners/Twentieth Century-Fox; dir. Oliver Stone) Michael Douglas won an Oscar as an unprincipled corporate raider in Stone's evisceration of 1980's greed. See also main entry.
1988

Beetlejuice (Warner Bros.; dir. Tim Burton) The highly imaginative Burton's first big hit presents dead newlyweds (Alec Baldwin and Geena Davis) who enlist the aid of rambunctious spirit Beetlejuice (Michael Keaton) to rid their house of an obnoxious couple (Catherine O'Hara and Jeffrey Jones), only for Beetlejuice to fall for the yuppies' gloomy daughter (Winona Ryder). Beetlejuice's distinctive makeup received an Oscar.

Big (Twentieth Century-Fox; dir. Penny Marshall) Tom Hanks became a star playing a thirteen-year-old granted his wish to be "big," becoming an adult overnight, getting a job with a toy company, and falling for a fellow employee (Elizabeth Perkins), only to discover that adulthood is not so wonderful.

Bull Durham (Mount Company/Orion; dir. Ron Shelton) One of the decade's sexiest romantic comedies, as well as one of the best baseball films ever, presents career minor-leaguer Crash Davis (Kevin Costner) and his romance with a baseball groupie (Susan Sarandon). Tim Robbins gives a star-making performance as Crash's goofy rival, Nuke LaLoosh.

Die Hard (Gordon Company/Silver Pictures/Twentieth Century-Fox; dir. John McTiernan) A New York cop (Bruce Willis) visits his estranged wife (Bonnie Bedelia) in her Los Angeles office building just as a ruthless criminal (Alan Rickman) and his gang take everyone in the skyscraper hostage. This huge hit spawned sequels and imitations.

A Fish Called Wanda (MGM/United Artists; dir. Charles Crichton) A con artist (Jamie Lee Curtis) plots to obtain jewels stolen by her gangster lover (Tom Georgeson) and falls in love with his stuffy lawyer (John Cleese, who also wrote the screenplay). Kevin Kline won an Oscar for portraying her dim-witted henchman.

The Last Temptation of Christ (Cineplex Odeon Films/Universal; dir. Martin Scorsese) The decade's most controversial film offers a look at the human side of Jesus (Willem Dafoe). See also main entry.

Rain Man (Guber-Peters Company/MGM/United Artists; dir. Barry Levinson) A selfish young man (Tom Cruise) learns he has an autistic older brother (Dustin Hoffman) and discovers his humanity as they travel across the country together. It won Oscars for Best Picture, Best Director, Best Actor (Hoffman), and Best Original Screenplay.

The Thin Blue Line (American Playhouse/Third Floor/Miramax; dir. Errol Morris) Randall Dale Adams was released from prison in 1989 after Morris's 1988 documentary proved he was innocent of a 1976 Texas murder. The failure of the film to earn an Oscar nomination called into question the Academy's procedure for considering documentaries.

The Unbearable Lightness of Being (Saul Zaentz Company/Orion; dir. Philip Kaufman) Adapted from Milan Kundera's 1984 novel, the decade's most erotic drama presents a womanizing Prague surgeon (Daniel Day-Lewis) and his relations with his wife (Juliette Binoche) and his mistress (Lena Olin) against the backdrop of the 1968 Soviet invasion of Czechoslovakia.

Who Framed Roger Rabbit (Amblin Entertainment/Buena Vista/Silver Screen Partners III/Touchstone; dir. Robert Zemeckis) Cooperation between Disney and Warner Bros. allowed most of the 1940's cartoon characters to appear in this live-action and animation tribute to film noir. Its technical virtuosity earned four Oscars. See also main entry.
1989

Batman (Guber-Peters Company/Warner Bros.; dir. Tim Burton) The year's box-office champion finds Burton creating a darker view of the superhero than previously seen in serial and television versions. The affection of Batman/Bruce Wayne (Michael Keaton) for reporter Vicki Vale (Kim Basinger) makes him vulnerable to the evil Joker (Jack Nicholson). Its art direction won an Oscar.

Dead Poets Society (Buena Vista/Silver Screen Partners IV/Touchstone; dir. Peter Weir) Robin Williams subdues his manic style as an unconventional teacher at a 1959 prep school, though the teacher's unorthodox approach to education has tragic consequences. Tom Schulman's original screenplay won an Oscar.

Do the Right Thing (Forty Acres and a Mule Filmworks/Universal; dir. Spike Lee) Lee's look at racial tensions in his native Brooklyn earned him acclaim as a major American filmmaker. See also main entry.

Driving Miss Daisy (Zanuck Company/Warner Bros.; dir. Bruce Beresford) A wealthy Jewish woman (Jessica Tandy) and her black chauffeur (Morgan Freeman) struggle to understand changes in the South during the civil rights era. Alfred Uhry's adaptation of his Pulitzer Prize-winning play earned four Oscars, including Best Picture, Best Actress, and Best Screenplay.

Drugstore Cowboy (Avenue Entertainment; dir. Gus Van Sant) Matt Dillon leads a gang of misfits who rob pharmacies to feed their drug habits in independent filmmaker Van Sant's breakthrough film.

Field of Dreams (Gordon Company/Universal; dir. Phil Alden Robinson) W. P. Kinsella's 1982 novel Shoeless Joe is the basis of one of the most popular baseball films ever. An Iowa farmer (Kevin Costner) recruits a reclusive writer (James Earl Jones) and a doctor (Burt Lancaster) who played one game in the major leagues to witness a miracle in his cornfield.

Indiana Jones and the Last Crusade (Lucasfilm/Paramount; dir. Steven Spielberg) Spielberg's third Indiana Jones film finds Indy (Harrison Ford) joining his father (Sean Connery) on a quest for the Holy Grail in 1938. The film, which won a sound-effects Oscar, set a record by grossing fifty million dollars during its first week of American release.

The Little Mermaid (Buena Vista/Silver Screen Partners IV/Walt Disney; dir. Ron Clements and John Musker) Disney began recovering its reputation for quality animation with this box-office hit about a teenager whose father is king of the sea. Alan Menken won Oscars for best song and score. See also main entry.

sex, lies, and videotape (Outlaw Productions/Miramax; dir. Steven Soderbergh) The decade's most acclaimed independent film launched Soderbergh's career and boosted those of stars Andie MacDowell, James Spader, Laura San Giacomo, and Peter Gallagher. See also main entry.

When Harry Met Sally . . . (Castle Rock Entertainment/Nelson Entertainment/Columbia; dir. Rob Reiner) The most popular romantic comedy of the 1980's shows how longtime friends (Meg Ryan and Billy Crystal) slowly fall in love. See also main entry.

Further Reading
Biskind, Peter. Down and Dirty Pictures: Miramax, Sundance, and the Rise of Independent Film. New York: Simon & Schuster, 2004. Entertaining, informative look at the birth of independent American films.

Brode, Douglas. The Films of the Eighties. Secaucus, N.J.: Carol, 1990. Heavily illustrated overview of the decade's films.

Diawara, Manthia, ed. Black American Cinema. New York: Routledge, 1993. Includes essays on the decade's biracial buddy films and the rise of black independent films.

Haines, Richard W. The Moviegoing Experience: 1968-2001. Jefferson, N.C.: McFarland, 2003. Explains how distribution changes, multiplexes, and home video affected the film industry.

Nowlan, Robert A., and Gwendolyn Wright Nowlan. The Films of the Eighties. Jefferson, N.C.: McFarland, 1991. Encyclopedic look at 3,400 films.

Palmer, William J. The Films of the Eighties: A Social History. Carbondale: Southern Illinois University Press, 1993. Analysis of how the decade's films reflected American society.

Toplin, Robert Brent, ed. Oliver Stone's USA: Film, History, and Controversy. Lawrence: University Press of Kansas, 2000. Essays provide a detailed examination of Stone's work as a director and screenwriter.

Michael Adams
■ Entertainment: Academy Awards

A title or name followed by an asterisk (*) indicates the presence of a full-length essay within The Eighties in America.
1980
Best Picture: Ordinary People*
Best Actor: Robert De Niro, Raging Bull*
Best Actress: Sissy Spacek, Coal Miner's Daughter
Best Supporting Actor: Timothy Hutton, Ordinary People*
Best Supporting Actress: Mary Steenburgen, Melvin and Howard
Best Director: Robert Redford, Ordinary People*
Best Original Screenplay: Bo Goldman, Melvin and Howard
Best Adapted Screenplay: Alvin Sargent, Ordinary People*
Best Cinematography: Geoffrey Unsworth and Ghislain Cloquet, Tess

1981
Best Picture: Chariots of Fire
Best Actor: Henry Fonda, On Golden Pond*
Best Actress: Katharine Hepburn, On Golden Pond*
Best Supporting Actor: John Gielgud, Arthur
Best Supporting Actress: Maureen Stapleton, Reds
Best Director: Warren Beatty, Reds
Best Original Screenplay: Colin Welland, Chariots of Fire
Best Adapted Screenplay: Ernest Thompson, On Golden Pond*
Best Cinematography: Vittorio Storaro, Reds

1982
Best Picture: Gandhi
Best Actor: Ben Kingsley, Gandhi
Best Actress: Meryl Streep*, Sophie's Choice
Best Supporting Actor: Louis Gossett, Jr., An Officer and a Gentleman
Best Supporting Actress: Jessica Lange, Tootsie
Best Director: Richard Attenborough, Gandhi
Best Original Screenplay: John Briley, Gandhi
Best Adapted Screenplay: Costa-Gavras and Donald Stewart, Missing
Best Cinematography: Billy Williams and Ronnie Taylor, Gandhi

1983
Best Picture: Terms of Endearment*
Best Actor: Robert Duvall, Tender Mercies
Best Actress: Shirley MacLaine, Terms of Endearment*
Best Supporting Actor: Jack Nicholson*, Terms of Endearment*
Best Supporting Actress: Linda Hunt, The Year of Living Dangerously
Best Director: James L. Brooks, Terms of Endearment*
Best Original Screenplay: Horton Foote, Tender Mercies
Best Adapted Screenplay: James L. Brooks, Terms of Endearment*
Best Cinematography: Sven Nykvist, Fanny and Alexander

1984
Best Picture: Amadeus
Best Actor: F. Murray Abraham, Amadeus
Best Actress: Sally Field, Places in the Heart
Best Supporting Actor: Haing S. Ngor, The Killing Fields
Best Supporting Actress: Peggy Ashcroft, A Passage to India
Best Director: Milos Forman, Amadeus
Best Original Screenplay: Robert Benton, Places in the Heart
Best Adapted Screenplay: Peter Shaffer, Amadeus
Best Cinematography: Chris Menges, The Killing Fields

1985
Best Picture: Out of Africa
Best Actor: William Hurt*, Kiss of the Spider Woman*
Best Actress: Geraldine Page, The Trip to Bountiful
Best Supporting Actor: Don Ameche, Cocoon
Best Supporting Actress: Anjelica Huston, Prizzi's Honor
Best Director: Sydney Pollack, Out of Africa
Best Original Screenplay: Earl W. Wallace, William Kelley, and Pamela Wallace, Witness
Best Adapted Screenplay: Kurt Luedtke, Out of Africa
Best Cinematography: David Watkin, Out of Africa
1986
Best Picture: Platoon*
Best Actor: Paul Newman, The Color of Money
Best Actress: Marlee Matlin, Children of a Lesser God
Best Supporting Actor: Michael Caine, Hannah and Her Sisters
Best Supporting Actress: Dianne Wiest, Hannah and Her Sisters
Best Director: Oliver Stone, Platoon*
Best Original Screenplay: Woody Allen, Hannah and Her Sisters
Best Adapted Screenplay: Ruth Prawer Jhabvala, A Room with a View
Best Cinematography: Chris Menges, The Mission

1987
Best Picture: The Last Emperor
Best Actor: Michael Douglas, Wall Street*
Best Actress: Cher*, Moonstruck
Best Supporting Actor: Sean Connery, The Untouchables
Best Supporting Actress: Olympia Dukakis, Moonstruck
Best Director: Bernardo Bertolucci, The Last Emperor
Best Original Screenplay: John Patrick Shanley, Moonstruck
Best Adapted Screenplay: Mark Peploe and Bernardo Bertolucci, The Last Emperor
Best Cinematography: Vittorio Storaro, The Last Emperor

1988
Best Picture: Rain Man
Best Actor: Dustin Hoffman*, Rain Man
Best Actress: Jodie Foster, The Accused
Best Supporting Actor: Kevin Kline, A Fish Called Wanda
Best Supporting Actress: Geena Davis, The Accidental Tourist
Best Director: Barry Levinson, Rain Man
Best Original Screenplay: Ronald Bass and Barry Morrow, Rain Man
Best Adapted Screenplay: Christopher Hampton, Dangerous Liaisons
Best Cinematography: Peter Biziou, Mississippi Burning

1989
Best Picture: Driving Miss Daisy
Best Actor: Daniel Day-Lewis, My Left Foot
Best Actress: Jessica Tandy, Driving Miss Daisy
Best Supporting Actor: Denzel Washington, Glory
Best Supporting Actress: Brenda Fricker, My Left Foot
Best Director: Oliver Stone, Born on the Fourth of July
Best Original Screenplay: Tom Schulman, Dead Poets Society
Best Adapted Screenplay: Alfred Uhry, Driving Miss Daisy
Best Cinematography: Freddie Francis, Glory
■ Entertainment: Major Broadway Plays and Awards

This list contains all Broadway plays that ran for at least one full month between January 1, 1980, and December 31, 1989, and that had total runs of at least two hundred performances. It also includes plays with shorter runs that received major awards. An asterisk (*) next to a title or personage indicates that a full essay exists on the topic within The Eighties in America.
Plays Opening in 1979

Evita (opened September 25, 1979) 1,567 performances
1980 Tony Awards: Best Musical, Robert Stigwood (producer); Best Book of a Musical, Tim Rice; Best Original Score, Andrew Lloyd Webber and Tim Rice; Best Actress in a Musical, Patti LuPone; Best Featured Actor in a Musical, Mandy Patinkin; Best Direction of a Musical, Harold Prince
1980 New York Drama Critics Circle Award: Best Musical, Andrew Lloyd Webber and Tim Rice

Sugar Babies (opened October 8, 1979) 1,208 performances

Romantic Comedy (opened November 8, 1979) 396 performances

Strider (opened November 14, 1979) 214 performances

Bent (opened December 2, 1979) 241 performances

Oklahoma! (opened December 13, 1979) 293 performances (revival)

Plays Opening in 1980

Betrayal (opened January 5, 1980) 170 performances
1980 New York Drama Critics Circle Award: Best Foreign Play, Harold Pinter (playwright)

West Side Story (opened February 14, 1980) 333 performances (revival)

Talley's Folly (opened February 20, 1980) 286 performances
1980 Pulitzer Prize: Lanford Wilson (playwright)
1980 New York Drama Critics Circle Award: Best Play, Lanford Wilson

Children of a Lesser God (opened March 30, 1980) 887 performances
1980 Tony Awards: Best Play, Emanuel Azenberg, The Shubert Organization, Dasha Epstein, and Ron Dante (producers); Best Actor in a Play, John Rubinstein; Best Actress in a Play, Phyllis Frelich

I Ought to Be in Pictures (opened April 3, 1980) 324 performances
1980 Tony Awards: Best Featured Actress in a Play, Dinah Manoff

Morning's at Seven (opened April 10, 1980) 564 performances (revival)
1980 Tony Awards: Best Featured Actor in a Play, David Rounds; Best Direction of a Play, Vivian Matalon; Best Reproduction of a Play or Musical, Elizabeth I. McCann, Nelle Nugent, and Ray Larson (producers)

Barnum (opened April 30, 1980) 854 performances
1980 Tony Awards: Best Actor in a Musical, Jim Dale

A Day in Hollywood/A Night in the Ukraine (opened May 1, 1980) 588 performances
1980 Tony Awards: Best Featured Actress in a Musical, Priscilla Lopez; Best Choreography, Tommy Tune and Thommie Walsh

Home (opened May 7, 1980) 278 performances

42nd Street (opened August 25, 1980) 3,486 performances
1981 Tony Awards: Best Musical, David Merrick (producer); Best Choreography, Gower Champion

Fifth of July (opened November 5, 1980) 511 performances
1981 Tony Awards: Best Featured Actress in a Play, Swoosie Kurtz

Lunch Hour (opened November 12, 1980) 262 performances
A Lesson from Aloes (opened November 17, 1980) 96 performances
1981 New York Drama Critics Circle Award: Best Play, Athol Fugard (playwright)

Amadeus (opened December 17, 1980) 1,181 performances
1981 Tony Awards: Best Play, Peter Shaffer (playwright) and The Shubert Organization, Elizabeth I. McCann, Nelle Nugent, and Roger S. Berlind (producers); Best Actor in a Play, Ian McKellen; Best Direction of a Play, Peter Hall
1981 New York Drama Critics Circle Award: Best Play Runner-Up, Peter Shaffer

Plays Opening in 1981

The Pirates of Penzance (opened January 8, 1981) 787 performances (revival)
1981 Tony Awards: Best Actor in a Musical, Kevin Kline; Best Direction of a Musical, Wilford Leach; Best Reproduction of a Play or Musical, Joseph Papp (producer)
1981 New York Drama Critics Circle Award: Special Citation

Piaf (opened February 5, 1981) 165 performances
1981 Tony Awards: Best Actress in a Play, Jane Lapotaire

Sophisticated Ladies (opened March 1, 1981) 767 performances
1981 Tony Awards: Best Featured Actor in a Musical, Hinton Battle

Woman of the Year (opened March 29, 1981) 770 performances
1981 Tony Awards: Best Book of a Musical, Peter Stone; Best Original Score, John Kander and Fred Ebb; Best Actress in a Musical, Lauren Bacall; Best Featured Actress in a Musical, Marilyn Cooper

The Floating Light Bulb (opened April 27, 1981) 62 performances
1981 Tony Awards: Best Featured Actor in a Play, Brian Backer

Lena Horne: "The Lady and Her Music" (opened May 12, 1981) 333 performances
1981 New York Drama Critics Circle Award: Special Citation

The Life and Adventures of Nicholas Nickleby (opened October 4, 1981) 49 performances
1982 Tony Awards: Best Play, David Edgar (adapter) and James M. Nederlander, The Shubert Organization, Elizabeth I. McCann, and Nelle Nugent (producers); Best Actor in a Play, Roger Rees; Best Direction of a Play, Trevor Nunn
1982 New York Drama Critics Circle Award: Best Play, David Edgar

Crimes of the Heart (opened November 4, 1981) 535 performances
1981 Pulitzer Prize: Beth Henley* (playwright)
1981 New York Drama Critics Circle Award: Best American Play, Beth Henley*

Mass Appeal (opened November 12, 1981) 212 performances

The Dresser (opened November 19, 1981) 200 performances

Dreamgirls (opened December 20, 1981) 1,521 performances
1982 Tony Awards: Best Book of a Musical, Tom Eyen; Best Actor in a Musical, Ben Harney; Best Actress in a Musical, Jennifer Holliday; Best Featured Actor in a Musical, Cleavant Derricks; Best Choreography, Michael Bennett and Michael Peters

Plays Opening in 1982

Joseph and the Amazing Technicolor Dreamcoat (opened January 27, 1982) 747 performances

Pump Boys and Dinettes (opened February 4, 1982) 573 performances

Agnes of God (opened March 30, 1982) 599 performances
1982 Tony Awards: Best Featured Actress in a Play, Amanda Plummer

Medea (opened May 2, 1982) 65 performances
1982 Tony Awards: Best Actress in a Play, Zoe Caldwell

"MASTER HAROLD" . . . and the Boys (opened May 4, 1982) 344 performances
1982 Tony Awards: Best Featured Actor in a Play, Zakes Mokae
1982 New York Drama Critics Circle Award: Best Play Runner-Up, Athol Fugard (playwright)

Nine (opened May 9, 1982) 729 performances
1982 Tony Awards: Best Musical, Michel Stuart, Harvey J. Klaris, Roger S. Berlind, James M. Nederlander, Francie LeFrak, and Kenneth D. Greenblatt (producers); Best Original Score, Maury Yeston; Best Featured Actress in a Musical, Liliane Montevecchi; Best Direction of a Musical, Tommy Tune

Torch Song Trilogy* (opened June 10, 1982) 1,222 performances
1983 Tony Awards: Best Play, Harvey Fierstein (playwright) and Kenneth Waissman, Martin Markinson, Lawrence Lane, John Glines, BetMar, and Donald Tick (producers); Best Actor in a Play, Harvey Fierstein
1982 New York Drama Critics Circle Award: Best American Play Runner-Up, Harvey Fierstein

Little Shop of Horrors (opened July 27, 1982) 2,209 performances
1983 New York Drama Critics Circle Award: Best Musical, Alan Menken (music) and Howard Ashman (book and lyrics)

Cats* (opened October 7, 1982) 7,485 performances
1983 Tony Awards: Best Musical, Cameron Mackintosh, The Really Useful Theatre Company, David Geffen, and the Shubert Organization (producers); Best Book of a Musical, T. S. Eliot; Best Original Score, Andrew Lloyd Webber and T. S. Eliot; Best Featured Actress in a Musical, Betty Buckley; Best Direction of a Musical, Trevor Nunn

Foxfire (opened November 11, 1982) 213 performances
1983 Tony Awards: Best Actress in a Play, Jessica Tandy

Steaming (opened December 12, 1982) 65 performances
1983 Tony Awards: Best Featured Actress in a Play, Judith Ivey
Plays Opening in 1983

Plenty (opened January 6, 1983) 92 performances
1983 New York Drama Critics Circle Award: Best Foreign Play, David Hare (playwright)
On Your Toes (opened March 6, 1983) 505 performances (revival)
1983 Tony Awards: Best Actress in a Musical, Natalia Makarova; Best Reproduction of a Play or Musical, Alfred de Liagre, Jr., Roger L. Stevens, John Mauceri, Donald R. Seawall, and André Pastoria (producers)

Brighton Beach Memoirs (opened March 27, 1983) 1,299 performances
1983 Tony Awards: Best Featured Actor in a Play, Matthew Broderick*; Best Direction of a Play, Gene Saks
1983 New York Drama Critics Circle Award: Best Play, Neil Simon (playwright)

'night, Mother (opened March 31, 1983) 380 performances
1983 Pulitzer Prize: Marsha Norman (playwright)
1983 New York Drama Critics Circle Award: Best Play Runner-Up, Marsha Norman

You Can't Take It With You (opened April 4, 1983) 312 performances (revival)

My One and Only (opened May 1, 1983) 767 performances
1983 Tony Awards: Best Actor in a Musical, Tommy Tune; Best Featured Actor in a Musical, Charles Honi Coles; Best Choreography, Tommy Tune and Thommie Walsh

The Caine Mutiny Court-Martial (opened May 5, 1983) 216 performances (revival)

La Cage aux Folles (opened August 21, 1983) 1,761 performances
1984 Tony Awards: Best Musical, Allan Carr, Kenneth D. Greenblatt, Marvin A. Krauss, Stewart F. Lane, James M. Nederlander, Martin Richards, Barry Brown, and Fritz Holt (producers); Best Book of a Musical, Harvey Fierstein; Best Original Score, Jerry Herman; Best Actor in a Musical, George Hearn; Best Direction of a Musical, Arthur Laurents

Zorba (opened October 16, 1983) 362 performances (revival)
1984 Tony Awards: Best Featured Actress in a Musical, Lila Kedrova
Baby (opened December 4, 1983) 241 performances

Noises Off (opened December 11, 1983) 553 performances

The Tap Dance Kid (opened December 21, 1983) 669 performances
1984 Tony Awards: Best Featured Actor in a Musical, Hinton Battle; Best Choreography, Danny Daniels
Plays Opening in 1984

The Real Thing (opened January 5, 1984) 566 performances
1984 Tony Awards: Best Play, Tom Stoppard (playwright) and Emanuel Azenberg, The Shubert Organization, Icarus Productions, Byron Goldman, Ivan Bloch, Roger S. Berlind, and Michael Codron (producers); Best Actor in a Play, Jeremy Irons; Best Actress in a Play, Glenn Close*; Best Featured Actress in a Play, Christine Baranski; Best Direction of a Play, Mike Nichols
1984 New York Drama Critics Circle Award: Best Play, Tom Stoppard

The Rink (opened February 9, 1984) 204 performances
1984 Tony Awards: Best Actress in a Musical, Chita Rivera

Glengarry Glen Ross (opened March 25, 1984) 378 performances
1984 Tony Awards: Best Featured Actor in a Play, Joe Mantegna
1984 Pulitzer Prize: David Mamet* (playwright)
1984 New York Drama Critics Circle Award: Best American Play, David Mamet*

Sunday in the Park with George (opened May 2, 1984) 604 performances
1985 Pulitzer Prize: Stephen Sondheim and James Lapine
1984 New York Drama Critics Circle Award: Best Musical, Stephen Sondheim (music and lyrics) and James Lapine (book)

Design for Living (opened June 20, 1984) 245 performances (revival)
Hurlyburly (opened August 7, 1984) 343 performances
1985 Tony Awards: Best Featured Actress in a Play, Judith Ivey

Ma Rainey's Black Bottom (opened October 11, 1984) 276 performances
1985 New York Drama Critics Circle Award: Best Play, August Wilson* (playwright)

Much Ado About Nothing (opened October 14, 1984) 53 performances (revival)
1985 Tony Awards: Best Actor in a Play, Derek Jacobi
Plays Opening in 1985

The Odd Couple (opened January 11, 1985) 295 performances (revival)

Joe Egg (opened March 27, 1985) 93 performances (revival)
1985 Tony Awards: Best Actress in a Play, Stockard Channing; Best Reproduction of a Play or Musical, The Shubert Organization, Emanuel Azenberg, Roger S. Berlind, Ivan Bloch, and MTM Enterprises, Inc. (producers)

Biloxi Blues (opened March 28, 1985) 524 performances
1985 Tony Awards: Best Play, Neil Simon (playwright) and Emanuel Azenberg and the Center Theater Group/Ahmanson Theater (producers); Best Featured Actor in a Play, Barry Miller; Best Direction of a Play, Gene Saks
1985 New York Drama Critics Circle Award: Best Play Runner-Up, Neil Simon

Grind (opened April 16, 1985) 71 performances
1985 Tony Awards: Best Featured Actress in a Musical, Leilani Jones

Big River (opened April 25, 1985) 1,005 performances
1985 Tony Awards: Best Musical, Rocco Landesman, Heidi Landesman, Rick Steiner, M. Anthony Fisher, and Dodger Theatricals (producers); Best Book of a Musical, William Hauptman; Best Original Score, Roger Miller; Best Featured Actor in a Musical, Ron Richardson; Best Direction of a Musical, Des McAnuff
As Is (opened May 1, 1985) 285 performances

Doubles (opened May 8, 1985) 277 performances

Singin' in the Rain (opened July 2, 1985) 367 performances

Song and Dance (opened September 18, 1985) 474 performances
1986 Tony Awards: Best Actress in a Musical, Bernadette Peters

The Search for Signs of Intelligent Life in the Universe (opened September 26, 1985) 391 performances
1986 Tony Awards: Best Actress in a Play, Lily Tomlin
1986 New York Drama Critics Circle Award: Special Citation, Lily Tomlin and Jane Wagner

I'm Not Rappaport (opened November 19, 1985) 891 performances
1986 Tony Awards: Best Play, Herb Gardner (playwright) and James Walsh, Lewis Allen, and Martin Heinfling (producers); Best Actor in a Play, Judd Hirsch

The Mystery of Edwin Drood (opened December 2, 1985) 608 performances
1986 Tony Awards: Best Musical, Joseph Papp (producer); Best Book of a Musical, Rupert Holmes; Best Original Score, Rupert Holmes; Best Actor in a Musical, George Rose; Best Direction of a Musical, Wilford Leach

Benefactors (opened December 22, 1985) 217 performances
1986 New York Drama Critics Circle Award: Best Foreign Play, Michael Frayn (playwright)
Plays Opening in 1986

Big Deal (opened April 10, 1986) 69 performances
1986 Tony Awards: Best Choreography, Bob Fosse

Social Security (opened April 17, 1986) 388 performances

Sweet Charity (opened April 27, 1986) 369 performances (revival)
1986 Tony Awards: Best Featured Actor in a Musical, Michael Rupert; Best Featured Actress in a Musical, Bebe Neuwirth; Best Reproduction of a Play or a Musical, Jerry Minskoff, James M. Nederlander, Arthur Rubin, and Joseph Harris (producers)

The House of Blue Leaves (opened April 29, 1986) 398 performances
1986 Tony Awards: Best Featured Actor in a Play, John Mahoney; Best Featured Actress in a Play, Swoosie Kurtz; Best Direction of a Play, Jerry Zaks

Arsenic and Old Lace (opened June 26, 1986) 221 performances (revival)

Me and My Girl (opened August 10, 1986) 1,420 performances
1987 Tony Awards: Best Actor in a Musical, Robert Lindsay; Best Actress in a Musical, Maryann Plunkett; Best Choreography, Gillian Gregory
1987 New York Drama Critics Circle Award: Best Musical Runner-Up, Noel Gay (music), Douglas Furber (book and lyrics), and L. Arthur Rose (book and lyrics)

Broadway Bound (opened December 4, 1986) 756 performances
1987 Tony Awards: Best Actress in a Play, Linda Lavin; Best Featured Actor in a Play, John Randolph

Jackie Mason's The World According to Me! (opened December 22, 1986) 367 performances
1987 Tony Awards: Special Award, Jackie Mason
Plays Opening in 1987

The Nerd (opened March 2, 1987) 441 performances

Coastal Disturbances (opened March 4, 1987) 350 performances

Les Misérables (opened March 12, 1987) 6,680 performances
1987 Tony Awards: Best Musical, Cameron Mackintosh (producer); Best Book of a Musical, Alain Boublil and Claude-Michel Schönberg; Best Original Score, Claude-Michel Schönberg, Herbert Kretzmer, and Alain Boublil; Best Featured Actor in a Musical, Michael Maguire; Best Featured Actress in a Musical, Frances Ruffelle; Best Direction of a Musical, Trevor Nunn and John Caird
1987 New York Drama Critics Circle Award: Best Musical, Claude-Michel Schönberg, Alain Boublil, and Herbert Kretzmer

Starlight Express (opened March 15, 1987) 761 performances

Fences (opened March 26, 1987) 525 performances
1987 Tony Awards: Best Play, August Wilson* (playwright) and Carole Shorenstein Hays and Yale Repertory Theatre (producers); Best Actor in a Play, James Earl Jones; Best Featured Actress in a Play, Mary Alice; Best Direction of a Play, Lloyd Richards
1987 Pulitzer Prize: August Wilson*
1987 New York Drama Critics Circle Award: Best Play, August Wilson*

Les Liaisons Dangereuses (opened April 30, 1987) 149 performances
1987 New York Drama Critics Circle Award: Best Foreign Play, Christopher Hampton (playwright)

Burn This (opened October 14, 1987) 437 performances
1988 Tony Awards: Best Actress in a Play, Joan Allen

Anything Goes (opened October 19, 1987) 784 performances (revival)
1988 Tony Awards: Best Featured Actor in a Musical, Bill McCutcheon; Best Choreography, Michael Smuin; Best Revival, Lincoln Center Theater (producer)

Cabaret (opened October 22, 1987) 261 performances (revival)

Into the Woods (opened November 5, 1987) 765 performances
1988 Tony Awards: Best Book of a Musical, James Lapine; Best Original Score, Stephen Sondheim; Best Actress in a Musical, Joanna Gleason
1988 New York Drama Critics Circle Award: Best Musical, Stephen Sondheim and James Lapine
Plays Opening in 1988

The Phantom of the Opera* (opened January 26, 1988) 8,213 performances as of October 7, 2007
1988 Tony Awards: Best Musical, Cameron Mackintosh and the Really Useful Theatre Company (producers); Best Actor in a Musical, Michael Crawford; Best Featured Actress in a Musical, Judy Kaye; Best Direction of a Musical, Harold Prince
1988 New York Drama Critics Circle Award: Best Musical Runner-Up, Andrew Lloyd Webber, Charles Hart, and Richard Stilgoe

Sarafina! (opened January 28, 1988) 597 performances

M. Butterfly (opened March 20, 1988) 777 performances
1988 Tony Awards: Best Play, David Henry Hwang* (playwright) and Stuart Ostrow and David Geffen (producers); Best Featured Actor in a Play, B. D. Wong; Best Direction of a Play, John Dexter
1988 New York Drama Critics Circle Award: Best Play Runner-Up, David Henry Hwang*

Joe Turner's Come and Gone (opened March 27, 1988) 105 performances
1988 Tony Awards: Best Featured Actress in a Play, L. Scott Caldwell
1988 New York Drama Critics Circle Award: Best Play, August Wilson* (playwright)

Romance/Romance (opened May 1, 1988) 297 performances

Jackie Mason's The World According to Me! (opened May 2, 1988) 206 performances (return engagement)

Speed-the-Plow (opened May 3, 1988) 279 performances
1988 Tony Awards: Best Actor in a Play, Ron Silver

Rumors (opened November 17, 1988) 535 performances
1989 Tony Awards: Best Featured Actress in a Play, Christine Baranski
Plays Opening in 1989

Black and Blue (opened January 26, 1989) 829 performances
1989 Tony Awards: Best Actress in a Musical, Ruth Brown; Best Choreography, Cholly Atkins, Henry LeTang, Frankie Manning, and Fayard Nicholas

Shirley Valentine (opened February 16, 1989) 324 performances
1989 Tony Awards: Best Actress in a Play, Pauline Collins

Jerome Robbins' Broadway (opened February 26, 1989) 633 performances
1989 Tony Awards: Best Musical, The Shubert Organization, Roger S. Berlind, Suntory International Corporation, Byron Goldman, and Emanuel Azenberg (producers); Best Actor in a Musical, Jason Alexander; Best Featured Actor in a Musical, Scott Wise; Best Featured Actress in a Musical, Debbie Shapiro; Best Direction of a Musical, Jerome Robbins

Lend Me a Tenor (opened March 2, 1989) 476 performances
1989 Tony Awards: Best Actor in a Play, Philip Bosco; Best Direction of a Play, Jerry Zaks

The Heidi Chronicles* (opened March 9, 1989) 622 performances
1989 Tony Awards: Best Play, Wendy Wasserstein (playwright) and The Shubert Organization and Playwrights Horizons (producers); Best Featured Actor in a Play, Boyd Gaines
1989 Pulitzer Prize: Wendy Wasserstein
1989 New York Drama Critics Circle Award: Best Play, Wendy Wasserstein

Aristocrats (opened April 11, 1989) 189 performances
1989 New York Drama Critics Circle Award: Best Foreign Play, Brian Friel (playwright)

Largely New York (opened May 1, 1989) 144 performances
1989 New York Drama Critics Circle Award: Special Citation, Bill Irwin

Meet Me in St. Louis (opened November 2, 1989) 253 performances

Grand Hotel (opened November 12, 1989) 1,018 performances

A Few Good Men (opened November 15, 1989) 497 performances

Gypsy (opened November 16, 1989) 476 performances (revival)

The Circle (opened November 20, 1989) 208 performances (revival)

City of Angels (opened December 12, 1989) 878 performances

Tru (opened December 19, 1989) 295 performances
■ Entertainment: Most-Watched U.S. Television Shows

This list shows the top-ten U.S. television programs of each September-April season, as ranked by Nielsen Media Research. The rating following each program indicates the average percentage of American homes with televisions watching that show. For example, during the 1980-1981 season, 34.5 percent of all American homes with a television watched Dallas on the evenings that it was broadcast. Titles followed by an asterisk (*) indicate that the program has its own full-length essay within The Eighties in America.
1980-1981
1. Dallas* (CBS) 34.5
2. The Dukes of Hazzard (CBS) 27.3
3. 60 Minutes (CBS) 27.0
4. M*A*S*H (CBS) 25.7
5. The Love Boat (ABC) 24.3
6. The Jeffersons (CBS) 23.5
7. Alice (CBS) 22.9
8. House Calls (CBS) 22.4
   Three's Company (ABC) 22.4
10. Little House on the Prairie (NBC) 22.1

1981-1982
1. Dallas* (CBS) 28.4
2. 60 Minutes (CBS) 27.4
3. The Jeffersons (CBS) 23.4
4. Three's Company (ABC) 23.3
5. Alice (CBS) 22.7
6. The Dukes of Hazzard (CBS) 22.6
   Too Close for Comfort (ABC) 22.6
8. ABC Monday Night Movie (ABC) 22.5
9. M*A*S*H (CBS) 22.3
10. One Day at a Time (CBS) 22.0

1982-1983
1. 60 Minutes (CBS) 25.5
2. Dallas* (CBS) 24.6
3. M*A*S*H (CBS) 22.6
   Magnum, P.I.* (CBS) 22.6
5. Dynasty* (ABC) 22.4
6. Three's Company (ABC) 21.2
7. Simon and Simon (CBS) 21.0
8. Falcon Crest (CBS) 20.7
9. The Love Boat (ABC) 20.3
10. The A-Team (NBC) 20.1
    Monday Night Football (ABC) 20.1

1983-1984
1. Dallas* (CBS) 25.7
2. 60 Minutes (CBS) 24.2
3. Dynasty* (ABC) 24.1
4. The A-Team (NBC) 24.0
5. Simon and Simon (CBS) 23.8
6. Magnum, P.I.* (CBS) 22.4
7. Falcon Crest (CBS) 22.0
8. Kate and Allie (CBS) 21.9
9. Hotel (ABC) 21.1
10. Cagney and Lacey* (CBS) 20.9

1984-1985
1. Dynasty* (ABC) 25.0
2. Dallas* (CBS) 24.7
3. The Cosby Show* (NBC) 24.2
4. 60 Minutes (CBS) 22.2
5. Family Ties* (NBC) 22.1
6. The A-Team (NBC) 21.9
7. Simon and Simon (CBS) 21.8
8. Murder, She Wrote (CBS) 20.1
9. Knots Landing (CBS) 20.0
10. Falcon Crest (CBS) 19.9
    Crazy Like a Fox (CBS) 19.9

1985-1986
1. The Cosby Show* (NBC) 33.7
2. Family Ties* (NBC) 30.0
3. Murder, She Wrote (CBS) 25.3
4. 60 Minutes (CBS) 23.9
5. Cheers* (NBC) 23.7
6. Dallas* (CBS) 21.9
7. Dynasty* (ABC) 21.8
   The Golden Girls* (NBC) 21.8
9. Miami Vice* (NBC) 21.3
10. Who's the Boss? (ABC) 21.1

1986-1987
1. The Cosby Show* (NBC) 34.9
2. Family Ties* (NBC) 32.7
3. Cheers* (NBC) 27.2
4. Murder, She Wrote (CBS) 25.4
5. The Golden Girls* (NBC) 24.5
6. 60 Minutes (CBS) 23.3
7. Night Court (NBC) 23.2
8. Growing Pains (ABC) 22.7
9. Moonlighting* (ABC) 22.4
10. Who's the Boss? (ABC) 22.0

1987-1988
1. The Cosby Show* (NBC) 27.8
2. A Different World (NBC) 25.0
3. Cheers* (NBC) 23.4
4. The Golden Girls* (NBC) 21.8
5. Growing Pains (ABC) 21.3
6. Who's the Boss? (ABC) 21.2
7. Night Court (NBC) 20.8
8. 60 Minutes (CBS) 20.6
9. Murder, She Wrote (CBS) 20.2
10. Alf (NBC) 18.8
    The Wonder Years* (ABC) 18.8

1988-1989
1. The Cosby Show* (NBC) 25.6
2. Roseanne (ABC) 23.8
3. A Different World (NBC) 23.0
4. Cheers* (NBC) 22.3
5. 60 Minutes (CBS) 21.7
6. The Golden Girls* (NBC) 21.4
7. Who's the Boss? (ABC) 20.8
8. Murder, She Wrote (CBS) 19.9
9. Empty Nest (NBC) 19.2
10. Anything but Love (ABC) 19.0

1989-1990
1. The Cosby Show* (NBC) 23.1
   Roseanne (ABC) 23.1
3. Cheers* (NBC) 22.7
4. A Different World (NBC) 21.1
5. America's Funniest Home Videos (ABC) 20.9
6. The Golden Girls* (NBC) 20.1
7. 60 Minutes (CBS) 19.7
8. The Wonder Years* (ABC) 19.2
9. Empty Nest (NBC) 18.9
10. Monday Night Football (ABC) 18.1
■ Entertainment: Emmy Awards
The categories and titles of the Emmy Awards changed almost every year. This list contains a selection of the television awards generally considered to be the most important. Programs followed by an asterisk (*) are the subject of their own full-length essay within The Eighties in America.
1980-1981
Outstanding Drama Series: Hill Street Blues* (NBC)
Outstanding Comedy Series: Taxi (ABC)
Outstanding Limited Series: Shogun (NBC)
Outstanding Drama Special: Playing for Time (CBS)
Outstanding Variety, Music, or Comedy Program: Lily: Sold Out (CBS)
Outstanding Lead Actor in a Drama Series: Daniel J. Travanti, Hill Street Blues* (NBC)
Outstanding Lead Actress in a Drama Series: Barbara Babcock, Hill Street Blues* (NBC)
Outstanding Lead Actor in a Comedy Series: Judd Hirsch, Taxi (ABC)
Outstanding Lead Actress in a Comedy Series: Isabel Sanford, The Jeffersons (CBS)
Outstanding Supporting Actor in a Drama Series: Michael Conrad, Hill Street Blues* (NBC)
Outstanding Supporting Actress in a Drama Series: Nancy Marchand, Lou Grant (CBS)
Outstanding Supporting Actor in a Comedy or Variety or Music Series: Danny DeVito, Taxi (ABC)
Outstanding Supporting Actress in a Comedy or Variety or Music Series: Eileen Brennan, Private Benjamin (CBS)
Outstanding Directing in a Drama Series: Robert Butler, Hill Street Blues* (NBC)
Outstanding Directing in a Comedy Series: James Burrows, Taxi (ABC)
Outstanding Directing in a Variety, Music, or Comedy Program: Don Mischer, The Kennedy Center Honors: A National Celebration (CBS)
1981-1982
Outstanding Drama Series: Hill Street Blues* (NBC)
Outstanding Comedy Series: Barney Miller (ABC)
Outstanding Limited Series: Marco Polo (NBC)
Outstanding Drama Special: A Woman Called Golda (syndicated)
Outstanding Variety, Music, or Comedy Program: Night of 100 Stars (ABC)
Outstanding Lead Actor in a Drama Series: Daniel J. Travanti, Hill Street Blues* (NBC)
Outstanding Lead Actress in a Drama Series: Michael Learned, Nurse (CBS)
Outstanding Lead Actor in a Comedy Series: Alan Alda, M*A*S*H (CBS)
Outstanding Lead Actress in a Comedy Series: Carol Kane, Taxi (ABC)
Outstanding Supporting Actor in a Drama Series: Michael Conrad, Hill Street Blues* (NBC)
Outstanding Supporting Actress in a Drama Series: Nancy Marchand, Lou Grant (CBS)
Outstanding Supporting Actor in a Comedy or Variety or Music Series: Christopher Lloyd, Taxi (ABC)
Outstanding Supporting Actress in a Comedy or Variety or Music Series: Loretta Swit, M*A*S*H (CBS)
Outstanding Directing in a Drama Series: Harry Harris, Fame (NBC)
Outstanding Directing in a Comedy Series: Alan Rafkin, One Day at a Time (CBS)
Outstanding Directing in a Variety or Music Program: Dwight Hemion, Goldie and Kids . . . Listen to Us (ABC)
1982-1983
Outstanding Drama Series: Hill Street Blues* (NBC)
Outstanding Comedy Series: Cheers* (NBC)
Outstanding Limited Series: Nicholas Nickleby (syndicated)
Outstanding Drama Special: Special Bulletin (NBC)
Outstanding Variety, Music, or Comedy Program: Motown 25: Yesterday, Today, Forever (NBC)
Outstanding Lead Actor in a Drama Series: Ed Flanders, St. Elsewhere* (NBC)
Outstanding Lead Actress in a Drama Series: Tyne Daly, Cagney and Lacey* (CBS)
Outstanding Lead Actor in a Comedy Series: Judd Hirsch, Taxi (ABC)
Outstanding Lead Actress in a Comedy Series: Shelley Long, Cheers* (NBC)
Outstanding Supporting Actor in a Drama Series: James Coco, St. Elsewhere* (NBC)
Outstanding Supporting Actress in a Drama Series: Doris Roberts, St. Elsewhere* (NBC)
Outstanding Supporting Actor in a Comedy, Variety, or Music Series: Christopher Lloyd, Taxi (ABC)
Outstanding Supporting Actress in a Comedy, Variety, or Music Series: Carol Kane, Taxi (ABC)
Outstanding Directing in a Drama Series: Jeff Bleckner, Hill Street Blues* (NBC)
Outstanding Directing in a Comedy Series: James Burrows, Cheers* (NBC)
Outstanding Directing in a Variety or Music Program: Dwight Hemion, Sheena Easton . . . Act One (NBC)
1983-1984
Outstanding Drama Series: Hill Street Blues* (NBC)
Outstanding Comedy Series: Cheers* (NBC)
Outstanding Limited Series: Concealed Enemies: American Playhouse (PBS)
Outstanding Drama/Comedy Special: Something About Amelia: An ABC Theatre Presentation (ABC)
Outstanding Variety, Music, or Comedy Program: The Sixth Annual Kennedy Center Honors: A Celebration of the Performing Arts (CBS)
Outstanding Lead Actor in a Drama Series: Tom Selleck, Magnum, P.I.* (CBS)
Outstanding Lead Actress in a Drama Series: Tyne Daly, Cagney and Lacey* (CBS)
Outstanding Lead Actor in a Comedy Series: John Ritter, Three's Company (ABC)
Outstanding Lead Actress in a Comedy Series: Jane Curtin, Kate and Allie (CBS)
Outstanding Supporting Actor in a Drama Series: Bruce Weitz, Hill Street Blues* (NBC)
Outstanding Supporting Actress in a Drama Series: Alfre Woodard, Hill Street Blues* (NBC)
Outstanding Supporting Actor in a Comedy Series: Pat Harrington, One Day at a Time (CBS)
Outstanding Supporting Actress in a Comedy Series: Rhea Perlman, Cheers* (NBC)
Outstanding Directing in a Drama Series: Corey Allen, Hill Street Blues* (NBC)
Outstanding Directing in a Comedy Series: Bill Persky, Kate and Allie (CBS)
Outstanding Directing in a Variety or Music Program: Dwight Hemion, Here's Television Entertainment (NBC)
1984-1985
Outstanding Drama Series: Cagney and Lacey* (CBS)
Outstanding Comedy Series: The Cosby Show* (NBC)
Outstanding Limited Series: The Jewel in the Crown: Masterpiece Theatre (PBS)
Outstanding Drama/Comedy Special: Do You Remember Love (CBS)
Outstanding Variety, Music, or Comedy Program: Motown Returns to the Apollo (NBC)
Outstanding Lead Actor in a Drama Series: William Daniels, St. Elsewhere* (NBC)
Outstanding Lead Actress in a Drama Series: Tyne Daly, Cagney and Lacey* (CBS)
Outstanding Lead Actor in a Comedy Series: Robert Guillaume, Benson (ABC)
Outstanding Lead Actress in a Comedy Series: Jane Curtin, Kate and Allie (CBS)
Outstanding Supporting Actor in a Drama Series: Edward James Olmos, Miami Vice* (NBC)
Outstanding Supporting Actress in a Drama Series: Betty Thomas, Hill Street Blues* (NBC)
Outstanding Supporting Actor in a Comedy Series: John Larroquette, Night Court (NBC)
Outstanding Supporting Actress in a Comedy Series: Rhea Perlman, Cheers* (NBC)
Outstanding Directing in a Drama Series: Karen Arthur, Cagney and Lacey* (CBS)
Outstanding Directing in a Comedy Series: Jay Sandrich, The Cosby Show* (NBC)
Outstanding Directing in a Variety or Music Program: Terry Hughes, Sweeney Todd: Great Performances (PBS)
1985-1986
Outstanding Drama Series: Cagney and Lacey* (CBS)
Outstanding Comedy Series: The Golden Girls* (NBC)
Outstanding Miniseries: Peter the Great (NBC)
Outstanding Drama/Comedy Special: Love Is Never Silent: Hallmark Hall of Fame (NBC)
Outstanding Variety, Music, or Comedy Program: The Kennedy Center Honors: A Celebration of the Performing Arts (CBS)
Outstanding Lead Actor in a Drama Series: William Daniels, St. Elsewhere* (NBC)
Outstanding Lead Actress in a Drama Series: Sharon Gless, Cagney and Lacey* (CBS)
Outstanding Lead Actor in a Comedy Series: Michael J. Fox, Family Ties* (NBC)
Outstanding Lead Actress in a Comedy Series: Betty White, The Golden Girls* (NBC)
Outstanding Supporting Actor in a Drama Series: John Karlen, Cagney and Lacey* (CBS)
Outstanding Supporting Actress in a Drama Series: Bonnie Bartlett, St. Elsewhere* (NBC)
Outstanding Supporting Actor in a Comedy Series: John Larroquette, Night Court (NBC)
Outstanding Supporting Actress in a Comedy Series: Rhea Perlman, Cheers* (NBC)
Outstanding Directing in a Drama Series: Georg Stanford Brown, Cagney and Lacey* (CBS)
Outstanding Directing in a Comedy Series: Jay Sandrich, The Cosby Show* (NBC)
Outstanding Directing in a Variety or Music Program: Waris Hussein, Copacabana (CBS)
1986-1987
Outstanding Drama Series: L.A. Law* (NBC)
Outstanding Comedy Series: The Golden Girls* (NBC)
Outstanding Miniseries: A Year in the Life (NBC)
Outstanding Drama/Comedy Special: Promise: Hallmark Hall of Fame (CBS)
Outstanding Variety, Music, or Comedy Program: The 1987 Tony Awards (CBS)
Outstanding Lead Actor in a Drama Series: Bruce Willis, Moonlighting* (ABC)
Outstanding Lead Actress in a Drama Series: Sharon Gless, Cagney and Lacey* (CBS)
Outstanding Lead Actor in a Comedy Series: Michael J. Fox, Family Ties* (NBC)
Outstanding Lead Actress in a Comedy Series: Rue McClanahan, The Golden Girls* (NBC)
Outstanding Supporting Actor in a Drama Series: John Hillerman, Magnum, P.I.* (CBS)
Outstanding Supporting Actress in a Drama Series: Bonnie Bartlett, St. Elsewhere* (NBC)
Outstanding Supporting Actor in a Comedy Series: John Larroquette, Night Court (NBC)
Outstanding Supporting Actress in a Comedy Series: Jackee Harry, 227 (NBC)
Outstanding Directing in a Drama Series: Gregory Hoblit, L.A. Law* (NBC)
Outstanding Directing in a Comedy Series: Terry Hughes, The Golden Girls* (NBC)
Outstanding Directing in a Variety or Music Program: Don Mischer, The Kennedy Center Honors: A Celebration of the Performing Arts (CBS)
1987-1988
Outstanding Drama Series: thirtysomething* (ABC)
Outstanding Comedy Series: The Wonder Years* (ABC)
Outstanding Miniseries: The Murder of Mary Phagan (NBC)
Outstanding Drama/Comedy Special: Inherit the Wind (NBC)
Outstanding Variety, Music, or Comedy Program: Irving Berlin's 100th Birthday Celebration (CBS)
Outstanding Lead Actor in a Drama Series: Richard Kiley, A Year in the Life (NBC)
Outstanding Lead Actress in a Drama Series: Tyne Daly, Cagney and Lacey* (CBS)
Outstanding Lead Actor in a Comedy Series: Michael J. Fox, Family Ties* (NBC)
Outstanding Lead Actress in a Comedy Series: Beatrice Arthur, The Golden Girls* (NBC)
Outstanding Supporting Actor in a Drama Series: Larry Drake, L.A. Law* (NBC)
Outstanding Supporting Actress in a Drama Series: Patricia Wettig, thirtysomething* (ABC)
Outstanding Supporting Actor in a Comedy Series: John Larroquette, Night Court (NBC)
Outstanding Supporting Actress in a Comedy Series: Estelle Getty, The Golden Girls* (NBC)
Outstanding Directing in a Drama Series: Mark Tinker, St. Elsewhere* (NBC)
Outstanding Directing in a Comedy Series: Gregory Hoblit, Hooperman (ABC)
Outstanding Directing in a Variety or Music Program: Patricia Birch and Humphrey Burton, Celebrating Gershwin: Great Performances (PBS)
1988-1989
Outstanding Drama Series: L.A. Law* (NBC)
Outstanding Comedy Series: Cheers* (NBC)
Outstanding Miniseries: War and Remembrance (ABC)
Outstanding Drama/Comedy Special: Day One: AT&T Presents (CBS)
Outstanding Variety, Music, or Comedy Program: The Tracey Ullman Show (FOX)
Outstanding Lead Actor in a Drama Series: Carroll O'Connor, In the Heat of the Night (NBC)
Outstanding Lead Actress in a Drama Series: Dana Delany, China Beach (ABC)
Outstanding Lead Actor in a Comedy Series: Richard Mulligan, Empty Nest (NBC)
Outstanding Lead Actress in a Comedy Series: Candice Bergen, Murphy Brown (CBS)
Outstanding Supporting Actor in a Drama Series: Larry Drake, L.A. Law* (NBC)
Outstanding Supporting Actress in a Drama Series: Melanie Mayron, thirtysomething* (ABC)
Outstanding Supporting Actor in a Comedy Series: Woody Harrelson, Cheers* (NBC)
Outstanding Supporting Actress in a Comedy Series: Rhea Perlman, Cheers* (NBC)
Outstanding Directing in a Drama Series: Robert Altman, Tanner '88* (HBO)
Outstanding Directing in a Comedy Series: Peter Baldwin, The Wonder Years* (ABC)
Outstanding Directing in a Variety or Music Program: Jim Henson, The Jim Henson Hour (NBC)
1989-1990
Outstanding Drama Series: L.A. Law* (NBC)
Outstanding Comedy Series: Murphy Brown (CBS)
Outstanding Miniseries: Drug Wars: The Camarena Story (NBC)
Outstanding Drama/Comedy Special (tie): Caroline? Hallmark Hall of Fame (CBS); The Incident: AT&T Presents (CBS)
Outstanding Variety, Music, or Comedy Series: In Living Color (FOX)
Outstanding Lead Actor in a Drama Series: Peter Falk, Columbo (ABC)
Outstanding Lead Actress in a Drama Series: Patricia Wettig, thirtysomething* (ABC)
Outstanding Lead Actor in a Comedy Series: Ted Danson, Cheers* (NBC)
Outstanding Lead Actress in a Comedy Series: Candice Bergen, Murphy Brown (CBS)
Outstanding Supporting Actor in a Drama Series: Jimmy Smits, L.A. Law* (NBC)
Outstanding Supporting Actress in a Drama Series: Marg Helgenberger, China Beach (ABC)
Outstanding Supporting Actor in a Comedy Series: Alex Rocco, The Famous Teddy Z (CBS)
Outstanding Supporting Actress in a Comedy Series: Bebe Neuwirth, Cheers* (NBC)
Outstanding Directing in a Drama Series (tie): Thomas Carter, Equal Justice (ABC); Scott Winant, thirtysomething* (ABC)
Outstanding Directing in a Comedy Series: Michael Dinner, The Wonder Years* (ABC)
Outstanding Directing in a Variety or Music Program: Dwight Hemion, The Kennedy Center Honors (CBS)
■ Legislation: Major U.S. Legislation

1980
Paperwork Reduction Act: Established the Office of Information and Regulatory Affairs within the Office of Management and Budget to oversee new paperwork requirements and to create means to reduce paperwork.
Staggers Rail Act: Lessened restrictions on mergers and abandonment of rail lines; established policy to reduce regulation of railroads and permit the market to set rates; repealed antitrust immunity for collectively set rates.
Judicial Conduct and Disability Act: Allowed the chief judge and governing council of each federal judicial circuit to investigate allegations of wrongdoing and to impose sanctions against judges or magistrates in that circuit.
Refugee Act: Created the Office of U.S. Coordinator of Refugee Affairs within the Department of Health and Human Services; revised policies for admitting and resettling refugees in the United States.
Superfund Act: Authorized the federal government to establish procedures for cleaning up toxic waste sites; earmarked $1.6 billion for an emergency fund to assist cleanup of targeted toxic waste sites.
Debt Limit Law: Permitted the public debt limit to extend beyond June 30; eliminated the oil import fee imposed by the Carter administration; became law pursuant to override of President Jimmy Carter's veto.
Veterans Affairs Personnel Act: Promoted recruitment and retention of health care personnel in the Veterans Administration; became law pursuant to override of President Jimmy Carter's veto.
Motor Carrier Act: Loosened the procedures enabling truck firms to acquire operating authority from the Interstate Commerce Commission; eased restrictions on certain truck carriers; made it federal policy to ensure competition in the trucking industry; paved the way for deregulation.
Chrysler Loan Guarantee Act: Specified how the federal government would approve and administer a $1.5 billion loan to the Chrysler Corporation; required a series of concessions by Chrysler, including development of an energy plan.
Privacy Protection Act: Prohibited all law enforcement officers from using warrants to search the offices of legitimate news organizations; required the attorney general to draft new guidelines for federal searches of certain individuals.

1981
Economic Recovery Tax Act: Authorized universal cuts in tax rates; reduced or eliminated taxes on various forms of income.
Social Security Amendments: Restored previously canceled minimum benefits for recipients of Social Security; allowed borrowing among trust funds to ensure payments.
Veterans Health Care, Training, and Small Business Loan Act: Approved nursing home or hospital care for veterans who were exposed to Agent Orange or other herbicides; extended the period for Vietnam veterans to request readjustment counseling; permitted small businesses owned by disabled or Vietnam-era veterans to obtain loans from the Veterans Administration.
International Security and Development Cooperation Act: Authorized two-year appropriations, including credits and loans, for security and development assistance programs; directed the president to review items on the U.S. Munitions List in order to eliminate unnecessary export controls.
Uniformed Services Pay Act: Increased pay, allowances, and benefits to members of the armed services and their dependents; added three occupations to the list of hazardous-duty positions.
Education Consolidation and Improvement Act: Consolidated programs, creating a single block grant to states for elementary and secondary schools; continued programs for some disadvantaged children.
Municipal Wastewater Treatment Construction Grant Amendments: Reauthorized the federal sewer construction grant program through fiscal year 1985; directed states to conduct a review of water quality standards.
National Tourism Policy Act: Created the U.S. Travel and Tourism Administration within the Department of Commerce; augmented the federal government's role in promoting foreign travel in the United States.
Steel Industry Compliance Extension Act: Enacted procedures to permit the steel industry to extend Clean Air Act compliance deadlines.

1982
Voting Rights Act Amendments: Renewed selected provisions of the 1965 Voting Rights Act for another twenty-five years; established guidelines for eliminating the preclearance requirement; approved funding for bilingual election materials for specified groups.
Boland Amendments: Prohibited U.S. assistance to paramilitary groups trying to overthrow the government of Nicaragua or to provoke hostilities between Nicaragua and Honduras.
Bus Regulatory Reform Act: Deregulated certain aspects of the passenger bus industry; mandated cooperation between the Interstate Commerce Commission and the states to monitor intrastate compliance with the act.
Small Business Innovation Development Act: Reserved a specified portion of federal agency research and development budgets for small businesses; required participating agencies to establish small business innovation programs.
Fiscal Year 1983 Supplemental Appropriations: Provided $14.2 billion in new budget authority; enacted after an override of President Ronald Reagan's veto.
Federal Courts Improvement Act: Created the Court of Appeals for the Federal Circuit (CAFC); granted the CAFC jurisdiction over appeals of patent cases and of Merit Systems Protection Board decisions.
Prompt Payment Act: Required federal agencies to pay an interest penalty on overdue payments for rental property or services.
Intelligence Identities Protection Act: Designed to protect the identity of undercover intelligence agents; added a new provision to the National Security Act of 1947 specifying penalties for identifying secret agents.
Export Trading Company Act: Established the Office of Export Trade in the Department of Commerce; promoted increased export of American goods and services; authorized the Export-Import Bank to furnish loans to export trading companies.
Copyright Law Amendment: Extended until 1986 the manufacturing clause of the U.S. Copyright Law; enacted pursuant to override of President Ronald Reagan's veto.
Garn-St. Germain Depository Institutions Act: Created a three-year program to enable failing federally insured financial institutions to receive government notes; expanded powers of the Federal Deposit Insurance Corporation and Federal Savings and Loan Insurance Corporation to arrange mergers of banks and savings and loan organizations.

1983
Surface Transportation Assistance Act: Authorized $53.6 billion for highway construction and repair and $17.8 billion for mass transit systems during fiscal years 1983-1986.
Nuclear Waste Policy Act of 1982: Required the president to recommend two sites to be permanent federal repositories for nuclear waste; exempted nuclear waste produced by defense programs from most of the law's features.
Martin Luther King Day: Established the third Monday in January as a federal holiday honoring the legacy of civil rights leader Martin Luther King, Jr.
Social Security Amendments: Raised the retirement age of Social Security recipients from sixty-five to sixty-seven by 2027; made future cost-of-living adjustments payable in January; increased payroll taxes for employers and employees.
Lebanon Emergency Assistance Act: Authorized supplemental assistance to Lebanon to promote stability and sovereignty; required the president to obtain statutory permission from Congress before expanding American armed forces in Lebanon.
Dairy and Tobacco Adjustment Act: Initiated a program to reduce milk production over a fifteen-month period; froze tobacco price support levels; permitted the secretary of agriculture to sell poor-quality feed corn at reduced rates to farmers and ranchers in drought areas.
U.S. Commission on Civil Rights Act: Restructured the Commission on Civil Rights and renewed the commission through October 31, 1989.
Federal Anti-Tampering Act: Imposed federal penalties for persons found guilty of tampering with consumer products, labels, or containers that affect interstate or foreign commerce.
Extension of Trade Adjustment Act: Provided a two-year extension of the Trade Adjustment Assistance program under the Trade Act of 1974, which furnished financial assistance and training to workers and businesses hurt by competition from imports.
Fiscal Year 1984 Department of Defense Authorization: Authorized $187 billion for most Department of Defense activities, including $2.1 billion for production of MX missiles.

1984
Water Resources Research Act: Released funds for research and development to protect water resources; authorized states to establish water resource centers at land grant institutions; enacted pursuant to override of President Ronald Reagan's veto.
Insider Trading Sanctions Act: Amended the Securities Exchange Act of 1934 to increase penalties for buying stocks with nonpublic information.
Cable Communications Policy Act: Forbade local television and telephone companies from owning local cable interests unless approved by the Federal Communications Commission; limited the authority of cities to regulate basic cable television rates to two years.
National Organ Transplant Act: Earmarked funds for a national computerized network to match organ donors and recipients; prohibited the purchase or sale of human organs for transplantation.
Retirement Equity Act: Lowered to twenty-one the age at which employees could participate in private pension plans; permitted workers to retain pension rights after leaving and returning to a job.
Child Abuse Amendments: Approved funds for child abuse prevention programs, to encourage adoption of disabled children, and for matching grants to assist states in providing domestic violence treatment and prevention programs.
Hazardous and Solid Waste Amendments: Reauthorized the Solid Waste Disposal Act for four years; required the Environmental Protection Agency to enact regulations for certain generators of hazardous waste by March 31, 1986; prohibited disposal of hazardous waste close to underground drinking water sources.
Deficit Reduction Act: Increased tax revenues by deferring certain tax reductions; amended several federal programs to reduce spending.
Drug Price Competition and Patent Term Restoration Act: Loosened the Food and Drug Administration's drug application procedures in order to expedite approval of generic drugs; specified patent protection provisions.
Veterans Health Care Act: Authorized the Veterans Administration to establish treatment programs for Vietnam veterans suffering from post-traumatic stress disorder.

1985
Public Health Service Act Amendment: Revised and extended provisions under the Public Health Service Act relating to the National Institutes of Health and the National Research Institutes; enacted pursuant to override of President Ronald Reagan's veto.
Food Security Act: Reduced federal price supports and direct income subsidies to farmers for the 1986-1990 fiscal years; exempted certain Department of Agriculture export financing programs from federal shipment requirements.
Gramm-Rudman-Hollings Act: Amended the Congressional Budget and Impoundment Control Act of 1974; required the federal government to meet annual targets to eliminate the deficit within five years; specified procedures for cuts if annual deficit reduction targets were not met. The Supreme Court in 1986 ruled portions of this law unconstitutional.
Clark Amendment Repeal: Repealed a section of the International Security and Development Act of 1980 that prohibited assistance for military operations in Angola.
U.S.-China Nuclear Cooperation Agreement: Established a process for implementation of a nuclear cooperation agreement between the United States and China; mandated that a report be sent to Congress detailing China's nonproliferation policies; cleared the way for sale of nonmilitary nuclear technology to China.

1986
Tax Reform Act: Simplified the federal tax system by replacing the existing fourteen tax brackets with two; revised the tax code by curtailing or eliminating dozens of tax breaks.
Comprehensive Anti-Apartheid Act: Prohibited loans to and investments in South Africa; imposed a series of economic sanctions against South Africa; enacted pursuant to override of a veto by President Ronald Reagan.
Age Discrimination in Employment Amendments: Forbade most employers from setting mandatory retirement ages.
Anti-Drug Abuse Act: Increased penalties for drug-related crime; created new offenses; expanded enforcement in the United States and in drug-producing countries.
Immigration Reform and Control Act: Required that employers verify the status of all workers; made it a crime for employers to knowingly recruit or hire illegal immigrants.
Superfund Amendments and Reauthorization Act: Established an $8.5 billion fund to clean up the most dangerous hazardous waste sites; formulated a five-year schedule for the Environmental Protection Agency to start cleaning up the 375 worst hazardous waste sites; created new taxes on petroleum and raw chemicals.
Safe Drinking Water Act Amendments: Renewed the Safe Drinking Water Act of 1974 for five years; directed the Environmental Protection Agency to set maximum contaminant standards for water supplies; created a schedule for regulating certain toxic pollutants within three years.
Electronic Communications Privacy Act: Extended privacy protection to electronic mail, cellular phones, computer transmissions, paging devices, and private satellite transmissions.
Firearms Owners Protection Act: Removed the ban on interstate sale of rifles and shotguns; barred the establishment of a national firearms registration system; restricted conditions under which federal officials could seize firearms or ammunition.
Goldwater-Nichols Act: Reorganized the Department of Defense by designating the chairman of the Joint Chiefs of Staff as chief military adviser to the president and the secretary of defense; authorized the chairman to assume strategic planning and budgeting responsibilities; consolidated duplicate functions.

1987
Water Quality Act: Renewed the Federal Water Pollution Control Amendments of 1972 for ten years; furnished construction grants for state and local sewage treatment facilities; enacted pursuant to override of presidential veto.
Balanced Budget and Emergency Deficit Control Reaffirmation Act: Increased the public debt authority from $2.1 trillion to $2.8 trillion; established new federal budget deficit targets for the 1988-1993 fiscal years.
Airport and Airway Safety and Capacity Extension Act: Provided $20.1 billion for airport renewal measures; set a target for hiring air traffic controllers; extended taxes and fees to finance the Airport and Airway Trust Fund.
McKinney Homeless Assistance Act: Established the Interagency Council on the Homeless to coordinate homeless assistance; authorized funding for two years for a broad range of programs to help homeless persons.
New G.I. Bill Continuation Act: Created a permanent educational benefits program for persons who began military service after June 30, 1985; specified active duty requirements to qualify for the program.
Surface Transportation Act: Authorized funds for construction of highways; approved mass transit funds; expanded and improved the relocation assistance program; enacted pursuant to override of presidential veto.

1988
Trademark Law Revision Act: Permitted American companies to file a trademark application with the U.S. Patent and Trademark Office, with certain deadlines noted.
Veterans Judicial Review Act: Created a new review procedure for veterans' claim cases; authorized the Board of Veterans Appeals to make initial decisions in these cases; established the U.S. Court of Appeals for Veterans Claims to review the board's decisions; specified membership requirements for the Board of Veterans Appeals.
Civil Rights Restoration Act: Renewed the broad scope of coverage under Title IX of the Education Amendments of 1972, the Rehabilitation Act of 1973, the Age Discrimination Act of 1975, and Title VI of the Civil Rights Act of 1964; enacted pursuant to override of veto by President Ronald Reagan.
Fair Housing Amendments: Authorized the Department of Housing and Urban Development to penalize those who discriminated in the sale or rental of housing; barred discriminatory housing practices against the handicapped and families with young children.
Women's Business Ownership Act: Created a Small Business Administration program to provide bank loans to small businesses owned by women; established a Women's Business Council to monitor government assistance to women-owned businesses.
Family Support Act: Started a Job Opportunities and Basic Skills Program for recipients of Aid to Families with Dependent Children; strengthened child support enforcement procedures; approved projects to examine how to lessen welfare dependence.
Berne Convention Implementation Act: Permitted the United States to participate in the Berne Convention for the Protection of Literary and Artistic Works; revised certain American copyright laws to conform with the convention.
Video Privacy Act: Prohibited the disclosure of names, addresses, and other information about patrons who rented or purchased videotapes from video stores, with limited exceptions.
Department of Veterans Affairs Act: Elevated the Veterans Administration to a cabinet department; specified an appointment process for a secretary of veterans affairs and certain department officials.
Japanese American Reparations Act: Approved a payment of $20,000 to each surviving Japanese American who had been interned in a relocation camp during World War II; permitted payment to some internees' descendants; offered a government apology for internment.

1989
Whistleblower Protection Act: Established the Office of Special Counsel as an independent federal agency to investigate allegations of retaliation against government employees who expose waste or fraud; permitted complaints to be filed with the Merit Systems Protection Board.
Ethics Reform Act: Began an annual automatic pay adjustment procedure for members of Congress based on private sector pay; specified reduction or elimination of speaking fees and honoraria; eliminated a provision that allowed members of the House of Representatives who had been in office since 1980 to convert campaign funds to personal use.
Financial Institutions Reform, Recovery, and Enforcement Act: Approved allotment of $50 billion over three years to sell or close insolvent savings and loan institutions; created the Resolution Trust Corporation to take over failed thrift organizations and sell their assets; abolished the Federal Home Loan Bank Board and the Federal Savings and Loan Insurance Corporation; established the Office of Thrift Supervision within the Department of the Treasury to supervise thrifts.
Bipartisan Accord on Central America Act of 1989: Authorized the president to transfer funds to the Agency for International Development to provide humanitarian assistance to Nicaraguan resistance forces.
Omnibus Budget Reconciliation Act: Approved $14.7 billion for deficit reduction; removed the U.S. Postal Service from deficit reduction act requirements.
Department of Housing and Urban Development Reform Act: Required the Department of Housing and Urban Development (HUD) to distribute housing program funds according to a revised formula; mandated that certain personnel who dealt with HUD report their earnings, based on their salaries; permitted civil fines for violations of HUD mortgage programs.
Samuel B. Hoff
■ Legislation: U.S. Supreme Court Decisions

1980
Fullilove v. Klutznick: Chief Justice Warren Burger wrote the majority opinion, which upheld Congress's right to set aside 10 percent of federal public works funding for minority contractors. This decision reaffirmed Congress's right to set racial quotas to combat discrimination.
Harris v. McRae: In what would become the first of a series of decisions on abortion during the 1980's, the justices upheld a federal law barring the use of Medicaid funds for abortions, except when the mother's life was in danger and in cases of rape or incest. The 5-4 decision held that a woman's right to terminate a pregnancy did not entitle her to receive government funding for that choice.
Lewis v. United States: In a 6-3 decision, the justices upheld Congress's authority to prohibit convicted felons from owning firearms. The Court ruled that Congress could rationally conclude that any felony conviction is sufficient basis on which to prohibit the possession of a firearm.
Richmond Newspapers v. Virginia: By a ruling of 7-1, the Court determined that a trial judge's order to close the courtroom to the public and media during a murder trial was unconstitutional. The opinion maintained that the arbitrary closing of a courtroom to avoid unwanted publicity violated the First Amendment, and the closure of court hearings was permissible only under unusual circumstances.

1981
Heffron v. International Society for Krishna Consciousness: Chief Justice Burger's majority opinion held that state fair organizers did not violate a religious organization's First Amendment rights when they required the group to distribute its literature at a fixed location. Members of the Krishna religion argued that the rule suppressed their practice of distributing religious literature and soliciting donations in public places. The Court maintained that the religious and free speech rights of the Krishna members were not violated because fair organizers treated all groups the same, regardless of their religious or political affiliations.
Metromedia v. City of San Diego: In his majority opinion, Justice Byron White ruled that an ordinance banning billboards within San Diego city limits violated the First Amendment. The Court determined that the city's need for public safety and appearance was insufficient to justify a ban on outdoor advertising that was used by politicians and businesses.
Rostker v. Goldberg: This case was a challenge to a federal law that required men—but not women—to register for possible military service. The majority opinion, written by Justice William Rehnquist, held that the law did not violate the Constitution or discriminate against women because the draft was based on the need for combat troops and not on equity.
United States Postal Service v. Council of Greenburgh Civic Associations: The Court found that a federal law prohibiting the delivery of unstamped material to private mailboxes was constitutional and did not violate the First Amendment rights of groups seeking to deposit messages at private homes without paying postage.

1982
Island Trees School District v. Pico: In his majority opinion, Justice William Brennan ruled that a local school board violated the First Amendment when it ordered the removal of books from school libraries because the board found the books to be "anti-American, anti-Christian, anti-Semitic, and just plain filthy." The justices determined that school officials could not remove books from school libraries simply because they disliked the ideas contained in those books.
New York v. Ferber: In a unanimous opinion, the Court upheld a state law that made it a crime to own or sell child pornography. The justices defined child pornography as the visual depiction of sexual conduct by children without serious literary, artistic, political, or scientific value, and they ruled that child pornography—like obscenity—was not protected by the First Amendment.

1983
Akron v. Akron Center for Reproductive Health: One of three abortion decisions handed down on June 15, 1983. This 6-3 opinion overturned portions of an Akron, Ohio, ordinance that required the parents of unmarried minors under the age of fifteen to be notified and to give their consent before the minors could have an abortion. The decision also maintained it was unconstitutional to require a woman to sign a consent form and wait twenty-four hours before she could have an abortion.
Planned Parenthood Association of Kansas City v. Ashcroft: Decided on the same day as Akron v. Akron Center for Reproductive Health, this 5-4 opinion upheld a Missouri state law requiring minors under the age of eighteen to obtain parental consent for their abortions. While it ruled that the Akron, Ohio, parental consent ordinance was unconstitutional, the Court upheld the Missouri parental consent law because it met the standard the Court had specified in a 1979 decision.
Simopoulos v. Virginia: In the third abortion decision delivered on June 15, 1983, the Court upheld the criminal conviction of a physician for violating a Virginia law that required all post-first-trimester abortions to be performed in hospitals. By an 8-1 ruling, the justices determined the law was constitutional because it allowed for the licensing of clinics as well as full-care hospitals, which made the law less restrictive than the laws struck down in Akron v. Akron Center for Reproductive Health and Planned Parenthood v. Ashcroft. The doctor would have avoided criminal prosecution if his clinic had been licensed.
Bob Jones University v. United States: Eight of the nine justices upheld the Internal Revenue Service's (IRS) authority to deny tax-exempt status for private religious schools that practiced racial discrimination. The Court determined that the IRS did not violate Bob Jones University's First Amendment rights because the government's interest in the eradication of racial discrimination outweighed a school's need for tax-exempt status when that school discriminated on the basis of race.
Bolger v. Youngs Drug Products: Considered a major decision in the First Amendment protection of commercial speech, this ruling overturned a federal law that made it a crime to send unsolicited advertisements for contraceptives through the U.S. mail.
Equal Employment Opportunity Commission v. Wyoming: This decision extended the federal law prohibiting discrimination on the basis of age to apply to employees of state government agencies.
Metropolitan Edison v. People Against Nuclear Energy and Nuclear Regulatory Commission: The Court ruled that the Nuclear Regulatory Commission was not required to consider the psychological health and well-being of a community when deciding where to locate a nuclear power plant.
Pacific Gas & Electric Co. v. State Energy Resources Conservation and Development Commission: This case challenged California's authority to place a moratorium on the construction of nuclear power plants. The Court ruled that states were free to ban future nuclear power plants as long as the ban was motivated by economic reasons and not by considerations of safety, which were the responsibility of the federal government.
Mueller v. Allen: The majority opinion, written by Chief Justice Burger, upheld a Minnesota law allowing parents of children in public or private schools to obtain a tuition tax reduction when paying their state income tax. This 5-4 decision held that a state tax deduction for education expenses was constitutional, even though parochial schools would reap most of the benefits.

1984
Federal Communications Commission v. League of Women Voters of California: In a 5-4 opinion, the Court struck down a federal regulation prohibiting any noncommercial educational station that received government funding from engaging in editorializing. The justices ruled that this regulation violated the free speech rights of public broadcasters because it curtailed the expression of editorial opinion that was at "the heart of First Amendment protection."
Grove City College v. Bell: The Court upheld a federal requirement that colleges and universities receiving federal funding must comply with a federal law prohibiting sex discrimination in "any education program or activity receiving federal financial assistance." The justices ruled that this requirement did not violate the First Amendment rights of colleges and their students.
Lynch v. Donnelly: By a 5-4 ruling, the Court held that an annual city park Christmas display featuring a nativity scene was constitutional because the scene was displayed with other Christmas symbols and was used to promote retail sales and goodwill—not to endorse a particular religion. The case arose because Daniel Donnelly, a resident of Pawtucket, Rhode Island, objected to the city's display and sued Pawtucket's mayor, Dennis Lynch.
Regan v. Time, Inc.: Writing for a majority of the justices, Chief Justice Burger overturned as unconstitutional part of a federal law designed to curb counterfeiting. The Court ruled that Time magazine could publish illustrations of United States currency as long as the illustrations were not in color and not shown in actual size.
Roberts v. United States Jaycees: A unanimous Court upheld a Minnesota law barring private clubs from discriminating against women. The justices ruled that the United States Jaycees was not a private club and, therefore, could not exclude women from its membership. The Court would reach the same conclusion in Rotary International v. Rotary Club of Duarte (1987).
Sony Corp. of America v. Universal City Studios, Inc.: Justice John Paul Stevens delivered the 5-4 opinion in a case addressing entertainment corporations' concerns about video piracy. The Court ruled that home use of videocassette recorders (VCRs) to tape television programs for later viewing did not violate federal copyright law. Justices maintained that a VCR manufacturer's sale of home VCRs that were later used to record television programs did not violate the copyrights of these programs' producers.

1985
New Jersey v. T.L.O.: This 6-3 opinion concluded that, in general, the Fourth Amendment ban on unreasonable searches applied to searches by public school officials, as well as by law enforcement personnel. However, the Court determined that in this case the search of a student's purse by public school officials did not violate the student's civil rights.
Wallace v. Jaffree: This decision struck down an Alabama law that allowed public school teachers to hold a one-minute period of silence for "meditation or voluntary prayer" each day. The Court determined that the law had no secular purpose and endorsed religion in violation of the First Amendment's establishment clause separating church and state.
Thornton v. Caldor: This case challenged the constitutionality of Connecticut's Sabbath laws, which prevented private companies from forcing employees to work on Sunday or any day that would be the employees' Sabbath. The Court declared the law unconstitutional and ruled that private companies are free to fire any employees who refuse to work on any day they consider to be their Sabbath, because the First Amendment's guarantee of freedom of religion applied only to the government and not to private employers.
American Booksellers Association v. Hudnut: The Court struck down as unconstitutional a city ordinance that banned pornography on the grounds that pornography violated women's civil rights by portraying them as sex objects.

1986
Batson v. Kentucky: In his majority opinion, Justice Lewis Powell ruled that attorneys who rejected prospective jurors solely on the basis of their race violated the equal protection clause of the Fourteenth Amendment. The Court concluded that racial discrimination in jury selection damaged the community by "undermining public confidence" in the justice system.
Bethel School District v. Fraser: In a 7-2 decision, the Court upheld a school district's suspension of a high school student for delivering a speech containing "elaborate, graphic, and explicit sexual" metaphors. The opinion determined that the First Amendment did not prevent school officials from prohibiting vulgar and lewd speech that would undermine the school's basic educational mission.
Bowers v. Hardwick: This controversial decision upheld a Georgia sodomy law that made it a crime to engage in homosexual acts, even in the privacy of the home. The case involved a homosexual man arrested in his bedroom. Gay rights groups referred to this case as their "Dred Scott decision," comparing it to Dred Scott v. Sandford (1857), in which the Court ruled that African Americans were not "citizens" entitled to constitutional protection.
Goldman v. Weinberger: Ruling 5-4, the justices upheld U.S. Air Force penalties against a Jewish chaplain who wore a yarmulke (skull cap) while on duty in defiance of the military's uniform regulations. The Court ruled that the military's interest in uniformity outweighed an individual's religious beliefs.
Renton v. Playtime Theatres: In this pornography case, the Court held that communities could restrict the location of X-rated movie theaters to sites away from homes, schools, churches, and parks. This ruling was consistent with its 1986 ruling in American Booksellers Association v. Hudnut, because in both decisions the justices reaffirmed that sexually explicit materials—unlike obscenity—deserved some First Amendment protection but less protection than other kinds of speech, especially political speech.
Thornburgh v. American College of Obstetricians and Gynecologists: Citing its 1983 decisions in Akron v. Akron Center for Reproductive Health and Planned Parenthood v. Ashcroft, the Court overturned portions of the Pennsylvania Abortion Control Act of 1982, finding that these provisions infringed on a woman's fundamental right to an abortion. The majority opinion by Justice Harry A. Blackmun stated that it was unconstitutional to give a woman information designed to dissuade her from having an abortion; the decision maintained it also was unconstitutional to invade a woman's privacy by making information about her abortion available to the public.

1987
Edwards v. Aguillard: In a 7-2 opinion, the Court ruled that Louisiana could not require public schools that taught evolution to also teach creationism as "Creation Science." The opinion concluded the law had no secular purpose and endorsed religion in violation of the Constitution's establishment clause separating church and state.
Rotary International v. Rotary Club of Duarte: The justices upheld a California law that required Rotary Clubs to admit women. The Court found that the state's compelling interest in ending sexual discrimination outweighed the group's right of association.
South Dakota v. Dole: Chief Justice William Rehnquist wrote the majority opinion, in which he ruled as constitutional a federal law that withheld 5 percent of a state's highway funds if that state did not raise its minimum drinking age to twenty-one. The law was upheld because it was passed in the interest of the "general good" and by "reasonable means."

1988
Hazelwood School District v. Kuhlmeier: In a decision that First Amendment advocates considered a major setback in protecting students' free speech rights, the Court concluded that a school principal could censor the contents of a student newspaper if that newspaper was part of a class assignment and not a forum for public discussion.
Lyng v. Northwest Indian Cemetery Protective Association: Justice Sandra Day O'Connor's majority opinion held that the Constitution's free exercise of religion clause did not bar the federal government from harvesting timber in and building a road through a national forest area that Native Americans used for religious purposes.
Hustler Magazine v. Falwell: This unanimous ruling by eight justices was considered a landmark case for freedom of speech. It involved a case in which the Reverend Jerry Falwell sued Hustler Magazine for publishing a fake satirical advertisement poking fun at him and his deceased mother. The Court ruled in favor of the magazine, maintaining that satire and parody were protected forms of free speech.
Thompson v. Oklahoma: The Court, in a 5-3 ruling, vacated the death sentence of a fifteen-year-old who was tried as an adult and convicted of murder. The justices ruled that imposing the death penalty against juveniles under the age of sixteen was a form of cruel and unusual punishment.
Webster v. Doe: This decision allowed a former Central Intelligence Agency (CIA) employee to sue the CIA for firing him because he was a homosexual, which the CIA claimed made him a threat to national security. In a 6-2 ruling, the justices concluded that dismissed employees could sue the CIA if they believed their constitutional rights had been violated.

1989
Allegheny County v. Greater Pittsburgh American Civil Liberties Union (ACLU): Writing for the majority, Justice Blackmun held that displaying a nativity scene inside a Pittsburgh, Pennsylvania, courthouse endorsed religion and, therefore, violated the Constitution's establishment clause separating church and state. This decision differed from the 1984 ruling in Lynch v. Donnelly, which held that a nativity scene that was part of a secular Christmas display for commercial purposes was constitutional.
Sable Communications of California, Inc. v. Federal Communications Commission: Sable Communications, a provider of "dial-a-porn" telephone services both in and outside the metropolitan Los Angeles area, challenged a federal law that banned these sexually oriented calls. The Court ruled that the federal law was unconstitutional because it violated the free speech rights of pornographers.
Texas v. Johnson: In a 5-4 decision, the Court overturned a Texas law that made the desecration of the American flag illegal. The justices maintained that flag burning was a form of symbolic speech and protected under the First Amendment.
Webster v. Reproductive Health Services: This opinion upheld a Missouri law that prohibited the use of state funds to pay for abortions; the law also determined that life began at conception and mandated that unborn children should have the same rights and privileges available to other persons. Pro-choice advocates maintained that by upholding this law, the Court seriously compromised the landmark Roe v. Wade (1973) decision, which legalized abortion.
Eddith A. Dashiell
■ Literature: Best-Selling U.S. Books

1980 Fiction
1. The Covenant, James A. Michener
2. The Bourne Identity, Robert Ludlum
3. Rage of Angels, Sidney Sheldon
4. Princess Daisy, Judith Krantz
5. Firestarter, Stephen King
6. The Key to Rebecca, Ken Follett
7. Random Winds, Belva Plain
8. The Devil's Alternative, Frederick Forsyth
9. The Fifth Horseman, Larry Collins and Dominique Lapierre
10. The Spike, Arnaud de Borchgrave and Robert Moss
1980 Nonfiction
1. Crisis Investing: Opportunities and Profits in the Coming Great Depression, Douglas R. Casey
2. Cosmos, Carl Sagan
3. Free to Choose: A Personal Statement, Milton and Rose Friedman
4. Anatomy of an Illness as Perceived by the Patient, Norman Cousins
5. Thy Neighbor's Wife, Gay Talese
6. The Sky's the Limit, Dr. Wayne W. Dyer
7. The Third Wave, Alvin Toffler
8. Craig Claiborne's Gourmet Diet, Craig Claiborne with Pierre Franey
9. Nothing Down, Robert Allen
10. Shelley: Also Known as Shirley, Shelley Winters
1981 Fiction
1. Noble House, James Clavell
2. The Hotel New Hampshire, John Irving
3. Cujo, Stephen King
4. An Indecent Obsession, Colleen McCullough
5. Gorky Park, Martin Cruz Smith
6. Masquerade, Kit Williams
7. Goodbye, Janette, Harold Robbins
8. The Third Deadly Sin, Lawrence Sanders
9. The Glitter Dome, Joseph Wambaugh
10. No Time for Tears, Cynthia Freeman

1981 Nonfiction
1. The Beverly Hills Diet, Judy Mazel
2. The Lord God Made Them All, James Herriot
3. Richard Simmons' Never-Say-Diet Book, Richard Simmons
4. A Light in the Attic, Shel Silverstein
5. Cosmos, Carl Sagan
6. Better Homes and Gardens New Cook Book
7. Miss Piggy's Guide to Life, Miss Piggy as told to Henry Beard
8. Weight Watchers 365-Day Diet Menu Cookbook
9. You Can Negotiate Anything, Herb Cohen
10. A Few Minutes with Andy Rooney, Andrew A. Rooney

1982 Fiction
1. E.T.: The Extra-Terrestrial Storybook, William Kotzwinkle
2. Space, James A. Michener
3. The Parsifal Mosaic, Robert Ludlum
4. Master of the Game, Sidney Sheldon
5. Mistral's Daughter, Judith Krantz
6. The Valley of Horses, Jean M. Auel
7. Different Seasons, Stephen King
8. North and South, John Jakes
9. 2010: Odyssey Two, Arthur C. Clarke
10. The Man from St. Petersburg, Ken Follett

1982 Nonfiction
1. Jane Fonda's Workout Book, Jane Fonda
2. Living, Loving, and Learning, Leo Buscaglia
3. And More by Andy Rooney, Andrew A. Rooney
4. Better Homes and Gardens New Cook Book
5. Life Extension: Adding Years to Your Life and Life to Your Years—A Practical Scientific Approach, Durk Pearson and Sandy Shaw
6. When Bad Things Happen to Good People, Harold S. Kushner
7. A Few Minutes with Andy Rooney, Andrew A. Rooney
8. The Weight Watchers Food Plan Diet Cookbook, Jean Nidetch
9. Richard Simmons' Never-Say-Diet Cookbook, Richard Simmons
10. No Bad Dogs: The Woodhouse Way, Barbara Woodhouse

1983 Fiction
1. Return of the Jedi Storybook, Joan D. Vinge, adapter
2. Poland, James A. Michener
3. Pet Sematary, Stephen King
4. The Little Drummer Girl, John Le Carré
5. Christine, Stephen King
6. Changes, Danielle Steel
7. The Name of the Rose, Umberto Eco
8. White Gold Wielder: Book Three of the Second Chronicles of Thomas Covenant, Stephen R. Donaldson
9. Hollywood Wives, Jackie Collins
10. The Lonesome Gods, Louis L'Amour
1983 Nonfiction
1. In Search of Excellence: Lessons from America's Best-Run Companies, Thomas J. Peters and Robert H. Waterman, Jr.
2. Megatrends: Ten New Directions Transforming Our Lives, John Naisbitt
3. Motherhood: The Second Oldest Profession, Erma Bombeck
4. The One Minute Manager, Kenneth Blanchard and Spencer Johnson
5. Jane Fonda's Workout Book, Jane Fonda
6. The Best of James Herriot, James Herriot
7. The Mary Kay Guide to Beauty: Discovering Your Special Look
8. On Wings of Eagles, Ken Follett
9. Creating Wealth, Robert G. Allen
10. The Body Principal: The Exercise Program for Life, Victoria Principal
1984 Fiction
1. The Talisman, Stephen King and Peter Straub
2. The Aquitaine Progression, Robert Ludlum
3. The Sicilian, Mario Puzo
4. Love and War, John Jakes
5. The Butter Battle Book, Dr. Seuss
6. " . . . And the Ladies of the Club," Helen Hooven Santmyer
7. The Fourth Protocol, Frederick Forsyth
8. Full Circle, Danielle Steel
9. The Life and Hard Times of Heidi Abromowitz, Joan Rivers
10. Lincoln: A Novel, Gore Vidal
1984 Nonfiction
1. Iacocca: An Autobiography, Lee Iacocca with William Novak
2. Loving Each Other, Leo Buscaglia
3. Eat to Win: The Sports Nutrition Bible, Robert Haas, M.D.
4. Pieces of My Mind, Andrew A. Rooney
5. Weight Watchers Fast and Fabulous Cookbook
6. What They Don't Teach You at Harvard Business School: Notes from a Street-Smart Executive, Mark H. McCormack
7. Women Coming of Age, Jane Fonda with Mignon McCarthy
8. Moses the Kitten, James Herriot
9. The One Minute Salesperson, Spencer Johnson, M.D., and Larry Wilson
10. Weight Watchers Quick Start Program Cookbook, Jean Nidetch
1985 Fiction 1. The Mammoth Hunters, Jean M. Auel 2. Texas, James A. Michener 3. Lake Wobegon Days, Garrison Keillor 4. If Tomorrow Comes, Sidney Sheldon 5. Skeleton Crew, Stephen King 6. Secrets, Danielle Steel 7. Contact, Carl Sagan 8. Lucky, Jackie Collins 9. Family Album, Danielle Steel 10. Jubal Sackett, Louis L’Amour
1985 Nonfiction 1. Iacocca: An Autobiography, Lee Iacocca with William Novak 2. Yeager: An Autobiography, Chuck Yeager and Leo Janos 3. Elvis and Me, Priscilla Beaulieu Presley with Sandra Harmon 4. Fit for Life, Harvey and Marilyn Diamond 5. The Be-Happy Attitudes, Robert Schuller 6. Dancing in the Light, Shirley MacLaine 7. A Passion for Excellence: The Leadership Difference, Thomas J. Peters and Nancy K. Austin 8. The Frugal Gourmet, Jeff Smith 9. I Never Played the Game, Howard Cosell with Peter Bonventre 10. Dr. Berger’s Immune Power Diet, Stuart M. Berger, M.D.
1986 Fiction 1. It, Stephen King 2. Red Storm Rising, Tom Clancy 3. Whirlwind, James Clavell 4. The Bourne Supremacy, Robert Ludlum 5. Hollywood Husbands, Jackie Collins 6. Wanderlust, Danielle Steel 7. I’ll Take Manhattan, Judith Krantz
8. Last of the Breed, Louis L’Amour 9. The Prince of Tides, Pat Conroy 10. A Perfect Spy, John Le Carré
1986 Nonfiction 1. Fatherhood, Bill Cosby 2. Fit for Life, Harvey and Marilyn Diamond 3. His Way: The Unauthorized Biography of Frank Sinatra, Kitty Kelley 4. The Rotation Diet, Martin Katahn 5. You’re Only Old Once, Dr. Seuss 6. Callanetics: Ten Years Younger in Ten Hours, Callan Pinckney 7. The Frugal Gourmet Cooks with Wine, Jeff Smith 8. Be Happy—You Are Loved!, Robert H. Schuller 9. Word for Word, Andrew A. Rooney 10. James Herriot’s Dog Stories, James Herriot
1987 Fiction 1. The Tommyknockers, Stephen King 2. Patriot Games, Tom Clancy 3. Kaleidoscope, Danielle Steel 4. Misery, Stephen King 5. Leaving Home: A Collection of Lake Wobegon Stories, Garrison Keillor 6. Windmills of the Gods, Sidney Sheldon 7. Presumed Innocent, Scott Turow 8. Fine Things, Danielle Steel 9. Heaven and Hell, John Jakes 10. The Eyes of the Dragon, Stephen King
1987 Nonfiction 1. Time Flies, Bill Cosby 2. Spycatcher: The Candid Autobiography of a Senior Intelligence Officer, Peter Wright with Paul Greengrass 3. Family: The Ties That Bind . . . and Gag!, Erma Bombeck 4. Veil: The Secret Wars of the CIA, 1981-1987, Bob Woodward 5. A Day in the Life of America, Rick Smolan and David Cohen 6. The Great Depression of 1990, Ravi Batra 7. It’s All in the Playing, Shirley MacLaine 8. Man of the House: The Life and Political Memoirs of Speaker Tip O’Neill, Thomas P. O’Neill, Jr., with William Novak 9. The Frugal Gourmet Cooks American, Jeff Smith 10. The Closing of the American Mind, Allan Bloom
1988 Fiction 1. The Cardinal of the Kremlin, Tom Clancy 2. The Sands of Time, Sidney Sheldon 3. Zoya, Danielle Steel 4. The Icarus Agenda, Robert Ludlum 5. Alaska, James A. Michener 6. Till We Meet Again, Judith Krantz 7. The Queen of the Damned, Anne Rice 8. To Be the Best, Barbara Taylor Bradford 9. One: A Novel, Richard Bach 10. Mitla Pass, Leon Uris
1988 Nonfiction 1. The Eight-Week Cholesterol Cure, Robert E. Kowalski 2. Talking Straight, Lee Iacocca with Sonny Kleinfield 3. A Brief History of Time: From the Big Bang to Black Holes, Stephen W. Hawking 4. Trump: The Art of the Deal, Donald J. Trump with Tony Schwartz 5. Gracie: A Love Story, George Burns 6. Elizabeth Takes Off, Elizabeth Taylor 7. Swim with the Sharks Without Being Eaten Alive, Harvey Mackay 8. Christmas in America, David Cohen, editor 9. Weight Watchers Quick Success Program Book, Jean Nidetch 10. Moonwalk, Michael Jackson
1989 Fiction 1. Clear and Present Danger, Tom Clancy 2. The Dark Half, Stephen King 3. Daddy, Danielle Steel 4. Star, Danielle Steel 5. Caribbean, James A. Michener 6. The Satanic Verses, Salman Rushdie 7. The Russia House, John Le Carré 8. The Pillars of the Earth, Ken Follett 9. California Gold, John Jakes 10. While My Pretty One Sleeps, Mary Higgins Clark
1989 Nonfiction 1. All I Really Need to Know I Learned in Kindergarten: Uncommon Thoughts on Common Things, Robert Fulghum 2. Wealth Without Risk: How to Develop a Personal Fortune Without Going Out on a Limb, Charles J. Givens 3. A Woman Named Jackie, C. David Heymann
4. It Was on Fire When I Lay Down on It, Robert Fulghum 5. Better Homes and Gardens New Cook Book 6. The Way Things Work, David Macaulay 7. It’s Always Something, Gilda Radner
8. Roseanne: My Life as a Woman, Roseanne Barr 9. The Frugal Gourmet Cooks Three Ancient Cuisines: China, Greece, and Rome, Jeff Smith 10. My Turn: The Memoirs of Nancy Reagan, Nancy Reagan with William Novak
■ Literature: Major Literary Awards
Nobel Prizes in Literature 1980: Czesław Miłosz, Poland and United States 1981: Elias Canetti, United Kingdom (born in Bulgaria) 1982: Gabriel García Márquez, Colombia 1983: William Golding, United Kingdom 1984: Jaroslav Seifert, Czechoslovakia 1985: Claude Simon, France 1986: Wole Soyinka, Nigeria 1987: Joseph Brodsky, United States (born in the Soviet Union) 1988: Naguib Mahfouz, Egypt 1989: Camilo José Cela, Spain
Pulitzer Prizes 1980 Fiction: The Executioner’s Song by Norman Mailer Drama: Talley’s Folly by Lanford Wilson History: Been in the Storm So Long: The Aftermath of Slavery by Leon F. Litwack Biography: The Rise of Theodore Roosevelt by Edmund Morris Poetry: Selected Poems by Donald Justice
1981 Fiction: A Confederacy of Dunces by John Kennedy Toole Drama: Crimes of the Heart by Beth Henley History: American Education: The National Experience, 1783-1876 by Lawrence A. Cremin Biography: Peter the Great: His Life and World by Robert K. Massie Poetry: The Morning of the Poem by James Schuyler
1982 Fiction: Rabbit Is Rich by John Updike Drama: A Soldier’s Play by Charles Fuller History: Mary Chesnut’s Civil War edited by C. Vann Woodward Biography: Grant: A Biography by William McFeely Poetry: The Collected Poems by Sylvia Plath
1983 Fiction: The Color Purple by Alice Walker Drama: ’night, Mother by Marsha Norman
History: The Transformation of Virginia, 1740-1790 by Rhys L. Isaac Biography: Growing Up by Russell Baker Poetry: Selected Poems by Galway Kinnell
1984 Fiction: Ironweed by William Kennedy Drama: Glengarry Glen Ross by David Mamet History: No award Biography: Booker T. Washington: The Wizard of Tuskegee, 1901-1915 by Louis R. Harlan Poetry: American Primitive by Mary Oliver
1985 Fiction: Foreign Affairs by Alison Lurie Drama: Sunday in the Park with George by Stephen Sondheim and James Lapine History: Prophets of Regulation by Thomas McCraw Biography: The Life and Times of Cotton Mather by Kenneth Silverman Poetry: Yin by Carolyn Kizer
1986 Fiction: Lonesome Dove by Larry McMurtry Drama: No award History: . . . the Heavens and the Earth: A Political History of the Space Age by Walter A. McDougall Biography: Louise Bogan: A Portrait by Elizabeth Frank Poetry: The Flying Change by Henry Taylor
1987 Fiction: A Summons to Memphis by Peter Taylor Drama: Fences by August Wilson History: Voyagers to the West: A Passage in the Peopling of America on the Eve of the Revolution by Bernard Bailyn Biography: Bearing the Cross: Martin Luther King, Jr., and the Southern Christian Leadership Conference by David J. Garrow Poetry: Thomas and Beulah by Rita Dove
1988 Fiction: Beloved by Toni Morrison Drama: Driving Miss Daisy by Alfred Uhry History: The Launching of Modern American Science, 1846-1876 by Robert V. Bruce
Biography: Look Homeward: A Life of Thomas Wolfe by David Herbert Donald Poetry: Partial Accounts: New and Selected Poems by William Meredith
1989 Fiction: Breathing Lessons by Anne Tyler Drama: The Heidi Chronicles by Wendy Wasserstein History: Battle Cry of Freedom: The Civil War Era by James M. McPherson; Parting the Waters: America in the King Years, 1954-1963 by Taylor Branch Biography: Oscar Wilde by Richard Ellmann Poetry: New and Collected Poems by Richard Wilbur
National Book Awards 1980 Autobiography, Hardcover: Lauren Bacall by Myself by Lauren Bacall Autobiography, Paperback: And I Worked at the Writer’s Trade: Chapters of Literary History, 1918-1978 by Malcolm Cowley Biography, Hardcover: The Rise of Theodore Roosevelt by Edmund Morris Biography, Paperback: Max Perkins: Editor of Genius by A. Scott Berg Children’s Book, Hardcover: A Gathering of Days: A New England Girl’s Journal, 1830-1832 by Joan W. Blos Children’s Book, Paperback: A Swiftly Tilting Planet by Madeleine L’Engle Current Interest, Hardcover: Julia Child and More Company by Julia Child Current Interest, Paperback: The Culture of Narcissism by Christopher Lasch Fiction, Hardcover: Sophie’s Choice by William Styron Fiction, Paperback: The World According to Garp by John Irving First Novel: Birdy by William Wharton General Nonfiction, Hardcover: The Right Stuff by Tom Wolfe General Nonfiction, Paperback: The Snow Leopard by Peter Matthiessen General Reference Book, Hardcover: The Complete Directory edited by Elder Witt General Reference, Paperback: The Complete Directory of Prime Time Network TV Shows: 1946-Present by Tim Brooks and Earle Marsh History, Hardcover: The White House Years by Henry A. Kissinger
History, Paperback: A Distant Mirror: The Calamitous Fourteenth Century by Barbara W. Tuchman Mystery, Hardcover: The Green Ripper by John D. MacDonald; Stained Glass by William F. Buckley, Jr. Poetry: Ashes by Philip Levine Religion/Inspiration, Hardcover: The Gnostic Gospels by Elaine Pagels Religion/Inspiration, Paperback: A Severe Mercy by Sheldon Vanauken Science, Hardcover: Gödel, Escher, Bach: An Eternal Golden Braid by Douglas Hofstadter Science, Paperback: The Dancing Wu Li Masters: An Overview of the New Physics by Gary Zukav Science Fiction, Hardcover: Jem by Frederik Pohl Science Fiction, Paperback: The Book of the Dun Cow by Walter Wangerin, Jr. Translation: Hard Labor by Cesare Pavese, edited by William Arrowsmith; Complete Critical Prose and Letters by Osip Mandelstam, edited by Jane Gary Harris and Constance Link Western: Bendigo Shafter by Louis L’Amour
1981 Autobiography/Biography, Hardcover: Walt Whitman by Justin Kaplan Autobiography/Biography, Paperback: Samuel Beckett by Deirdre Bair Children’s Book, Fiction, Hardcover: The Night Swimmers by Betsy Byars Children’s Book, Fiction, Paperback: Ramona and Her Mother by Beverly Cleary Children’s Book, Nonfiction, Hardcover: Oh Boy! Babies by Alison Cragin Herzig and Jane Lawrence Mali Fiction, Hardcover: Plains Song by Wright Morris Fiction, Paperback: The Stories of John Cheever by John Cheever First Novel: Sister Wolf by Ann Arensberg General Nonfiction, Hardcover: China Men by Maxine Hong Kingston General Nonfiction, Paperback: The Last Cowboy by Jane Kramer History, Hardcover: Christianity, Social Tolerance, and Homosexuality by John Boswell History, Paperback: Been in the Storm So Long: The Aftermath of Slavery by Leon F. Litwack Poetry: The Need to Hold Still by Lisel Mueller
Science, Hardcover: The Panda’s Thumb: More Reflections on Natural History by Stephen Jay Gould Science, Paperback: The Medusa and the Snail by Lewis Thomas Translation: The Letters of Gustave Flaubert by Gustave Flaubert, translated by Francis Steegmuller; Evening Edged in Gold by Arno Schmidt, translated by John E. Woods
1982 Autobiography/Biography, Hardcover: Mornings on Horseback by David McCullough Autobiography/Biography, Paperback: Walter Lippmann and the American Century by Ronald Steel Children’s Book, Fiction, Hardcover: Westmark by Lloyd Alexander Children’s Book, Fiction, Paperback: Words by Heart by Ouida Sebestyen Children’s Book, Nonfiction: A Penguin Year by Susan Bonners Children’s Book, Picture Book, Hardcover: Outside Over There by Maurice Sendak Children’s Book, Picture Book, Paperback: Noah’s Ark by Peter Spier Fiction, Hardcover: Rabbit Is Rich by John Updike Fiction, Paperback: So Long, See You Tomorrow by William Maxwell First Novel: Dale Loves Sophie to Death by Robb Forman Dew General Nonfiction, Hardcover: The Soul of a New Machine by Tracy Kidder General Nonfiction, Paperback: Naming Names by Victor S. Navasky History, Hardcover: People of the Sacred Mountain: A History of the Northern Cheyenne Chiefs and Warrior Societies, 1830-1879 by Father Peter John Powell History, Paperback: The Generation of 1914 by Robert Wohl Poetry: Life Supports: New and Collected Poems by William Bronk Science, Hardcover: Lucy: The Beginnings of Humankind by Donald C. Johanson and Maitland A. Edey Science, Paperback: Taking the Quantum Leap: The New Physics for Nonscientists by Fred Alan Wolf Translation: In the Shade of Spring Leaves by Higuchi Ichiyo, translated by Robert Lyons Danly; The
Ten Thousand Leaves: A Translation of the Man’yōshū, Japan’s Premier Anthology of Classical Poetry translated by Ian Hideo Levy
1983 Autobiography/Biography, Hardcover: Isak Dinesen: The Life of a Storyteller by Judith Thurman Autobiography/Biography, Paperback: Nathaniel Hawthorne in His Time by James R. Mellow Children’s Book, Fiction, Hardcover: Homesick: My Own Story by Jean Fritz Children’s Book, Fiction, Paperback: A Place Apart by Paula Fox; Marked by Fire by Joyce Carol Thomas Children’s Book, Nonfiction: Chimney Sweeps by James Cross Giblin Children’s Book, Picture Book, Hardcover: Miss Rumphius by Barbara Cooney; Doctor De Soto by William Steig Children’s Book, Picture Book, Paperback: A House Is a House for Me by Mary Ann Hoberman, illustrated by Betty Fraser Fiction, Hardcover: The Color Purple by Alice Walker Fiction, Paperback: Collected Stories of Eudora Welty by Eudora Welty First Novel: The Women of Brewster Place by Gloria Naylor General Nonfiction, Hardcover: China: Alive in the Bitter Sea by Fox Butterfield General Nonfiction, Paperback: National Defense by James Fallows History, Hardcover: Voices of Protest: Huey Long, Father Coughlin, and the Great Depression by Alan Brinkley History, Paperback: Utopian Thought in the Western World by Frank E. Manuel and Fritzie P. Manuel Original Paperback: The Red Magician by Lisa Goldstein Poetry: Selected Poems by Galway Kinnell; Country Music: Selected Early Poems by Charles Wright Science, Hardcover: “Subtle Is the Lord . . .”: The Science and Life of Albert Einstein by Abraham Pais Science, Paperback: The Mathematical Experience by Philip J. Davis and Reuben Hersh Translation: Les Fleurs du Mal by Charles Baudelaire, translated by Richard Howard
1984 Fiction: Victory over Japan: A Book of Stories by Ellen Gilchrist First Work of Fiction: Stones for Ibarra by Harriet Doerr Nonfiction: Andrew Jackson and the Course of American Democracy, 1833-1845 by Robert V. Remini
1985 Fiction: White Noise by Don DeLillo First Work of Fiction: Easy in the Islands by Bob Shacochis Nonfiction: Common Ground: A Turbulent Decade in the Lives of Three American Families by J. Anthony Lukas
1986 Fiction: World’s Fair by E. L. Doctorow Nonfiction: Arctic Dreams by Barry Lopez
1987 Fiction: Paco’s Story by Larry Heinemann Nonfiction: The Making of the Atomic Bomb by Richard Rhodes
1988 Fiction: Paris Trout by Pete Dexter Nonfiction: A Bright Shining Lie: John Paul Vann and America in Vietnam by Neil Sheehan
1989 Fiction: Spartina by John Casey Nonfiction: From Beirut to Jerusalem by Thomas L. Friedman
Newbery Medal for Best Children’s Book of the Year 1980: A Gathering of Days: A New England Girl’s Journal, 1830-1832 by Joan W. Blos 1981: Jacob Have I Loved by Katherine Paterson 1982: A Visit to William Blake’s Inn: Poems for Innocent and Experienced Travelers by Nancy Willard 1983: Dicey’s Song by Cynthia Voigt 1984: Dear Mr. Henshaw by Beverly Cleary 1985: The Hero and the Crown by Robin McKinley 1986: Sarah, Plain and Tall by Patricia MacLachlan 1987: The Whipping Boy by Sid Fleischman 1988: Lincoln: A Photobiography by Russell Freedman 1989: Joyful Noise: Poems for Two Voices by Paul Fleischman
Canadian Library Association Book of the Year for Children 1980: River Runners by James Houston 1981: The Violin-Maker’s Gift by Donn Kushner 1982: The Root Cellar by Janet Lunn 1983: Up to Low by Brian Doyle 1984: Sweetgrass by Jan Hudson 1985: Mama’s Going to Buy You a Mockingbird by Jean Little 1986: Julie by Cora Taylor 1987: Shadow in Hawthorn Bay by Janet Lunn 1988: A Handful of Time by Kit Pearson 1989: Easy Avenue by Brian Doyle
■ Music: Popular Musicians
Groups and performers followed by an asterisk (*) are subjects of their own entries in The Eighties in America.
Act
Members
Notable 1980’s Songs
Notable Facts
Paula Abdul
“Forever Your Girl,” “Opposites Attract,” “Straight Up”
Originally a Los Angeles Lakers cheerleader and a choreographer, Abdul would later star as a judge with Simon Cowell and Randy Jackson on the hit television talent show American Idol.
Bryan Adams*
“Cuts like a Knife,” “Heaven,” “Summer of ’69”
Adams’s first single was a 1979 disco song, “Let Me Take You Dancing.”
Aerosmith
Steven Tyler, Joe Perry, Tom Hamilton, Joey Kramer, Brad Whitford
“Angel,” “Janie’s Got a Gun,” “Love in an Elevator”
In 1986, Tyler and Perry appeared on Run-D.M.C.’s rap cover of Aerosmith’s 1976 hit “Walk This Way.”
Air Supply
Russell Hitchcock, Graham Russell
“All Out of Love,” “Making Love out of Nothing at All,” “The One That You Love”
Hitchcock and Russell met in a production of Jesus Christ Superstar.
Asia
John Wetton, Geoff Downes, Steve Howe, Carl Palmer
“Don’t Cry,” “Heat of the Moment,” “Only Time Will Tell”
Asia was a “supergroup” consisting of members of the Buggles, Yes, King Crimson, and Emerson, Lake & Palmer.
Bananarama
Sarah Dallin, Siobhan Fahey, Keren Woodward
“Cruel Summer,” “I Heard a Rumor,” “Venus”
Bananarama’s first single, “Aie A Mwana,” was produced by the ex-Sex Pistol Paul Cook.
The Bangles
Susanna Hoffs, Debbi Peterson, Vicki Peterson, Michael Steele
“If She Knew What She Wants,” “Manic Monday,” “Walk like an Egyptian”
Vicki Peterson joined Susan Cowsill of the Cowsills and Peter Holsapple, a former member of the dBs, to form the Continental Drifters during the 1990’s.
Beastie Boys
Mike Diamond, Adam Horovitz, Adam Yauch
“Hey Ladies,” “She’s on It,” “(You Gotta) Fight for Your Right (to Party!)”
The Beastie Boys’ 1986 album Licensed to Ill was one of the biggest-selling debuts of all time.
Pat Benatar
“Hit Me with Your Best Shot,” “Love Is a Battlefield,” “We Belong”
In the early 1970’s, Benatar studied with a voice teacher from the Juilliard School of Music.
Blondie*
Debbie Harry, Chris Stein, Frank Infante, Jimmy Destri, Clem Burke, Nigel Harrison
“Call Me,” “The Tide Is High,” “Rapture”
Lead singer Harry made a successful transition to acting, enjoying featured roles in several films of the 1980’s.
Bon Jovi*
Jon Bon Jovi, Dave Bryan, Richie Sambora, Alec John Such, Tico Torres
“Born to Be My Baby,” “Livin’ on a Prayer,” “You Give Love a Bad Name”
Jon Bon Jovi addressed the Oxford Union debating society in 2001.
David Bowie
“China Girl,” “Let’s Dance,” “Modern Love”
Bowie starred in the stage play The Elephant Man from July, 1980, to January, 1981.
Bobby Brown
“My Prerogative,” “Roni,” “Rock Wit’cha”
A former member of New Edition, Brown later became tabloid fodder as the (eventually former) husband of Whitney Houston.
Jackson Browne
“Lawyers in Love,” “Somebody’s Baby,” “Tender Is the Night”
Browne’s 1985 hit “You’re a Friend of Mine” featured the playing of Bruce Springsteen’s saxophonist Clarence Clemons and the backing vocals of Browne’s then-girlfriend, actress Daryl Hannah.
Belinda Carlisle
“Heaven Is a Place on Earth,” “I Get Weak,” “Mad About You”
Carlisle was the lead singer of the Go-Go’s and later married Morgan Mason, the son of actor James Mason.
The Cars
Ric Ocasek, Benjamin Orr, Elliot Easton, Greg Hawkes, David Robinson
“Drive,” “Shake It Up,” “You Might Think”
The cover of the Cars’ Candy-O album featured a painting by the pin-up girl artist Alberto Vargas.
Peter Cetera
“After All,” “Glory of Love,” “The Next Time I Fall”
Cetera was the lead singer of Chicago from 1969 to 1985.
Cher*
“If I Could Turn Back Time,” “Just like Jesse James,” “We All Sleep Alone”
Cher won the Academy Award for Best Actress for her role in the 1987 film Moonstruck.
Chicago
Robert Lamm, Lee Loughnane, James Pankow, Walter Parazaider, Danny Seraphine, Chris Pinnick, Bill Champlin, Jason Scheff
“Hard Habit to Break,” “Hard to Say I’m Sorry,” “You’re the Inspiration”
Chicago was originally named Chicago Transit Authority, after the city’s rail and bus management agency.
Phil Collins
“Against All Odds (Take a Look at Me Now),” “One More Night,” “Sussudio”
With the help of a supersonic Concorde airplane, Collins performed at both London’s Wembley Stadium and Philadelphia’s JFK Stadium as part of the fund-raising concert Live Aid on July 13, 1985.
The Commodores
Lionel Richie, William King, Ronald LaPread, Thomas McClary, Walter Orange, Milan Williams
“Lady (You Bring Me Up),” “Nightshift,” “Oh No”
The Commodores’ 1985 hit “Nightshift” was a tribute to the late soul singers Marvin Gaye and Jackie Wilson.
Christopher Cross
“Arthur’s Theme (Best That You Can Do),” “Ride like the Wind,” “Sailing”
Cross won five Grammy Awards in 1981, including Best New Artist, Album of the Year, and Best Song and Record of the Year (for “Sailing”).
Culture Club*
“Boy George” O’Dowd, Michael Craig, Roy Hay, Jon Moss
“Church of the Poison Mind,” “Do You Really Want to Hurt Me?,” “Karma Chameleon”
Before forming the group that would become Culture Club, Boy George was briefly a member of the band Bow Wow Wow.
Def Leppard
Joe Elliott, Rick Allen, Steve Clark, Phil Collen, Rick Savage
“Armageddon It,” “Love Bites,” “Pour Some Sugar on Me”
Drummer Allen played on a specially created drum kit after losing his left arm in a 1984 car accident; guitarist Clark died of an alcohol-related illness in 1991.
Dire Straits
Mark Knopfler, Alan Clark, Guy Fletcher, John Illsley, David Knopfler, Terry Williams, Pick Withers
“Money for Nothing,” “Romeo and Juliet,” “Walk of Life”
Dire Straits’s 1985 hit “Money for Nothing” became famous for the high-rotation screening of its computer-animated video on MTV and notorious for the appearance of the word “faggot” in one of the verses.
Thomas Dolby
“Hyperactive,” “I Scare Myself,” “She Blinded Me with Science”
Contrary to rumor, Dolby was not born in Cairo, Egypt, but in London, England.
Duran Duran*
Simon LeBon, Nick Rhodes, Andy Taylor, John Taylor, Roger Taylor
“Hungry like the Wolf,” “The Reflex,” “Rio”
The unrelated Andy and John Taylor were also members of the “supergroup” Power Station.
Gloria Estefan & Miami Sound Machine
Gloria Estefan, Emilio Estefan, Juan Avila, Enrique Garcia
“Anything for You,” “1-2-3,” “Falling in Love (Uh-Oh)”
Estefan’s father was a bodyguard for the former Cuban president Fulgencio Batista.
Eurythmics
Annie Lennox, Dave Stewart
“Sweet Dreams (Are Made of This),” “Here Comes the Rain Again,” “Would I Lie to You?”
The husband of Bananarama’s Siobhan Fahey, Stewart has produced recordings for Bob Dylan, Tom Petty, Mick Jagger, and the Ramones.
Fleetwood Mac
Lindsey Buckingham, Mick Fleetwood, John McVie, Christine McVie, Stevie Nicks, Billy Burnette, Rick Vito
“Everywhere,” “Hold Me,” “Little Lies”
Fleetwood Mac’s “Don’t Stop” was used as the theme of Bill Clinton’s 1992 presidential campaign.
Foreigner
Lou Gramm, Dennis Elliott, Mick Jones, Rick Wills
“I Want to Know What Love Is,” “Urgent,” “Waiting for a Girl like You”
Two of Foreigner’s biggest hits feature guest contributions: the saxophonist Junior Walker on “Urgent” and the New Jersey Mass Choir on “I Want to Know What Love Is.”
Samantha Fox
“I Wanna Have Some Fun,” “Naughty Girls (Need Love Too),” “Touch Me (I Want Your Body)”
Fox first came to fame as a topless model in England.
Aretha Franklin
“Freeway of Love,” “I Knew You Were Waiting (for Me),” “Sisters Are Doin’ It for Themselves”
During the 1980’s, Franklin recorded hit duets with the Eurythmics, George Michael, and Elton John.
Glenn Frey
“The Heat Is On,” “Smuggler’s Blues,” “You Belong to the City”
A member of the Eagles, Frey appeared as the owner of a professional football team in the 1996 film Jerry Maguire.
Peter Gabriel
“Big Time,” “Shock the Monkey,” “Sledgehammer”
Gabriel, the original lead singer of Genesis, titled each of his first three solo albums simply Peter Gabriel.
The J. Geils Band
Peter Wolf, Jerome Geils, Stephen Jo Bladd, Seth Justman, Danny Klein, Magic Dick Salwitz
“Centerfold,” “Freeze-Frame,” “Love Stinks”
Lead singer Wolf was married to actress Faye Dunaway from 1974 to 1979.
Genesis
Tony Banks, Phil Collins, Mike Rutherford
“That’s All,” “Tonight, Tonight, Tonight,” “Invisible Touch”
Rutherford also scored several 1980’s hits as the leader of Mike + the Mechanics.
Debbie Gibson
“Foolish Beat,” “Lost in Your Eyes,” “Only in My Dreams”
Gibson won $1,000 in a songwriting contest with a song that she had written at age twelve, “I Come from America.”
The Go-Go’s*
Belinda Carlisle, Charlotte Caffey, Gina Schock, Kathy Valentine, Jane Wiedlin
“Our Lips Are Sealed,” “Vacation,” “We Got the Beat”
The Go-Go’s portrayed an all-male dance band in their 1984 “Turn to You” video.
Guns n’ Roses*
Axl Rose, Saul “Slash” Hudson, Izzy Stradlin, Steven Adler, Duff McKagan
“Paradise City,” “Sweet Child o’ Mine,” “Welcome to the Jungle”
Guns n’ Roses’ 1993 album The Spaghetti Incident? consists of covers of the band’s favorite punk songs.
Hall & Oates
“Kiss on My List,” “Maneater,” “Out of Touch”
By 1984, Daryl Hall and John Oates had eclipsed the Everly Brothers as the most successful duo in rock history.
Heart
Ann Wilson, Nancy Wilson, Mark Andes, Denny Carmassi, Howard Leese
“All I Wanna Do Is Make Love to You,” “Never,” “These Dreams”
Nancy Wilson is married to the journalist, author, and filmmaker Cameron Crowe.
Don Henley
“The Boys of Summer,” “Dirty Laundry,” “The End of the Innocence”
Along with fellow members of the Eagles, Henley was inducted into the Rock and Roll Hall of Fame in 1998.
Bruce Hornsby & the Range
Bruce Hornsby, David Mansfield, George Marinelli, John Molo, Joe Puerta
“Mandolin Rain,” “The Valley Road,” “The Way It Is”
During the 1990’s Hornsby occasionally played keyboards on tour with the Grateful Dead.
Whitney Houston*
“How Will I Know,” “I Wanna Dance with Somebody (Who Loves Me),” “Saving All My Love for You”
Houston’s 1985 hit “The Greatest Love of All” was originally a hit for George Benson and the theme song to the Muhammad Ali biopic The Greatest in 1977.
The Human League
Philip Oakey, Joanne Catherall, Suzanne Sulley
“Don’t You Want Me,” “(Keep Feeling) Fascination,” “Human”
Before breaking through in the United States, the Human League had enjoyed a string of hits, including “Boys and Girls” and “Love Action,” in England.
Billy Idol
“Dancing with Myself,” “Rebel Yell,” “White Wedding”
Idol was the lead singer of the English punk band Generation X.
INXS
Michael Hutchence, Garry Beers, Andrew Farriss, Jon Farriss, Tim Farriss, Kirk Pengilly
“Devil Inside,” “Need You Tonight,” “New Sensation”
The cue-card-tossing sequence of INXS’s 1987 video “Need You Tonight/Mediate” was a take-off on “Subterranean Homesick Blues,” the opening sequence of Don’t Look Back, a 1967 documentary about Bob Dylan.
Janet Jackson
“Control,” “Miss You Much,” “Nasty”
Jackson acted in the television situation comedies Good Times and Diff’rent Strokes from 1977 to 1982.
Michael Jackson*
“Beat It,” “Billie Jean,” “Wanna Be Startin’ Somethin’”
Jackson’s 1982 album Thriller is the second-biggest-selling album of all time.
Joan Jett and the Blackhearts
Joan Jett, Ricky Byrd, Lee Crystal, Thommy Price, Gary Ryan
“Crimson and Clover,” “I Hate Myself for Loving You,” “I Love Rock ’n Roll”
Jett costarred with Michael J. Fox, Michael McKean, and Gena Rowlands in the 1987 film Light of Day.
Billy Joel
“Tell Her About It,” “Uptown Girl,” “We Didn’t Start the Fire”
Joel played piano on the Shangri-Las’ 1964 hit “Leader of the Pack.”
Elton John
“I Don’t Want to Go on with You like That,” “I Guess That’s Why They Call It the Blues,” “I’m Still Standing”
Originally a hit in the 1970’s as a tribute to Marilyn Monroe, John’s “Candle in the Wind” would become an even bigger hit in 1997 when rewritten as a tribute to the late Princess Diana.
Journey*
Steve Perry, Jonathan Cain, Gregg Rolie, Neal Schon, Steve Smith, Ross Valory
“Don’t Stop Believin’,” “Open Arms,” “Who’s Crying Now”
Schon and Rolie first played together as members of Santana; Cain, who replaced Rolie in 1981, was also a member of The Babys and Bad English.
Kool & the Gang
James “J. T.” Taylor, Robert “Kool” Bell, Ronald Bell, George Brown, Robert Mickens, Claydes Smith, Dennis Thomas, Earl Toon
“Celebration,” “Cherish,” “Misled”
Kool & the Gang’s “Celebration” was used as the theme song of the 2004 Democratic Convention.
Cyndi Lauper*
“Girls Just Want to Have Fun,” “Time After Time,” “True Colors”
Lauper’s “True Colors” was featured prominently in a 1980’s Kodak advertisement; a version by Kasey Chambers was used as the theme song of the 2003 Rugby World Cup.
John Lennon*
“(Just like) Starting Over,” “Watching the Wheels,” “Woman”
Lennon, a former Beatle, was assassinated on December 8, 1980, in New York City.
Huey Lewis & the News
Huey Lewis, Johnny Colla, Mario Cipollina, Bill Gibson, Chris Hayes, Sean Hopper
“Do You Believe in Love,” “I Want a New Drug,” “The Power of Love”
Lewis played harmonica on 1970’s solo albums by the pub-rock pioneers Nick Lowe and Dave Edmunds.
Kenny Loggins
“Danger Zone,” “Footloose,” “I’m Alright”
During the 1970’s, Loggins was one half of the hit duo Loggins & Messina.
Loverboy
Mike Reno, Paul Dean, Matt Frenette, Doug Johnson, Scott Smith
“Hot Girls in Love,” “Lovin’ Every Minute of It,” “Working for the Weekend”
Loverboy is one of the biggestselling Canadian bands of all time.
Paul McCartney
“Coming Up (Live at Glasgow),” “Ebony and Ivory,” “No More Lonely Nights”
Beginning with 1997’s Standing Stone, McCartney composed and released a series of orchestral albums that became classical music best-sellers.
Madonna*
“Into the Groove,” “Like a Prayer,” “Live to Tell”
Madonna was the top female singles artist of the 1980’s, with twenty top-forty hits, seventeen of which reached the top ten and seven of which reached number one.
Richard Marx
“Angelia,” “Hold on to the Nights,” “Right Here Waiting”
Prior to his solo success, Marx sang backup on albums by Lionel Richie, Chicago, Peabo Bryson, Teddy Pendergrass, and Julio Iglesias.
John Mellencamp*
“Hurts So Good,” “Jack and Diane,” “R.O.C.K. in the U.S.A. (A Salute to 60’s Rock)”
From 1976 to 1982, Mellencamp released albums under the name John Cougar, a pseudonym selected by his manager Tony DeFries.
Men at Work
Colin Hay, Greg Ham, John Rees, Jerry Speiser, Ron Strykert
“Down Under,” “It’s a Mistake,” “Who Can It Be Now?”
“Down Under” became famous for introducing Australian terms such as “crombie,” “chunder,” and “Vegemite sandwich” to the American pop charts.
George Michael*
“Faith,” “Father Figure,” “I Want Your Sex”
Michael performed “Somebody to Love” with the surviving members of Queen at the Freddie Mercury Tribute in 1992.
Milli Vanilli
Fabrice Morvan, Rob Pilatus (actually John Davis, Brad Howe, Charles Shaw)
“Blame It on the Rain,” “Girl I’m Gonna Miss You,” “Girl You Know It’s True”
Morvan and Pilatus became notorious for not singing on “their” recordings and having to return their 1989 Best New Artist Grammy Award. Pilatus died of a drug overdose in 1998.
The Moody Blues
Justin Hayward, John Lodge, Graeme Edge, Patrick Moraz, Ray Thomas
“I Know You’re Out There Somewhere,” “The Voice,” “Your Wildest Dreams”
In 1986 members of the British band Mood Six portrayed the young Moody Blues in the video for “Your Wildest Dreams.”
Mötley Crüe*
Vince Neil, Tommy Lee, Mick Mars, Nikki Sixx
“Dr. Feelgood,” “Girls, Girls, Girls,” “Smokin’ in the Boys Room”
Lee became tabloid fodder in the 1990’s because of his marriages to—and divorces from—actresses Heather Locklear and Pamela Anderson.
Willie Nelson
“Always on My Mind,” “On the Road Again,” “To All the Girls I’ve Loved Before”
In the 1980’s and 1990’s, Nelson, Johnny Cash, Waylon Jennings, and Kris Kristofferson were members of The Highwaymen, a country music band.
New Kids on the Block
Jon Knight, Jordan Knight, Joey McIntyre, Donnie Wahlberg, Danny Wood
“Hangin’ Tough,” “This One’s for the Children,” “You Got It (The Right Stuff)”
New Kids on the Block’s 1989 album Hangin’ Tough sold more than eight million copies and included five top-ten hits.
Olivia Newton-John
“Heart Attack,” “Physical,” “Suddenly”
Newton-John is the granddaughter of the Nobel Prize-winning physicist Max Born.
Stevie Nicks
“Edge of Seventeen (Just like a White Winged Dove),” “Leather and Lace,” “Stop Draggin’ My Heart Around”
As a member of Fleetwood Mac, Nicks was inducted into the Rock and Roll Hall of Fame in 1998.
Billy Ocean
“Caribbean Queen (No More Love on the Run),” “Get Outta My Dreams, Get into My Car,” “There’ll Be Sad Songs (to Make You Cry)”
Although considered a 1980’s artist, Ocean scored his first top-forty hit, “Love Really Hurts Without You,” in 1976.
Robert Palmer
“Addicted to Love,” “I Didn’t Mean to Turn You On,” “Simply Irresistible”
The trademark of Palmer’s popular 1980’s videos was the use of glamorous models as his backing band.
Ray Parker, Jr.
“Ghostbusters,” “I Still Can’t Get Over Loving You,” “The Other Woman”
In 1984, Parker was sued by Huey Lewis, who claimed that Parker’s “Ghostbusters” plagiarized Lewis’s “I Want a New Drug.”
Pet Shop Boys
Chris Lowe, Neil Tennant
“Always on My Mind,” “West End Girls,” “What Have I Done to Deserve This?”
Prior to his musical career, Tennant was an editor at Marvel Comics and Smash Hits magazine.
Tom Petty & the Heartbreakers
Mike Campbell, Howie Epstein, Stan Lynch, Tom Petty, Benmont Tench
“Free Fallin’,” “I Won’t Back Down,” “Refugee”
With Bob Dylan, George Harrison, Jeff Lynne, and Roy Orbison, Petty also was a member of the Traveling Wilburys.
Pointer Sisters
Anita Pointer, June Pointer, Ruth Pointer
“I’m So Excited,” “Jump,” “Neutron Dance”
During the 1970’s the Pointer Sisters became the first black female act to perform at the Grand Ole Opry.
The Police*
Gordon “Sting” Sumner, Stewart Copeland, Andy Summers
“Don’t Stand So Close to Me,” “Every Breath You Take,” “Every Little Thing She Does Is Magic”
Sumner acquired the nickname Sting because of his preference for wearing a yellow and black shirt in his early days as a musician.
Prince*
“Little Red Corvette,” “1999,” “When Doves Cry”
Under pseudonyms such as “Christopher” and “Alexander Nevermind,” Prince wrote hits for the Bangles, Sheena Easton, and other performers.
R.E.M.*
Michael Stipe, Peter Buck, Bill Berry, Mike Mills
“The One I Love,” “Radio Free Europe,” “Stand”
R.E.M. became figureheads of the Athens (Georgia) sound, whose popularity helped draw attention to other locality-specific underground acts, such as the Replacements (Minneapolis) and Nirvana (Seattle).
REO Speedwagon
Kevin Cronin, Neal Doughty, Alan Gratzer, Bruce Hall, Gary Richrath
“Can’t Fight This Feeling,” “Keep on Loving You,” “Live Every Moment”
REO Speedwagon took its name from a high-speed fire engine.
Lionel Richie*
The Rolling Stones
Sade
Bob Seger and the Silver Bullet Band
“All Night Long (All Night),” “Penny Lover,” “Stuck on You” Mick Jagger, Keith Richards, Ron Wood, Charlie Watts, Bill Wyman
Before launching his solo career in 1982, Richie was the lead singer of The Commodores.
“Emotional Rescue,” “Mixed The Rolling Stones’ 1981 hit Emotions,” “Start Me Up,” “Waiting on a Friend” featured the saxophone playing of jazz great Sonny Rollins. “Never as Good as the First Time,” “Smooth Operator,” “The Sweetest Taboo”
“Against the Wind,” “Like a Bob Seger, Drew Abbott, Chris Campbell, Craig Frost, Rock,” “Shakedown” Charlie Martin, Alto Reed, Robyn Robbins
Sade (pronounced “Sharday”) was born Helen Folasade Adu in Ibadan, Nigeria. Seger’s “Like a Rock” was used in a long-running series of advertisements for Chevrolet trucks.
Rick Springfield
“Don’t Talk to Strangers,” “I’ve Done Everything for You,” “Jessie’s Girl”
Besides being a teen idol in the 1970’s, Springfield also portrayed Dr. Noah Drake in the soap opera General Hospital.
Bruce Springsteen & the E-Street Band*
Bruce Springsteen, Roy Bittan, Clarence Clemons, Gary Tallent, “Miami Steve” Van Zandt, Max Weinberg
“Born in the U.S.A.,” “Hungry Heart,” “Glory Days”
In 1986 Springsteen’s forty-song Live/1975-1985 became the only boxed set to reach number one on Billboard’s album chart.
Starship
Mickey Thomas, Grace Slick, Don Baldwin, Craig Chaquico, Aynsley Dunbar, Pete Sears
“Nothing’s Gonna Stop Us Now,” “Sara,” “We Built This City”
Thomas’s first Top 40 appearance was as the lead singer on Elvin Bishop’s 1976 hit “Fooled Around and Fell in Love.”
Styx
Dennis DeYoung, Tommy Shaw, Chuck Panozzo, John Panozzo, James Young
“The Best of Times,” “Mr. Roboto,” “Too Much Time on My Hands”
Styx’s 1983 album Kilroy Was Here was an anticensorship concept album, inspired in part by accusations that the band was encoding evil messages into its songs.
Donna Summer
“Cold Love,” “She Works Hard for the Money,” “The Wanderer”
Summer revamped her 1970’s bad-girl image by publicizing her conversion to Christianity with the song “I Believe in Jesus” in 1980 and recording two albums with the Christian producer Michael Omartian in 1983 and 1984.
Tears for Fears
Roland Orzabal, Curt Smith
“Everybody Wants to Rule the World,” “Shout,” “Sowing the Seeds of Love”
Orzabal and Smith based their group’s name and some of their songs on the primal scream theories of psychotherapist Arthur Janov.
Thompson Twins
Tom Bailey, Alannah Currie, Joe Leeway
“Doctor! Doctor!,” “Hold Me Now,” “Lay Your Hands on Me”
The Thompson Twins named themselves after the detectives Thompson and Thompson in the European comic strip The Adventures of Tintin.
Toto
Bobby Kimball, David Hungate, Steve Lukather, David Paich, Jeff Porcaro, Steve Porcaro, Fergie Fredericksen
“Africa,” “Rosanna,” “Stranger in Town”
The real-life subject of “Rosanna” was the actress Rosanna Arquette.
Tina Turner*
“Break Every Rule,” “Private Dancer,” “What’s Love Got to Do with It”
Turner costarred with Mel Gibson in the 1985 film Mad Max Beyond Thunderdome.
U2
Paul “Bono” Hewson, Dave “The Edge” Evans, Adam Clayton, Larry Mullen, Jr.
“I Still Haven’t Found What I’m Looking For,” “Where the Streets Have No Name,” “With or Without You”
The briefcase with the lyrics for U2’s 1981 album October was stolen, requiring Bono to come up with lyrics in the studio. The briefcase was found and returned to Bono in 2004.
Van Halen*
David Lee Roth, Eddie Van Halen, Michael Anthony, Alex Van Halen, Sammy Hagar
“Jump,” “Panama,” “Why Can’t This Be Love”
Both Van Halen brothers were trained in classical music as children.
Steve Winwood
“Higher Love,” “Valerie,” “While You See a Chance”
As a teenager Winwood sang lead and played organ on the Spencer Davis Group’s 1967 hits “Gimme Some Lovin’” and “I’m a Man.”
“Weird Al” Yankovic*
“Eat It,” “I Love Rocky Road,” “I Want a New Duck”
Yankovic’s career as rock’s premier parodist was launched when his “My Bologna,” a parody of the Knack’s “My Sharona,” was played on the nationally syndicated Dr. Demento radio show in 1979.
ZZ Top
Billy Gibbons, Dusty Hill, Frank Beard
“Gimme All Your Lovin’,” “Legs,” “Sharp Dressed Man”
ZZ Top’s best-known videos were mini triumph-of-the-underdog narratives that featured the band members and a trio of glamorous women as fairy godparents.
Arsenio Orteza
■ Music: Grammy Awards
This list includes winners of Grammy Awards in major categories. “Album of the Year” awards the artist who performed the album. “Record of the Year” awards the producer and artist, while “Song of the Year” awards the songwriter. An asterisk (*) following a name or group indicates the presence of a full-length entry in The Eighties in America.
1980 Album of the Year: Christopher Cross, Christopher Cross Record of the Year: “Sailing,” Michael Omartian (producer), Christopher Cross (artist) Song of the Year: “Sailing,” Christopher Cross (songwriter and artist) Best New Artist: Christopher Cross Best Pop Vocal Performance, Female: “The Rose,” Bette Midler Best Pop Vocal Performance, Male: “This Is It,” Kenny Loggins Best Pop Performance by a Duo or Group with Vocal: “Guilty,” Barbra Streisand and Barry Gibb Best Rock Vocal Performance, Female: Crimes of Passion, Pat Benatar Best Rock Vocal Performance, Male: Glass Houses, Billy Joel Best Rock Performance by a Duo or Group with Vocal: Against the Wind, Bob Seger and the Silver Bullet Band Best R&B Vocal Performance, Female: “Never Knew Love Like This Before,” Stephanie Mills Best R&B Vocal Performance, Male: “Give Me the Night,” George Benson Best R&B Performance by a Duo or Group with Vocal: “Shining Star,” Manhattans Best R&B Song: “Never Knew Love Like This Before,” James Mtume and Reggie Lucas (songwriters), Stephanie Mills (artist) Best Country Vocal Performance, Female: “Could I Have This Dance?,” Anne Murray Best Country Vocal Performance, Male: “He Stopped Loving Her Today,” George Jones Best Country Performance, Duo or Group: “That Lovin’ You Feelin’ Again,” Emmylou Harris and Roy Orbison Best Country Song: “On the Road Again,” Willie Nelson (songwriter and artist) Best Jazz Fusion Performance, Instrumental or Vocal: “Birdland,” The Manhattan Transfer Best Jazz Vocal Performance, Female: A Perfect Match: Ella and Basie, Ella Fitzgerald
Best Jazz Vocal Performance, Male: “Moody’s Mood,” George Benson Best Jazz Instrumental Performance, Soloist: I Will Never Say Goodbye, Bill Evans Best Jazz Instrumental Performance, Group: We Will Meet Again, Bill Evans Best Jazz Instrumental Performance, Big Band: On the Road Again, Count Basie
1981 Album of the Year: Double Fantasy, Jack Douglas, John Lennon*, and Yoko Ono (producers), John Lennon* and Yoko Ono (artists) Record of the Year: “Bette Davis Eyes,” Val Garay (producer), Kim Carnes (artist) Song of the Year: “Bette Davis Eyes,” Donna Weiss and Jackie DeShannon (songwriters), Kim Carnes (artist) Best New Artist: Sheena Easton Best Pop Vocal Performance, Female: Lena Horne: The Lady and Her Music, Lena Horne (artist) Best Pop Vocal Performance, Male: Breakin’ Away, Al Jarreau Best Pop Performance by a Duo or Group with Vocal: “Boy from New York City,” The Manhattan Transfer Best Rock Vocal Performance, Female: “Fire and Ice,” Pat Benatar Best Rock Vocal Performance, Male: “Jessie’s Girl,” Rick Springfield Best Rock Performance by a Duo or Group with Vocal: “Don’t Stand So Close to Me,” The Police Best R&B Vocal Performance, Female: “Hold On, I’m Comin’,” Aretha Franklin Best R&B Vocal Performance, Male: “One Hundred Ways,” James Ingram Best R&B Performance by a Duo or Group with Vocal: The Dude, Quincy Jones Best R&B Song: “Just the Two of Us,” Bill Withers, Ralph MacDonald, and William Salter (songwriters), Grover Washington, Jr., and Bill Withers (artists)
Best Country Vocal Performance, Female: “9 to 5,” Dolly Parton Best Country Vocal Performance, Male: “(There’s) No Getting Over Me,” Ronnie Milsap Best Country Performance, Duo or Group: “Elvira,” The Oak Ridge Boys Best Country Song: “9 to 5,” Dolly Parton (songwriter and artist) Best Jazz Fusion Performance, Instrumental or Vocal: Winelight, Grover Washington, Jr. Best Jazz Vocal Performance, Female: Digital III at Montreux, Ella Fitzgerald Best Jazz Vocal Performance, Male: “Blue Rondo a la Turk,” Al Jarreau Best Jazz Vocal Performance, Duo or Group: “Until I Met You (Corner Pocket),” The Manhattan Transfer Best Jazz Instrumental Performance, Soloist: Bye Bye Blackbird, John Coltrane Best Jazz Instrumental Performance, Group: Chick Corea and Gary Burton in Concert, Zurich, October 28, 1979, Chick Corea and Gary Burton Best Jazz Instrumental Performance, Big Band: Walk on the Water, Gerry Mulligan Video of the Year: Michael Nesmith in Elephant Parts, Michael Nesmith
1982 Album of the Year: Toto IV, Toto (producer and artist) Record of the Year: “Rosanna,” Toto (producer and artist) Song of the Year: “Always on My Mind,” Johnny Christopher, Mark James, and Wayne Carson (songwriters), Willie Nelson (artist) Best New Artist: Men at Work Best Pop Vocal Performance, Female: “You Should Hear How She Talks About You,” Melissa Manchester Best Pop Vocal Performance, Male: “Truly,” Lionel Richie* Best Pop Performance by a Duo or Group with Vocal: “Up Where We Belong,” Joe Cocker and Jennifer Warnes (artists) Best Rock Vocal Performance, Female: “Shadows of the Night,” Pat Benatar Best Rock Vocal Performance, Male: “Hurts So Good,” John Cougar Mellencamp* Best Rock Performance by a Duo or Group with Vocal: “Eye of the Tiger,” Survivor
Best R&B Vocal Performance, Female: “And I Am Telling You I’m Not Going,” Jennifer Holliday Best R&B Vocal Performance, Male: “Sexual Healing,” Marvin Gaye Best R&B Performance by a Duo or Group with Vocal (tie): “Let It Whip,” Dazz Band, and “Wanna Be with You,” Earth, Wind, & Fire Best R&B Song: “Turn Your Love Around,” Bill Champlin, Jay Graydon, and Steve Lukather (songwriters), George Benson (artist) Best Country Vocal Performance, Female: “Break It to Me Gently,” Juice Newton Best Country Vocal Performance, Male: “Always on My Mind,” Willie Nelson Best Country Performance, Duo or Group: Mountain Music, Alabama Best Country Song: “Always on My Mind,” Johnny Christopher, Mark James, and Wayne Carson (songwriters), Willie Nelson (artist) Best Jazz Fusion Performance, Instrumental or Vocal: Offramp, Pat Metheny Best Jazz Vocal Performance, Female: Gershwin Live!, Sarah Vaughan Best Jazz Vocal Performance, Male: An Evening with George Shearing and Mel Tormé, Mel Tormé Best Jazz Vocal Performance, Duo or Group: “Route 66,” The Manhattan Transfer Best Jazz Instrumental Performance, Soloist: We Want Miles, Miles Davis Best Jazz Instrumental Performance, Group: More Live, Phil Woods Quartet Best Jazz Instrumental Performance, Big Band: Warm Breeze, Count Basie Video of the Year: Physical, Olivia Newton-John
1983 Album of the Year: Thriller, Michael Jackson* and Quincy Jones (producers), Michael Jackson* (artist) Record of the Year: “Beat It,” Michael Jackson* and Quincy Jones (producers), Michael Jackson* (artist) Song of the Year: “Every Breath You Take,” Sting* (songwriter), The Police (artists) Best New Artist: Culture Club* Best Pop Vocal Performance, Female: “Flashdance: What a Feeling,” Irene Cara Best Pop Vocal Performance, Male: Thriller, Michael Jackson*
Best Pop Performance by a Duo or Group with Vocal: “Every Breath You Take,” The Police Best Rock Vocal Performance, Female: “Love Is a Battlefield,” Pat Benatar Best Rock Vocal Performance, Male: “Beat It,” Michael Jackson* Best Rock Performance by a Duo or Group with Vocal: Synchronicity, The Police Best R&B Vocal Performance, Female: Chaka Khan, Chaka Khan Best R&B Vocal Performance, Male: “Billie Jean,” Michael Jackson* Best R&B Performance by a Duo or Group with Vocal: “Ain’t Nobody,” Rufus and Chaka Khan Best R&B Song: “Billie Jean,” Michael Jackson* (songwriter and artist) Best Country Vocal Performance, Female: “A Little Good News,” Anne Murray Best Country Vocal Performance, Male: “I.O.U.,” Lee Greenwood Best Country Performance, Duo or Group: The Closer You Get, Alabama Best Country Song: “Stranger in My House,” Mike Reid (songwriter), Ronnie Milsap (artist) Best Jazz Fusion Performance, Instrumental or Vocal: Travels, Pat Metheny Best Jazz Vocal Performance, Female: The Best Is Yet to Come, Ella Fitzgerald Best Jazz Vocal Performance, Male: Top Drawer, Mel Tormé Best Jazz Vocal Performance, Duo or Group: “Why Not!,” The Manhattan Transfer Best Jazz Instrumental Performance, Soloist: Think of One, Wynton Marsalis Best Jazz Instrumental Performance, Group: At the Vanguard, Phil Woods Quartet Best Jazz Instrumental Performance, Big Band: All in Good Time, Rob McConnell and the Boss Brass Best Video, Short Form: “Girls on Film/Hungry Like the Wolf,” Duran Duran* Best Video Album: Duran Duran, Duran Duran*
1984 Album of the Year: Can’t Slow Down, James Anthony Carmichael and Lionel Richie* (producers), Lionel Richie* (artist) Record of the Year: “What’s Love Got to Do with It,” Terry Britten (producer), Tina Turner* (artist)
Song of the Year: “What’s Love Got to Do with It,” Graham Lyle and Terry Britten (songwriters), Tina Turner (artist) Best New Artist: Cyndi Lauper* Best Pop Vocal Performance, Female: “What’s Love Got to Do With It,” Tina Turner* Best Pop Vocal Performance, Male: “Against All Odds (Take a Look at Me Now),” Phil Collins Best Pop Performance by a Duo or Group with Vocal: “Jump (For My Love),” The Pointer Sisters Best Rock Vocal Performance, Female: “Better Be Good to Me,” Tina Turner* Best Rock Vocal Performance, Male: “Dancing in the Dark,” Bruce Springsteen* Best Rock Performance by a Duo or Group with Vocal: Purple Rain: Music from the Motion Picture, Prince* and the Revolution Best R&B Vocal Performance, Female: “I Feel for You,” Chaka Khan Best R&B Vocal Performance, Male: “Caribbean Queen (No More Love on the Run),” Billy Ocean Best R&B Performance by a Duo or Group with Vocal: “Yah Mo B There,” James Ingram and Michael McDonald Best R&B Song: “I Feel for You,” Prince* (songwriter), Chaka Khan (artist) Best Country Vocal Performance, Female: “In My Dreams,” Emmylou Harris Best Country Vocal Performance, Male: “That’s the Way Love Goes,” Merle Haggard Best Country Performance, Duo or Group: “Mama, He’s Crazy,” The Judds Best Country Song: “City of New Orleans,” Steve Goodman (songwriter), Willie Nelson (artist) Best Jazz Fusion Performance, Instrumental or Vocal: First Circle, Pat Metheny Group Best Jazz Vocal Performance: Nothin’ but the Blues, Joe Williams Best Jazz Instrumental Performance, Soloist: Hot House Flowers, Wynton Marsalis Best Jazz Instrumental Performance, Group: New York Scene, Art Blakey and the Jazz Messengers Best Jazz Instrumental Performance, Big Band: 88 Basie Street, Count Basie Best Video, Short Form: “David Bowie,” David Bowie Best Video Album: Making Michael Jackson’s “Thriller,” Michael Jackson*
1985 Album of the Year: No Jacket Required, Hugh Padgham and Phil Collins (producers), Phil Collins (artist) Record of the Year: “We Are the World,” Quincy Jones (producer) and USA for Africa* (artist) Song of the Year: “We Are the World,” Lionel Richie* and Michael Jackson* (songwriters), USA for Africa* (artist) Best New Artist: Sade Best Pop Vocal Performance, Female: “Saving All My Love for You,” Whitney Houston* Best Pop Vocal Performance, Male: No Jacket Required, Phil Collins Best Pop Performance by a Duo or Group with Vocal: “We Are the World,” Quincy Jones (producer), USA for Africa* (artist) Best Rock Vocal Performance, Female: “One of the Living,” Tina Turner* Best Rock Vocal Performance, Male: “The Boys of Summer,” Don Henley Best Rock Performance by a Duo or Group with Vocal: “Money for Nothing,” Dire Straits Best R&B Vocal Performance, Female: “Freeway of Love,” Aretha Franklin Best R&B Vocal Performance, Male: In Square Circle, Stevie Wonder Best R&B Performance by a Duo or Group with Vocal: “Nightshift,” Commodores Best R&B Song: “Freeway of Love,” Jeffrey Cohen and Narada Michael Walden (songwriters), Aretha Franklin (artist) Best Country Vocal Performance, Female: “I Don’t Know Why You Don’t Want Me,” Rosanne Cash Best Country Vocal Performance, Male: “Lost in the Fifties Tonight (In the Still of the Night),” Ronnie Milsap Best Country Performance, Duo or Group: Why Not Me, The Judds Best Country Song: “Highwayman,” Jimmy L. Webb (songwriter), Waylon Jennings, Willie Nelson, Johnny Cash, and Kris Kristofferson (artists) Best Jazz Fusion Performance, Instrumental or Vocal: Straight to the Heart, David Sanborn Best Jazz Vocal Performance, Female: Cleo Laine at Carnegie: The Tenth Anniversary Concert, Cleo Laine
Best Jazz Vocal Performance, Male: “Another Night in Tunisia,” Bobby McFerrin and Jon Hendricks Best Jazz Vocal Performance, Duo or Group: Vocalese, The Manhattan Transfer Best Jazz Instrumental Performance, Soloist: Black Codes from the Underground, Wynton Marsalis Best Jazz Instrumental Performance, Group: Black Codes from the Underground, Wynton Marsalis Group Best Jazz Instrumental Performance, Big Band: The Cotton Club: Original Motion Picture Soundtrack, Bob Wilber and John Barry Best Music Video, Short Form: “We Are the World: The Video Event,” Tom Trbovich (director), Quincy Jones (producer), USA for Africa* (artist) Best Music Video, Long Form: Huey Lewis and the News: The Heart of Rock and Roll, Bruce Gowers (director), Huey Lewis and the News (artist)
1986 Album of the Year: Graceland, Paul Simon (producer and artist) Record of the Year: “Higher Love,” Russ Titelman and Steve Winwood (producers), Steve Winwood (artist) Song of the Year: “That’s What Friends Are For,” Burt Bacharach and Carole Bayer Sager (songwriters), Dionne Warwick, Elton John, Gladys Knight, and Stevie Wonder (artists) Best New Artist: Bruce Hornsby and the Range Best Pop Vocal Performance, Female: The Broadway Album, Barbra Streisand Best Pop Vocal Performance, Male: “Higher Love,” Steve Winwood Best Pop Performance by a Duo or Group with Vocal: “That’s What Friends Are For,” Dionne Warwick, Elton John, Gladys Knight, and Stevie Wonder (artists) Best Rock Vocal Performance, Female: “Back Where You Started,” Tina Turner* Best Rock Vocal Performance, Male: “Addicted to Love,” Robert Palmer Best Rock Performance by a Duo or Group with Vocal: “Missionary Man,” Eurythmics Best R&B Vocal Performance, Female: Rapture, Anita Baker Best R&B Vocal Performance, Male: “Living in America,” James Brown
Best R&B Performance by a Duo or Group with Vocal: “Kiss,” Prince* and the Revolution Best R&B Song: “Sweet Love,” Anita Baker, Gary Bias, and Louis A. Johnson (songwriters), Anita Baker (artist) Best Country Vocal Performance, Female: “Whoever’s in New England,” Reba McEntire Best Country Vocal Performance, Male: Lost in the Fifties Tonight, Ronnie Milsap Best Country Performance, Duo or Group: “Grandpa (Tell Me ’bout the Good Old Days),” The Judds Best Country Song: “Grandpa (Tell Me ’bout the Good Old Days),” Jamie O’Hara (songwriter), The Judds (artist) Best Jazz Fusion Performance, Instrumental or Vocal: Double Vision, David Sanborn and Bob James Best Jazz Vocal Performance, Female: Timeless, Diane Schuur Best Jazz Vocal Performance, Male: “’Round Midnight,” Bobby McFerrin Best Jazz Vocal Performance, Duo or Group: “Free Fall,” 2 + 2 Plus Best Jazz Instrumental Performance, Soloist: Tutu, Miles Davis Best Jazz Instrumental Performance, Group: J Mood, Wynton Marsalis Best Jazz Instrumental Performance, Big Band: The Tonight Show Band with Doc Severinsen, The Tonight Show Band with Doc Severinsen Best Music Video, Short Form: “Brothers in Arms,” Dire Straits Best Music Video, Long Form: Bring on the Night, Michael Apted (director), Sting* (producer and artist)
1987
Album of the Year: The Joshua Tree, Brian Eno and Daniel Lanois (producers), U2* (artist)
Record of the Year: “Graceland,” Paul Simon (producer and artist)
Song of the Year: “Somewhere Out There,” Barry Mann, Cynthia Weil, and James Horner (songwriters), Linda Ronstadt and James Ingram (artists)
Best New Artist: Jody Watley
Best Pop Vocal Performance, Female: “I Wanna Dance with Somebody (Who Loves Me),” Whitney Houston*
Best Pop Vocal Performance, Male: Bring on the Night, Sting*
Best Pop Performance by a Duo or Group with Vocal: “(I’ve Had) The Time of My Life,” Jennifer Warnes and Bill Medley
Best Rock Vocal Performance, Solo: Tunnel of Love, Bruce Springsteen*
Best Rock Performance by a Duo or Group with Vocal: The Joshua Tree, U2*
Best R&B Vocal Performance, Female: Aretha, Aretha Franklin
Best R&B Vocal Performance, Male: “Just to See Her,” Smokey Robinson
Best R&B Performance by a Duo or Group with Vocal: “I Knew You Were Waiting (for Me),” Aretha Franklin and George Michael*
Best R&B Song: “Lean on Me,” Bill Withers (songwriter), Club Nouveau (artist)
Best Country Vocal Performance, Female: “’80’s Ladies,” K. T. Oslin
Best Country Vocal Performance, Male: Always and Forever, Randy Travis
Best Country Performance, Duo or Group: Trio, Dolly Parton, Linda Ronstadt, and Emmylou Harris
Best Country Performance, Duet: “Make No Mistake, She’s Mine,” Ronnie Milsap and Kenny Rogers
Best Country Song: “Forever and Ever, Amen,” Don Schlitz and Paul Overstreet (songwriters), Randy Travis (artist)
Best Jazz Fusion Performance, Instrumental or Vocal: Still Life (Talking), Pat Metheny Group
Best Jazz Vocal Performance, Female: Diane Schuur and the Count Basie Orchestra, Diane Schuur
Best Jazz Vocal Performance, Male: “What Is This Thing Called Love,” Bobby McFerrin
Best Jazz Instrumental Performance, Soloist: The Other Side of Round Midnight, Dexter Gordon
Best Jazz Instrumental Performance, Group: Marsalis Standard Time, Volume I, Wynton Marsalis
Best Jazz Instrumental Performance, Big Band: Digital Duke, Mercer Ellington
Best Performance, Music Video: The Prince’s Trust All-Star Rock Concert, Anthony Eaton (producer), various artists
Best Concept, Music Video: “Land of Confusion,” Jim Yukich and John Lloyd (directors), Jon Blair (producer), Genesis (artist)
1988
Album of the Year: Faith, George Michael* (producer and artist)
Record of the Year: “Don’t Worry, Be Happy,” Linda Goldstein (producer), Bobby McFerrin (artist)
Song of the Year: “Don’t Worry, Be Happy,” Bobby McFerrin (songwriter and artist)
Best New Artist: Tracy Chapman
Best Pop Vocal Performance, Female: “Fast Car,” Tracy Chapman
Best Pop Vocal Performance, Male: “Don’t Worry, Be Happy,” Bobby McFerrin
Best Pop Performance by a Duo or Group with Vocal: Brasil, The Manhattan Transfer
Best Rock Vocal Performance, Female: Tina Live in Europe, Tina Turner*
Best Rock Vocal Performance, Male: “Simply Irresistible,” Robert Palmer
Best Rock Performance by a Duo or Group with Vocal: “Desire,” U2*
Best Hard Rock/Metal Performance, Vocal or Instrumental: Crest of a Knave, Jethro Tull
Best Rap Performance: “Parents Just Don’t Understand,” DJ Jazzy Jeff and the Fresh Prince
Best R&B Vocal Performance, Female: “Giving You the Best That I Got,” Anita Baker
Best R&B Vocal Performance, Male: Introducing the Hardline According to Terence Trent D’Arby, Terence Trent D’Arby
Best R&B Performance by a Duo or Group with Vocal: “Love Overboard,” Gladys Knight and the Pips
Best R&B Song: “Giving You the Best That I Got,” Anita Baker, Randy Holland, and Skip Scarborough (songwriters) and Anita Baker (artist)
Best Country Vocal Performance, Female: “Hold Me,” K. T. Oslin
Best Country Vocal Performance, Male: Old 8 × 10, Randy Travis
Best Country Performance, Duo or Group: “Give a Little Love,” The Judds
Best Country Vocal Collaboration: “Crying,” Roy Orbison and k. d. lang
Best Country Song: “Hold Me,” K. T. Oslin (songwriter and artist)
Best Jazz Fusion Performance, Instrumental or Vocal: Politics, Yellowjackets
Best Jazz Vocal Performance, Female: Look What I Got!, Betty Carter
Best Jazz Vocal Performance, Male: “Brothers,” Bobby McFerrin
Best Jazz Vocal Performance, Duo or Group: “Spread Love,” Take 6
Best Jazz Instrumental Performance, Soloist: Don’t Try This at Home, Michael Brecker
Best Jazz Instrumental Performance, Group: Blues for Coltrane: A Tribute to John Coltrane, Cecil McBee, David Murray, McCoy Tyner, Pharoah Sanders, and Roy Haynes
Best Jazz Instrumental Performance, Big Band: Bud and Bird, Gil Evans and the Monday Night Orchestra
Best Performance, Music Video: “Where the Streets Have No Name,” Meiert Avis (director), Ben Dossett and Michael Hamlyn (producers), U2* (artists)
Best Concept, Music Video: “Fat,” Jay Levey (director), Susan Zwerman (producer), Weird Al Yankovic* (artist)
1989
Album of the Year: Nick of Time, Don Was (producer), Bonnie Raitt (artist)
Record of the Year: “Wind Beneath My Wings,” Arif Mardin (producer), Bette Midler (artist)
Song of the Year: “Wind Beneath My Wings,” Jeff Silbar and Larry Henley (songwriters) and Bette Midler (artist)
Best New Artist: Milli Vanilli (revoked)
Best Pop Vocal Performance, Female: “Nick of Time,” Bonnie Raitt
Best Pop Vocal Performance, Male: “How Am I Supposed to Live Without You,” Michael Bolton
Best Pop Performance by a Duo or Group with Vocal: “Don’t Know Much,” Linda Ronstadt and Aaron Neville
Best Rock Vocal Performance, Female: Nick of Time, Bonnie Raitt
Best Rock Vocal Performance, Male: The End of the Innocence, Don Henley
Best Rock Performance by a Duo or Group with Vocal: Traveling Wilburys, Volume I, Traveling Wilburys
Best Hard Rock Performance: “Cult of Personality,” Living Colour
Best Metal Performance: “One,” Metallica
Best Rap Performance: “Bust a Move,” Young MC
Best R&B Vocal Performance, Female: Giving You the Best That I Got, Anita Baker
Best R&B Vocal Performance, Male: “Every Little Step,” Bobby Brown
Best R&B Performance by a Duo or Group with Vocal: “Back to Life,” Soul II Soul featuring Caron Wheeler
Best R&B Song: “If You Don’t Know Me by Now,” Kenny Gamble and Leon Huff (songwriters) and Simply Red (artist)
Best Country Vocal Performance, Female: Absolute Torch and Twang, k. d. lang
Best Country Vocal Performance, Male: Lyle Lovett and His Large Band, Lyle Lovett
Best Country Vocal Performance, Duo or Group: Will the Circle Be Unbroken, Volume II, Nitty Gritty Dirt Band
Best Country Vocal Collaboration: “There’s a Tear in My Beer,” Hank Williams, Jr., and Hank Williams, Sr.
Best Country Song: “After All This Time,” Rodney Crowell (songwriter and artist)
Best Jazz Fusion Performance, Instrumental or Vocal: Letter from Home, Pat Metheny Group
Best Jazz Vocal Performance, Female: Blues on Broadway, Ruth Brown
Best Jazz Vocal Performance, Male: When Harry Met Sally, Harry Connick, Jr.
Best Jazz Vocal Performance, Duo or Group: “Makin’ Whoopee,” Dr. John and Rickie Lee Jones
Best Jazz Instrumental Performance, Soloist: Aura, Miles Davis
Best Jazz Instrumental Performance, Group: Chick Corea Akoustic Band, Chick Corea Akoustic Band
Best Jazz Instrumental Performance, Big Band: Aura, Miles Davis
Best Music Video, Short Form: “Leave Me Alone,” Jim Blashfield (director), Frank DiLeo, Jerry Kramer, Jim Blashfield, and Paul Diener (producers), Michael Jackson* (artist)
Best Music Video, Long Form: Rhythm Nation 1814, Dominic Sena, Jonathan Dayton, and Valerie Faris (directors), Aris McGarry, Jonathan Dayton, and Valerie Faris (producers), Janet Jackson (artist)
■ Sports: Winners of Major Events
Athletes whose names appear with an asterisk (*) are subjects of their own full-length essays within The Eighties in America.
Major League Baseball
World Series
1980: Philadelphia Phillies (National League) 4, Kansas City Royals (American League) 2
1981: Los Angeles Dodgers (NL) 4, New York Yankees (AL) 2
1982: St. Louis Cardinals (NL) 4, Milwaukee Brewers (AL) 3
1983: Baltimore Orioles (AL) 4, Philadelphia Phillies (NL) 1
1984: Detroit Tigers (AL) 4, San Diego Padres (NL) 1
1985: Kansas City Royals (AL) 4, St. Louis Cardinals (NL) 3
1986: New York Mets (NL) 4, Boston Red Sox (AL) 3
1987: Minnesota Twins (AL) 4, St. Louis Cardinals (NL) 3
1988: Los Angeles Dodgers (NL) 4, Oakland A’s (AL) 1
1989: Oakland A’s (AL) 4, San Francisco Giants (NL) 0
All-Star Games
1980: National League 4, American League 2
1981: National League 5, American League 4
1982: National League 4, American League 1
1983: American League 13, National League 3
1984: National League 3, American League 1
1985: National League 6, American League 1
1986: American League 3, National League 2
1987: National League 2, American League 0 (13 innings)
1988: American League 2, National League 1
1989: American League 5, National League 3
American League Most Valuable Players
1980: George Brett*, Kansas City Royals
1981: Rollie Fingers, Milwaukee Brewers
1982: Robin Yount, Milwaukee Brewers
1983: Cal Ripken, Jr., Baltimore Orioles
1984: Willie Hernandez, Detroit Tigers
1985: Don Mattingly, New York Yankees
1986: Roger Clemens, Boston Red Sox
1987: George Bell, Toronto Blue Jays
1988: Jose Canseco, Oakland A’s
1989: Robin Yount, Milwaukee Brewers
National League Most Valuable Players
1980: Mike Schmidt, Philadelphia Phillies
1981: Mike Schmidt, Philadelphia Phillies
1982: Dale Murphy, Atlanta Braves
1983: Dale Murphy, Atlanta Braves
1984: Ryne Sandberg, Chicago Cubs
1985: Willie McGee, St. Louis Cardinals
1986: Mike Schmidt, Philadelphia Phillies
1987: Andre Dawson, Chicago Cubs
1988: Kirk Gibson*, Los Angeles Dodgers
1989: Kevin Mitchell, San Francisco Giants
American League Rookies of the Year
1980: Joe Charboneau, Cleveland Indians
1981: Dave Righetti, New York Yankees
1982: Cal Ripken, Jr., Baltimore Orioles
1983: Ron Kittle, Chicago White Sox
1984: Alvin Davis, Seattle Mariners
1985: Ozzie Guillen, Chicago White Sox
1986: Jose Canseco, Oakland A’s
1987: Mark McGwire, Oakland A’s
1988: Walt Weiss, Oakland A’s
1989: Gregg Olson, Baltimore Orioles
National League Rookies of the Year
1980: Steve Howe, Los Angeles Dodgers
1981: Fernando Valenzuela*, Los Angeles Dodgers
1982: Steve Sax, Los Angeles Dodgers
1983: Darryl Strawberry, New York Mets
1984: Dwight Gooden, New York Mets
1985: Vince Coleman, St. Louis Cardinals
1986: Todd Worrell, St. Louis Cardinals
1987: Benito Santiago, San Diego Padres
1988: Chris Sabo, Cincinnati Reds
1989: Jerome Walton, Chicago Cubs
National Basketball Association (NBA) Championships
1980: Los Angeles Lakers 4, Philadelphia 76ers 2
1981: Boston Celtics 4, Houston Rockets 2
1982: Los Angeles Lakers 4, Philadelphia 76ers 2
1983: Philadelphia 76ers 4, Los Angeles Lakers 0
1984: Boston Celtics 4, Los Angeles Lakers 3
1985: Los Angeles Lakers 4, Boston Celtics 2
1986: Boston Celtics 4, Houston Rockets 2
1987: Los Angeles Lakers 4, Boston Celtics 2
1988: Los Angeles Lakers 4, Detroit Pistons 3
1989: Detroit Pistons 4, Los Angeles Lakers 0
NBA Most Valuable Players
1980: Kareem Abdul-Jabbar, Los Angeles Lakers
1981: Julius Erving, Philadelphia 76ers
1982: Moses Malone, Houston Rockets
1983: Moses Malone, Philadelphia 76ers
1984: Larry Bird*, Boston Celtics
1985: Larry Bird*, Boston Celtics
1986: Larry Bird*, Boston Celtics
1987: Magic Johnson*, Los Angeles Lakers
1988: Michael Jordan, Chicago Bulls
1989: Magic Johnson*, Los Angeles Lakers
NBA Rookies of the Year
1980: Larry Bird*, Boston Celtics
1981: Darrell Griffith, Utah Jazz
1982: Buck Williams, New Jersey Nets
1983: Terry Cummings, San Diego Clippers
1984: Ralph Sampson, Houston Rockets
1985: Michael Jordan, Chicago Bulls
1986: Patrick Ewing, New York Knicks
1987: Chuck Person, Indiana Pacers
1988: Mark Jackson, New York Knicks
1989: Mitch Richmond, Golden State Warriors
College Basketball
National Collegiate Athletic Association (NCAA) Championships
1980: Louisville 59, UCLA 54
1981: Indiana 63, North Carolina 50
1982: North Carolina 63, Georgetown 62
1983: North Carolina State 54, Houston 52
1984: Georgetown 84, Houston 75
1985: Villanova 66, Georgetown 64
1986: Louisville 72, Duke 69
1987: Indiana 74, Syracuse 73
1988: Kansas 83, Oklahoma 79
1989: Michigan 80, Seton Hall 79 (overtime)
National Invitational Tournament (NIT)
1980: Virginia 58, Minnesota 55
1981: Tulsa 86, Syracuse 84
1982: Bradley 67, Purdue 58
1983: Fresno State 69, DePaul 60
1984: Michigan 83, Notre Dame 63
1985: UCLA 65, Indiana 62
1986: Ohio State 73, Wyoming 63
1987: Southern Mississippi 84, La Salle 80
1988: Connecticut 72, Ohio State 67
1989: St. John’s 73, Saint Louis 65
Professional Football
National Football League (NFL) Championships
1980: Oakland Raiders 27, Philadelphia Eagles 10
1981: San Francisco 49ers 26, Cincinnati Bengals 21
1982: Washington Redskins 27, Miami Dolphins 17
1983: Los Angeles Raiders 38, Washington Redskins 9
1984: San Francisco 49ers 38, Miami Dolphins 16
1985: Chicago Bears 46, New England Patriots 10
1986: New York Giants 39, Denver Broncos 20
1987: Washington Redskins 42, Denver Broncos 10
1988: San Francisco 49ers 20, Cincinnati Bengals 16
1989: San Francisco 49ers 55, Denver Broncos 10

NFL Most Valuable Players
1980: Brian Sipe, Cleveland Browns
1981: Ken Anderson, Cincinnati Bengals
1982: Mark Moseley, Washington Redskins
1983: Joe Theismann, Washington Redskins
1984: Dan Marino, Miami Dolphins
1985: Marcus Allen, Los Angeles Raiders
1986: Lawrence Taylor*, New York Giants
1987: John Elway*, Denver Broncos
1988: Boomer Esiason, Cincinnati Bengals
1989: Joe Montana*, San Francisco 49ers
Canadian Football League (CFL) Grey Cup Winners
1980: Edmonton Eskimos 48, Hamilton Tiger-Cats 10
1981: Edmonton Eskimos 26, Ottawa Rough Riders 23
1982: Edmonton Eskimos 32, Toronto Argonauts 16
1983: Toronto Argonauts 18, British Columbia Lions 17
1984: Winnipeg Blue Bombers 47, Hamilton Tiger-Cats 17
1985: British Columbia Lions 37, Hamilton Tiger-Cats 24
1986: Hamilton Tiger-Cats 39, Edmonton Eskimos 15
1987: Edmonton Eskimos 38, Toronto Argonauts 36
1988: Winnipeg Blue Bombers 22, British Columbia Lions 21
1989: Saskatchewan Roughriders 43, Hamilton Tiger-Cats 40
College Football
Heisman Trophy Winners
1980: George Rogers, South Carolina
1981: Marcus Allen, University of Southern California
1982: Herschel Walker, Georgia
1983: Mike Rozier, Nebraska
1984: Doug Flutie, Boston College
1985: Bo Jackson*, Auburn
1986: Vinny Testaverde, Miami (Florida)
1987: Tim Brown, Notre Dame
1988: Barry Sanders, Oklahoma State
1989: Andre Ware, Houston
National Hockey League (NHL)
Stanley Cup Winners
1980: New York Islanders 4, Philadelphia Flyers 2
1981: New York Islanders 4, Minnesota North Stars 1
1982: New York Islanders 4, Vancouver Canucks 0
1983: New York Islanders 4, Edmonton Oilers 0
1984: Edmonton Oilers 4, New York Islanders 1
1985: Edmonton Oilers 4, Philadelphia Flyers 1
1986: Montreal Canadiens 4, Calgary Flames 1
1987: Edmonton Oilers 4, Philadelphia Flyers 3
1988: Edmonton Oilers 4, Boston Bruins 0
1989: Calgary Flames 4, Montreal Canadiens 2

Hart Memorial Trophy (NHL MVP)
1980: Wayne Gretzky*, Edmonton Oilers
1981: Wayne Gretzky*, Edmonton Oilers
1982: Wayne Gretzky*, Edmonton Oilers
1983: Wayne Gretzky*, Edmonton Oilers
1984: Wayne Gretzky*, Edmonton Oilers
1985: Wayne Gretzky*, Edmonton Oilers
1986: Wayne Gretzky*, Edmonton Oilers
1987: Wayne Gretzky*, Edmonton Oilers
1988: Mario Lemieux*, Pittsburgh Penguins
1989: Wayne Gretzky*, Los Angeles Kings
Boxing
World Heavyweight Champions
John Tate (October 20, 1979-March 31, 1980)
Mike Weaver (March 31, 1980-December 10, 1982)
Michael Dokes (December 10, 1982-September 23, 1983)
Gerrie Coetzee (September 23, 1983-December 1, 1984)
Greg Page (December 1, 1984-April 29, 1985)
Tony Tubbs (April 29, 1985-January 17, 1986)
Tim Witherspoon (January 17, 1986-December 12, 1986)
James Smith (December 12, 1986-March 7, 1987)
Mike Tyson* (March 7, 1987-February 11, 1990)
Auto Racing
Indianapolis 500 Winners
1980: Johnny Rutherford
1981: Bobby Unser
1982: Gordon Johncock
1983: Tom Sneva
1984: Rick Mears
1985: Danny Sullivan
1986: Bobby Rahal
1987: Al Unser
1988: Rick Mears
1989: Emerson Fittipaldi
Tennis
Major Tournament Champions

Men
Year   Australian Open        French Open           Wimbledon             U.S. Open
1980   Brian Teacher          Björn Borg            Björn Borg            John McEnroe*
1981   Johan Kriek            Björn Borg            John McEnroe*         John McEnroe*
1982   Johan Kriek            Mats Wilander         Jimmy Connors         Jimmy Connors
1983   Mats Wilander          Yannick Noah          John McEnroe*         Jimmy Connors
1984   Mats Wilander          Ivan Lendl            John McEnroe*         John McEnroe*
1985   Stefan Edberg          Mats Wilander         Boris Becker          Ivan Lendl
1986   no competition         Ivan Lendl            Boris Becker          Ivan Lendl
1987   Stefan Edberg          Ivan Lendl            Pat Cash              Ivan Lendl
1988   Mats Wilander          Mats Wilander         Stefan Edberg         Mats Wilander
1989   Ivan Lendl             Michael Chang         Boris Becker          Boris Becker

Women
Year   Australian Open        French Open           Wimbledon             U.S. Open
1980   Hana Mandlikova        Chris Evert           Evonne Goolagong      Chris Evert
1981   Martina Navratilova*   Hana Mandlikova       Chris Evert           Tracy Austin
1982   Chris Evert            Martina Navratilova*  Martina Navratilova*  Chris Evert
1983   Martina Navratilova*   Chris Evert           Martina Navratilova*  Martina Navratilova*
1984   Chris Evert            Martina Navratilova*  Martina Navratilova*  Martina Navratilova*
1985   Martina Navratilova*   Chris Evert           Martina Navratilova*  Hana Mandlikova
1986   no competition         Chris Evert           Martina Navratilova*  Martina Navratilova*
1987   Hana Mandlikova        Steffi Graf           Martina Navratilova*  Martina Navratilova*
1988   Steffi Graf            Steffi Graf           Steffi Graf           Steffi Graf
1989   Steffi Graf            Arantxa S. Vicario    Steffi Graf           Steffi Graf
Golf
Major Tournament Champions (Men)
Year   British Open         PGA Championship   The Masters        U.S. Open
1980   Tom Watson*          Jack Nicklaus      Seve Ballesteros   Jack Nicklaus
1981   Bill Rogers          Larry Nelson       Tom Watson*        David Graham
1982   Tom Watson*          Raymond Floyd      Craig Stadler      Tom Watson*
1983   Tom Watson*          Hal Sutton         Seve Ballesteros   Larry Nelson
1984   Seve Ballesteros     Lee Trevino        Ben Crenshaw       Fuzzy Zoeller
1985   Sandy Lyle           Hubert Green       Bernhard Langer    Andy North
1986   Greg Norman          Bob Tway           Jack Nicklaus      Raymond Floyd
1987   Nick Faldo           Larry Nelson       Larry Mize         Scott Simpson
1988   Seve Ballesteros     Jeff Sluman        Sandy Lyle         Curtis Strange
1989   Mark Calcavecchia    Payne Stewart      Nick Faldo         Curtis Strange

Major Tournament Champions (Women)
Year   U.S. Open                LPGA Championship
1980   Amy Alcott               Sally Little
1981   Pat Bradley              Donna Caponi
1982   Janet Alex               Jan Stephenson
1983   Jan Stephenson           Patty Sheehan
1984   Hollis Stacy             Patty Sheehan
1985   Kathy Baker Guadagnino   Nancy Lopez
1986   Jane Geddes              Pat Bradley
1987   Laura Davies             Jane Geddes
1988   Liselotte Neumann        Sherri Turner
1989   Betsy King               Nancy Lopez

PGA = Professional Golf Association; LPGA = Ladies Professional Golf Association
Horse Racing
Triple Crown Races
Year   Kentucky Derby    Preakness           Belmont Stakes
1980   Genuine Risk      Codex               Temperance Hill
1981   Pleasant Colony   Pleasant Colony     Summing
1982   Gato Del Sol      Aloma’s Ruler       Conquistador Cielo
1983   Sunny’s Halo      Deputed Testamony   Caveat
1984   Swale             Gate Dancer         Swale
1985   Spend a Buck      Tank’s Prospect     Creme Fraiche
1986   Ferdinand         Snow Chief          Danzig Connection
1987   Alysheba          Alysheba            Bet Twice
1988   Winning Colors    Risen Star          Risen Star
1989   Sunday Silence    Sunday Silence      Easy Goer
■ Time Line
Additional dates on legislation, U.S. Supreme Court cases, films, television shows, plays, literature, popular music, and sports can be found in other appendixes.
1980
International events: (Mar. 24) Archbishop Oscar Romero is killed by gunmen while celebrating mass in San Salvador; at his funeral six days later, forty-two people are killed amid gunfire and bombs. (Apr. 7) The United States severs diplomatic relations with Iran and imposes economic sanctions in response to Iran’s capture of fifty-two American hostages on November 4, 1979. (Apr. 15-Oct. 31) About 125,000 Cubans enter the United States, arriving in boats departing from Mariel Harbor; many of the exiles had been released from Cuban jails and mental health facilities. (Aug. 7-14) Lech Wałęsa leads the first of many strikes at the Gdansk shipyard in Poland; the strikes will spur the formation of Solidarity, an independent trade union. (Sept. 22) Saddam Hussein, prime minister of Iraq, invades Iran, setting off an eight-year war between the two countries. (Nov. 20) The Gang of Four, a group of Chinese Communist Party leaders, is tried for its role in the Cultural Revolution.
Government and politics: (Feb. 2) The National Broadcasting Company (NBC) breaks the news that the Federal Bureau of Investigation (FBI) has set up Abscam, a sting operation in which FBI agents posed as Arab businessmen and offered American politicians money to perform favors for a nonexistent Arab sheik. (Mar. 3) Pierre Trudeau returns to office as prime minister of Canada after a nine-month absence. (May 20) In a referendum, Quebec voters reject a proposal for the province to become independent from Canada. (Jul. 16) Former California governor and actor Ronald Reagan is nominated for president at the Republican National Convention in Detroit. (Aug. 14) At the Democratic National Convention in New York City, President Jimmy Carter accepts his party’s nomination for another term in office. (Nov. 4) Reagan carries forty-four states to defeat Carter in the presidential election.
Military and war: (Apr. 24-25) Operation Eagle Claw, a commando mission in Iran to rescue American hostages, is aborted after mechanical problems ground the rescue helicopters; the failed rescue
operation results in the deaths of eight American servicemen.
Society: An advertisement featuring fifteen-year-old model Brooke Shields—in which she whispers “You know what comes between me and my Calvins? Nothing!”—is banned from the airwaves. Mattel introduces black and Hispanic Barbie dolls. (Jun. 1) Comedian Richard Pryor is badly burned as he tries to freebase cocaine.
Business and economics: 3M introduces a new product, the Post-it note, and begins selling it throughout the United States. By the end of the year, the unemployment rate in the United States exceeds 10 percent. (Dec. 12) Apple Computer makes its initial public offering, trading its stock at twelve dollars per share.
Transportation and communications: Japan passes the United States as the world’s largest automaker. (Jun. 1) Cable News Network (CNN), the first all-news network, goes on the air.
Science and technology: (Mar. 1) The Voyager 1 space probe sends the first high-resolution images of Saturn back to scientists and confirms the existence of Janus, one of Saturn’s moons. (May 18) Mount St. Helens erupts in Washington, killing fifty-seven people and causing three billion dollars in damage. (Jun. 16) In its ruling in Diamond v. Chakrabarty, the Supreme Court allows patents to be issued on living organisms.
Environment and health: RU-486, the abortion pill, is released in France. The U.S. Food and Drug Administration (FDA) warns pregnant women to restrict or eliminate caffeine consumption. (Sept. 22) Procter and Gamble Company announces a recall of its Rely brand tampons after federal studies conclude their use increases chances of toxic shock syndrome.
Arts and literature: The Covenant, a novel by James A. Michener, and Crisis Investing: Opportunities and Profits in the Coming Great Depression, by Douglas R. Casey, are the year’s best-selling fiction and nonfiction books. (Apr.) Norman Mailer’s The Executioner’s Song, playwright Lanford Wilson’s Talley’s Folly, and Selected Poems by Donald Justice are among the year’s Pulitzer Prize winners. (Dec. 17)
Amadeus opens on Broadway, where it runs for 1,181 performances; the drama will receive 1981 Tony Awards for Best Play (playwright Peter Shaffer), Best Actor (Ian McKellen), and Best Director (Peter Hall).
Popular culture: The Empire Strikes Back earns $290 million at the box office. (Sept. 23) Reggae musician Bob Marley plays his final live performance at the Stanley Theater in Pittsburgh, Pennsylvania. (Nov. 21) Millions of viewers tune in to the television program Dallas to learn who shot lead character J. R. Ewing; the episode draws the largest audience for a television show up to that point.
Sports: (Jan. 20) The Pittsburgh Steelers become the first National Football League (NFL) team to win four Super Bowls, defeating the Los Angeles Rams, 31-19. (Feb. 22) In what is called the Miracle on Ice, the U.S. hockey team defeats the Soviet Union in the semifinals of the Winter Olympics; the United States goes on to win the gold medal. (Mar. 21) President Jimmy Carter announces that the United States will boycott the 1980 Summer Olympics in Moscow. (Apr. 21) Rosie Ruiz wins the Boston Marathon but is later exposed as a fraud and stripped of her medal. (Jul. 19-Aug. 3) The Summer Olympic Games are held in Moscow. (Oct. 21) The Philadelphia Phillies win their first World Series, beating the Kansas City Royals, 4-1, in game 6.
Crime: (May 7) Paul Geidel, convicted of second-degree murder in 1911, is released from prison in Beacon, New York, after 68 years and 245 days, the longest time ever served by an American inmate. (Oct. 15) Terrorist James Hoskins forces his way into a Cincinnati, Ohio, television studio, holding nine employees hostage for several hours before releasing them and taking his own life. (Dec. 8) Former Beatle John Lennon is shot and killed outside his New York City apartment by Mark David Chapman, a deranged fan.
1981
International events: (May 13) Pope John Paul II is shot and nearly killed by Mehmet Ali Agca, a Turkish gunman, as he enters St. Peter’s Square in Rome to address a general audience. (May 21) Socialist François Mitterrand becomes president of France. (Oct. 6) Egyptian president Anwar Sadat is assassinated during a parade by army members who were part of the Egyptian Islamic
Jihad organization, a group opposed to his negotiations with Israel. (Dec. 13) Polish leader Wojciech Jaruzelski declares martial law in response to growing government opposition by the Solidarity trade union.
Government and politics: (Jan. 19) Officials from the United States and Iran sign an agreement to release fifty-two American hostages after 444 days of captivity. (Jan. 20) Ronald Reagan becomes the fortieth president of the United States; minutes after his inauguration, Iran releases the fifty-two American hostages. (Mar. 30) President Reagan is shot in the chest outside a Washington, D.C., hotel by John Hinckley, Jr.; two police officers and Press Secretary James Brady are also wounded. (Aug. 19) President Reagan appoints Sandra Day O’Connor to be the first woman justice of the U.S. Supreme Court.
Military and war: (Aug. 19) Libyan leader Muammar al-Qaddafi sends two fighter jets to intercept two U.S. fighter jets over the Gulf of Sidra; the American jets destroy the Libyan fighters. (Aug. 31) A bomb explodes at the U.S. Air Force base in Ramstein, West Germany, injuring twenty people.
Society: Pac-Man, a Japanese video arcade game, is introduced in the United States. (Jul.) Christine Craft, an anchorwoman on a Kansas City television station, is demoted to reporter after focus group research concludes she is “too old, too unattractive, and wouldn’t defer to men.” (Jul. 29) Lady Diana Spencer marries Charles, Prince of Wales.
Business and economics: G. D. Searle and Company begins selling NutraSweet, the brand name for aspartame, an artificial sweetener, after the FDA approves the product. (Jan. 21) The first De Lorean, a stainless steel sports car with gull-wing doors, moves off the production line.
Transportation and communications: (Mar. 6) Walter Cronkite signs off for the last time after anchoring the CBS Evening News for nineteen years. (Aug. 3) The Professional Air Traffic Controllers Organization (PATCO) goes on strike. (Aug. 5) President Reagan fires 11,359 striking air traffic controllers who ignore his order to return to work. (Aug. 7) The Washington Star ceases operations after 128 years of publication. (Sept. 26) The Boeing 767 airliner makes its debut flight.
Science and technology: (Mar. 19) Three workers are killed and five are injured during a test of the
space shuttle Columbia. (Aug. 12) IBM begins selling a personal computer at a base price of $1,565.
Environment and health: (Jun. 5) The Centers for Disease Control and Prevention (CDC) report that five homosexual men in Los Angeles have a rare form of pneumonia seen only in patients with weakened immune systems, the first recognized cases of acquired immunodeficiency syndrome (AIDS). (Dec. 28) The first American test-tube baby, Elizabeth Jordan Carr, is born in Norfolk, Virginia.
Arts and literature: Jane Fonda’s Workout Book is released and remains number one on The New York Times best-seller list for more than a year. (May 6) A jury of architects and sculptors unanimously selects Maya Lin’s design for the Vietnam Veterans Memorial in Washington, D.C. (Dec.) A review in Artforum launches the career of artist Jean-Michel Basquiat.
Popular culture: (Jan. 12) Dynasty, a television soap opera about wealthy Denver-based oil family the Carringtons, premieres and becomes a smash hit. (Apr. 18) The rock band Yes splits up but regroups in 1983. (Aug. 1) Music Television (MTV) goes on the air. (Sept. 19) Simon and Garfunkel perform a free concert in New York City’s Central Park attended by almost half a million people. (Oct. 15) Heavy metal band Metallica is formed in Los Angeles.
Sports: (Apr. 18) A minor league baseball game between the Rochester Red Wings and the Pawtucket Red Sox becomes the longest professional baseball game in history, lasting eight hours and twenty-five minutes; the final one of the game’s thirty-three innings is played on June 23. (Aug. 9) Major League Baseball (MLB) players end their strike as the All-Star Game is held in Cleveland’s Municipal Stadium. (Sept.) John McEnroe defeats Björn Borg in the final of the U.S. Open tennis tournament, becoming the first player since the 1920’s to win three consecutive U.S. Open men’s singles titles.
Crime: (Jun. 21) Wayne Bertram Williams is arrested and charged with two murders; he is later convicted of murdering twenty-three of the thirty children and young adults slain in the Atlanta child murders. (Aug. 24) Mark David Chapman is sentenced to twenty years to life imprisonment after being convicted of murdering John Lennon in New York City.
1982
International events: (Mar. 10) The United States places an embargo on Libyan oil imports in response to Libya’s alleged support of terrorist groups. (Apr. 25) Israel completes its withdrawal from the Sinai Peninsula according to the terms of the Israel-Egypt Peace Treaty. (Jun. 6) Forces under the command of Israeli defense minister Ariel Sharon invade southern Lebanon; following this attack, the United Nations Security Council demands that Israel withdraw its troops from Lebanon. (Nov. 14) Lech Wałęsa, the leader of Poland’s outlawed Solidarity movement, is released after eleven months in prison.
Government and politics: (Apr. 17) By proclamation of Queen Elizabeth II, Canada patriates its constitution and is granted full political independence from the United Kingdom. (Jun. 8) President Reagan becomes the first American chief executive to address a joint session of the British Parliament. (Jun. 30) The Equal Rights Amendment (ERA) falls short of the thirty-eight states needed to pass; Phyllis Schlafly and other leaders of the Religious Right take credit for its defeat. (Oct. 27) Dominion Day is officially renamed Canada Day.
Military and war: (Apr. 2) Argentina invades the British-controlled Falkland Islands. (Apr. 4) The government of the Falklands surrenders and the islands are placed under Argentinian control. (Apr. 5) A British Royal Navy task force sails to South America to recapture the Falklands. (Jun. 14) The Falklands War ends, when Argentina agrees to formally surrender to the United Kingdom.
Society: In response to a Gallup Poll question, 51 percent of Americans say they do not view homosexuality as normal. (Mar. 5) Comedian John Belushi dies of cocaine and heroin abuse in a Los Angeles hotel. (May 1) More than 100,000 people attend the first day of the 1982 World’s Fair in Knoxville, Tennessee. (Jun. 5) The first Rubik’s Cube World Championships are held in Budapest, Hungary. (Oct. 1) The Epcot theme park at Walt Disney World is opened to the public.
Business and economics: (Jan. 8) AT&T agrees to divest itself into more than twenty regional subdivisions, commonly known as Baby Bells. (Feb. 19) The De Lorean automobile factory in Belfast, Northern Ireland, is put into receivership; the company will fold in 1983. (Jul. 23) The International
Whaling Commission decides to end commercial whaling by 1985-1986.
Transportation and communications: (May 2) The Weather Channel debuts on American cable television. (Sept. 15) The first issue of USA Today, a national newspaper published by the Gannett Company, goes on sale.
Science and technology: (Sept. 19) Scott Fahlman, a computer scientist at Carnegie Mellon University, posts the first emoticons—symbols for smiley faces designed to distinguish serious posts from jokes. (Oct. 1) Sony sells its first consumer compact disc (CD) player. (Dec. 26) The computer is named Time magazine’s Man of the Year, the first time that the award is given to a nonhuman.
Environment and health: (Sept. 29-Oct. 1) Seven people in the Chicago area die after ingesting Tylenol capsules laced with potassium cyanide. (Dec. 2) In an operation at the University of Utah, Barney Clark, a sixty-one-year-old retired dentist, becomes the first person to receive a permanent artificial heart. (Dec. 3) A final soil sample taken from the site of Times Beach, Missouri, is found to contain three hundred times the safe level of dioxin. (Dec. 23) The Environmental Protection Agency (EPA) recommends the evacuation of Times Beach because of its high levels of dioxin contamination.
Arts and literature: (Apr.) John Updike’s novel Rabbit Is Rich, Charles Fuller’s A Soldier’s Play, and Sylvia Plath’s The Collected Poems are awarded Pulitzer Prizes. (Oct.) Gabriel García Márquez receives the Nobel Prize in Literature. (Oct. 7) Cats, a musical play based on Old Possum’s Book of Practical Cats and other poems by T. S. Eliot, opens on Broadway, where it will eventually run for a record 7,485 performances.
Popular culture: The film E.T.: The Extra-Terrestrial tops the box office, with $310 million worth of ticket sales. The New York Times declares Grandmaster Flash and the Furious Five’s “The Message” to be “the most powerful pop record of the year.” By the end of the year, twenty-two million copies of the board game Trivial Pursuit will be sold in the United States. (Jan. 20) Ozzy Osbourne bites the head off of a live bat that is thrown at him while he is performing. (Feb. 1) Late Night with David Letterman, a comedy and talk show, premieres on NBC. (Dec. 1) Michael Jackson’s Thriller is released and will eventually sell
twenty million copies, making it the second biggest-selling album in entertainment history.
Sports: (Feb. 24) Wayne Gretzky of the Edmonton Oilers scores his seventy-seventh goal of the National Hockey League (NHL) season, breaking the previous record of seventy-six. He will go on to score ninety-two goals that season, which remains the record. (May 8) French-Canadian racing driver Gilles Villeneuve is killed during qualifying runs for the Belgian Grand Prix. (May 30) In what Indianapolis Motor Speedway historian Donald Davidson and public address announcer Tom Carnegie later call the greatest moment in the track’s history, Gordon Johncock wins his second Indianapolis 500 race over Rick Mears by 0.16 second, the closest finish to that date. (May 30) Cal Ripken, Jr., of the Baltimore Orioles plays the first of what will become his record-breaking streak of 2,632 consecutive baseball games.
Crime: (Jan. 6) “Freeway Killer” William Bonin, who along with several accomplices may have murdered as many as thirty-six people, is convicted of fourteen murders in California. (Mar. 16) Claus von Bülow is found guilty of the attempted murder of his wife, socialite Sunny von Bülow. (Jul. 16) The Reverend Sun Myung Moon is sentenced to eighteen months in prison and fined $25,000 for tax fraud and conspiracy to obstruct justice.
1983
International events: (Mar. 5) Bob Hawke is elected prime minister of Australia. (May 17) Lebanon, Israel, and the United States sign an agreement on Israeli withdrawal from Lebanon. (Jun. 9) Conservative Margaret Thatcher wins another term as prime minister of the United Kingdom with 42 percent of the popular vote. (Jul. 20) The government of Poland announces the end of martial law and grants amnesty to political prisoners.
Government and politics: (Feb. 13) President Reagan proclaims 1983 “The Year of the Bible.” (Feb. 24) A special commission of Congress releases a report criticizing Japanese American internment during World War II. (Mar. 8) President Reagan calls the Soviet Union an “evil empire.” (Apr. 12) Harold Washington is the first African American to be elected mayor of Chicago. (Nov. 3) The Reverend Jesse Jackson announces his candidacy for the 1984 Democratic Party presidential nomination.
Military and war: (Mar. 23) President Reagan announces his Strategic Defense Initiative (SDI), a proposal to develop technology that can intercept enemy missiles; the media dub his proposal “Star Wars.” (Apr. 18) The U.S. embassy in Beirut, Lebanon, is bombed and sixty-three people are killed. (Oct. 19) Maurice Bishop, the prime minister of Grenada, and forty others are executed in a military coup; the People’s Revolutionary Army forms a military government to rule the country. (Oct. 23) Suicide truck-bombings destroy both the French and the U.S. Marine Corps barracks in Beirut, killing 241 American servicemen, 58 French paratroopers, and 6 Lebanese civilians. (Oct. 25) U.S. troops invade Grenada at the behest of Eugenia Charles of Dominica, a member of the Organization of American States; after the invasion, the prerevolutionary government is restored to power.
Society: Cabbage Patch Kids mania takes off after the soft-sculpture dolls are placed on the market; by the end of the year, nearly three million dolls will be sold, exceeding the previous record for first-year doll sales by more than one million. (Sept. 17) Vanessa Williams becomes the first African American to be crowned Miss America. (Oct. 4) The first Hooters restaurant opens in Clearwater, Florida.
Business and economics: Mortgage interest rates, which exceeded 16 percent in 1981, drop to 12 percent by the end of the year. The juice box becomes a new brown bag and lunch box option after Ocean Spray begins to sell cranberry juice in boxes. (Jun.) McDonald’s introduces Chicken McNuggets, small pieces of breaded chicken deep-fried in oil.
Transportation and communications: (Apr. 15) American Public Radio is founded; the network will become Public Radio International in 1994. (Sept. 5) Tom Brokaw becomes lead anchor for NBC Nightly News.
Science and technology: (Jan. 26) The Lotus 1-2-3 spreadsheet program is released for IBM-PC compatible computers. (Apr. 7) Space shuttle Challenger astronauts F. Story Musgrave and Donald H. Peterson perform the first space shuttle space walk, which lasts four hours and ten minutes. (Jun. 18) Sally Ride, a crew member aboard Challenger, becomes the first American woman in space. (Aug. 30) Guion Bluford, the first African American in space, is among the crew of Challenger. (Oct. 25) The first version of Microsoft Word software is introduced under the name Multi-Tool Word.
Environment and health: (Jan.) A scientist at the Pasteur Institute in Paris isolates a virus that he believes is the original infecting microorganism of AIDS. (Feb. 23) The EPA announces that it will buy out and evacuate the dioxin-contaminated community of Times Beach, Missouri. (Mar. 9) Amid scandal, Anne Burford resigns as head of the EPA.
Arts and literature: Return of the Jedi Storybook is the year’s best-selling fiction book. (Jan. 2) The musical Annie is performed for the last time after 2,377 shows on Broadway. (Apr.) Alice Walker’s novel The Color Purple receives the Pulitzer Prize.
Popular culture: “Down Under” by Men at Work, “Africa” by Toto, and “Baby, Come to Me” by Patti Austin and James Ingram are among the year’s most popular songs. (Feb. 28) The television series M*A*S*H presents its final episode, “Goodbye, Farewell, and Amen”; it becomes the highest-rated episode in television history. (Jul. 21) Diana Ross stages a free concert in Central Park for 800,000 people, enduring the severe weather; she vows to return the next day—and keeps her promise.
Sports: (Jan. 22) Björn Borg retires from tennis after winning five consecutive Wimbledon championships. (Jul. 24) George Brett, third baseman for the Kansas City Royals, is ejected from a baseball game in Yankee Stadium after charging an umpire who called him out for having more pine tar on his bat than is technically allowed. (Dec. 13) The Denver Nuggets and the visiting Detroit Pistons combine for a National Basketball Association (NBA) record 370 points, with Detroit winning in triple overtime, 186-184.
Crime: (Feb. 18) Thirteen people are killed in an attempted robbery in Seattle, Washington.
1984
International events: (Jun. 6) Indian troops storm the Golden Temple at Amritsar, the Sikhs’ holiest shrine, killing about three hundred people. (Aug. 21) Half a million people in Manila, Philippines, demonstrate against the government of Ferdinand Marcos. (Sept. 26) The United Kingdom and the People’s Republic of China sign an initial agreement to return Hong Kong to China
in 1997. (Oct. 31) Indian prime minister Indira Gandhi is assassinated by two Sikh security guards; the killing sparks riots in New Delhi, and about 2,700 Sikhs are killed.
Government and politics: (Feb. 29) Canadian prime minister Pierre Trudeau announces his retirement. (Jun. 30) John Turner becomes Canada’s seventeenth prime minister. (Jul. 12) At its national convention in San Francisco, the Democratic Party nominates Walter Mondale for president and Geraldine Ferraro for vice president—the first woman nominated to that position. (Aug. 23) President Reagan and Vice President George H. W. Bush are nominated for second terms at the Republican National Convention in Dallas. (Sept. 4) The Progressive Conservative Party of Canada, led by Brian Mulroney, wins 211 seats in the House of Commons, forming the largest majority government in Canadian history. (Nov. 6) President Reagan defeats Mondale with 59 percent of the popular vote, the largest percentage since Richard Nixon’s 61 percent victory in 1972.
Military and war: (Feb. 26) The U.S. Marines pull out of Beirut, Lebanon.
Society: (Jun. 3) A unanimous Supreme Court upholds a Minnesota law that bars private clubs from discriminating against women. (Jul. 23) Vanessa Williams becomes the first Miss America to resign, surrendering her crown after nude photos of her are published in Penthouse magazine. (Oct.) The National Parent-Teacher Association (PTA) sends a letter to thirty record labels and the Recording Industry Association of America (RIAA) in which it proposes that labels be placed on recordings with “explicit lyrics or content.”
Business and economics: (Jan. 1) AT&T divests into twenty-four independent regional units. (Jan. 24) The first Apple Macintosh computer goes on sale. (Nov. 4) Michael Dell, a student at the University of Texas, founds PCs Limited, which sells IBM-compatible personal computers built from stock components; the company will eventually change its name to Dell Computers.
Transportation and communications: Deregulation by the Federal Communications Commission (FCC) enables the first infomercials to appear on television. (Jun. 22) Virgin Atlantic Airways makes its debut flight.
Science and technology: (Jan. 5) Richard Stallman starts to develop GNU, a free software mass
collaboration project. (Feb. 7) Astronauts Bruce McCandless II and Robert L. Stewart make the first untethered space walk. (Aug. 30) The space shuttle Discovery takes its maiden voyage. (Oct. 5) Marc Garneau, an astronaut on the space shuttle Challenger, becomes the first Canadian in space.
Environment and health: (Oct. 26) Physicians at Loma Linda University Medical Center perform the first animal-to-human transplant in a newborn, when they place a baboon’s heart into the chest of Baby Fae, a twelve-day-old infant. (Dec. 3) A chemical leak from a Union Carbide pesticide plant in Bhopal, India, kills more than 1,000 people and injures from 15,000 to 22,000 others, of whom 6,000 will later die from their injuries.
Arts and literature: William Gibson coins the term “cyberspace” in his novel Neuromancer. Other novels published in 1984 include Bright Lights, Big City, by Brat Pack author Jay McInerney, and The Hunt for Red October, by Tom Clancy. (Apr.) Ironweed by novelist William Kennedy, Glengarry Glen Ross by playwright David Mamet, and American Primitive by poet Mary Oliver are among the year’s Pulitzer Prize winners. (Oct. 11) Ma Rainey’s Black Bottom opens on Broadway; playwright August Wilson will later receive the New York Drama Critics Circle Award for Best Play.
Popular culture: The first all-rap radio format is introduced at Los Angeles radio station KDAY. Run-D.M.C. is the first rap group to have an album certified gold. (Sept. 14) The first MTV Video Music Awards are held in Radio City Music Hall, New York City, where “You Might Think” by the Cars is named Video of the Year. (Sept. 20) The Cosby Show premieres on NBC.
Sports: (Feb. 8) The 1984 Winter Olympics open in Sarajevo, Yugoslavia. (May 8) The Soviet Union announces that it will boycott the 1984 Summer Olympics in Los Angeles. (May 8) The longest game in MLB history begins; the game between the Milwaukee Brewers and the Chicago White Sox will be played over the course of two days and twenty-five innings, with a total time of eight hours and six minutes. (Jul. 4) Richard Petty wins his two-hundredth career NASCAR victory at the Firecracker 400 in Daytona, Florida. (Jul. 28-Aug. 12) The 1984 Summer Olympics are held in Los Angeles.
Crime: (Mar. 22) Teachers at the McMartin Preschool in Manhattan Beach, California, are
charged with Satanic ritual abuse of the schoolchildren; the charges are later determined to be completely unfounded and are dropped. (Jul. 18) James Oliver Huberty sprays a McDonald’s restaurant in San Ysidro, California, with gunfire, killing twenty-one people before he is shot and killed. (Dec. 22) While riding in a New York City subway car, Bernhard Goetz shoots four African American youths who try to steal from him.
1985
International events: (Mar. 11) Mikhail Gorbachev becomes the general secretary of the Soviet Communist Party and de facto leader of the Soviet Union. (Mar. 16) Associated Press reporter Terry Anderson is taken hostage in Beirut; he is eventually released on December 4, 1991. (Oct. 7) The cruise ship Achille Lauro is hijacked in the Mediterranean Sea by four Palestinian terrorists; one passenger, American Leon Klinghoffer, is killed.
Government and politics: (Jan. 20) President Ronald Reagan is privately sworn in for a second term in office. (May 5) President Reagan joins German Chancellor Helmut Kohl for a controversial funeral service at a cemetery in Bitburg, Germany, which contains the graves of fifty-nine men who served in the S.S. during World War II. (Nov. 19) President Reagan and Soviet leader Mikhail Gorbachev meet for the first time in Geneva, Switzerland.
Military and war: (Feb. 16) Israel begins withdrawing troops from Lebanon.
Society: (May) Tipper Gore, the wife of then-Senator Albert Gore, and Susan Baker, wife of then-Treasury Secretary James Baker, among others, organize the Parents Music Resource Center (PMRC) to educate parents about lyrics that are “sexually explicit, excessively violent, or glorify the use of drugs and alcohol”; the group eventually persuades the Recording Industry Association of America (RIAA) voluntarily to place warning stickers on recordings it deems indecent or inappropriate for minors. (Jun. 4) The Supreme Court, ruling in Wallace v. Jaffree, strikes down an Alabama law that allowed public school teachers to hold a one-minute period of silence for “meditation or voluntary prayer” each day. (Sept. 30) “Shock jock” Howard Stern is fired from radio station WNBC-AM in New York City for his comedy sketch “Bestiality Dial-a-Date.”
Business and economics: (Apr. 23) Coca-Cola changes its formula and introduces New Coke; the new product receives an overwhelmingly negative response, and within three months the company puts its original formula back on the market. (Oct. 18) The Nintendo home entertainment system, an eight-bit video game console, is introduced to the North American market.
Transportation and communications: (Jan. 7) Saturn Corporation, a subsidiary of General Motors, is founded in response to the American popularity of Japanese cars. (Jun. 17) The Discovery Channel, which provides documentary-like programming about science, history, and other topics, airs on cable television. (Dec. 1) Ford begins selling its Taurus model, which in the mid-1990’s will become the best-selling car in the United States.
Science and technology: (Sept. 1) A joint American-French expedition locates the wreck of the Titanic. (Nov. 20) Microsoft Corporation releases Windows 1.0, the first version of its Windows software program.
Environment and health: (Feb. 19) William J. Schroeder becomes the first artificial heart patient to leave the hospital. (Mar. 4) The FDA approves a blood test for AIDS, which has been used since then to screen all blood donations in the United States. (Oct. 2) Actor Rock Hudson dies, the first major public figure to die of AIDS.
Arts and literature: The Accidental Tourist by Anne Tyler, The Mammoth Hunters by Jean Auel, and Lake Wobegon Days by Garrison Keillor are published. (Mar. 28) Biloxi Blues, a new comedy by Neil Simon, opens on Broadway; the play will win 1985 Tony Awards for Best Play, Best Featured Actor in a Play (Barry Miller), and Best Direction of a Play (Gene Saks). (Nov. 26) President Reagan sells the rights to his autobiography to Random House for a record three million dollars.
Popular culture: Back to the Future is the year’s top-grossing film. (Jan. 28) USA for Africa, a group of musicians who include Michael Jackson, Stevie Wonder, Lionel Richie, Bob Dylan, Willie Nelson, Bruce Springsteen, Tina Turner, and Paul Simon, record “We Are the World” to raise money for Ethiopian famine victims. (Jul. 13) Live Aid concerts in Philadelphia and London raise millions of dollars for Ethiopian famine relief.
Sports: (Mar. 6) Boxer Mike Tyson makes his
professional debut in Albany, New York, winning his match by a first-round knockout. (Mar. 31) WrestleMania, an annual wrestling pay-per-view event, debuts at Madison Square Garden in New York City. (Jul.) For the fourth year in a row, Martina Navratilova is the ladies’ singles champion at Wimbledon; she will capture two more championships, in 1986 and 1987.
Crime: (Feb. 9) U.S. drug agent Enrique Camarena is kidnapped and murdered in Mexico; his body is discovered on March 5. (May 11) The FBI brings charges against the suspected heads of the five New York City Mafia families.
1986
International events: (Jan. 20) The United Kingdom and France announce their plans to build a rail tunnel under the English Channel. (Feb. 7) President Jean-Claude (“Baby Doc”) Duvalier flees Haiti after twenty-eight years of family rule. (Feb. 25) President Ferdinand Marcos of the Philippines goes into exile, and Corazon Aquino becomes the first Filipino woman president. (Mar. 26) An article in The New York Times charges that Kurt Waldheim, former United Nations secretary general and a candidate for president of Austria, may have been involved in Nazi war crimes during World War II.
Government and politics: (Oct. 11) President Reagan and Soviet leader Mikhail Gorbachev meet in Reykjavík, Iceland, to discuss how they can reduce their intermediate missile stocks in Europe. (Nov. 3) The Iran-Contra affair begins when a Lebanese magazine reports that the United States has been secretly selling weapons to Iran in order to secure the release of seven American hostages held by pro-Iranian groups in Lebanon. (Nov. 25) Attorney General Edwin Meese announces that profits from covert weapons sales to Iran were illegally diverted to the anticommunist Contra rebels in Nicaragua. (Nov. 26) President Reagan denies his involvement in the Iran-Contra scandal and appoints three people to a special review board, later called the Tower Commission, to investigate the affair.
Military and war: (Apr. 15) At least fifteen people die after United States planes bomb targets in Tripoli, Libya, and that nation’s Benghazi region.
Society: (Jan. 20) Martin Luther King Day, a federal holiday honoring the civil rights leader, is
observed for the first time. (May 25) At least five million people participate in Hands Across America, forming a human chain from New York City to Long Beach, California, to raise money to combat homelessness and hunger. (Jul. 5) After an extensive refurbishing, the Statue of Liberty is reopened to the public. (Oct. 28) The centennial of the Statue of Liberty’s dedication is celebrated in New York Harbor.
Business and economics: (Jan. 9) Kodak stops making instant cameras after losing a patent fight with Polaroid. (Oct. 9) News Corporation, Rupert Murdoch’s media company, completes its acquisition of the Metromedia group of broadcasting stations and launches the FOX Broadcasting Company. (Nov. 11) Sperry Rand and Burroughs merge to form Unisys, the world’s second-largest computer company.
Transportation and communications: (Jul. 1) Seaboard System Railroad and Chessie System, Inc., merge to create CSX Transportation, a railroad company serving the East Coast. (Aug. 31) Aeroméxico Flight 498 collides with a small Piper aircraft over Cerritos, California, killing sixty-seven passengers and fifteen people on the ground.
Science and technology: (Jan. 12) The space shuttle is launched with the first Hispanic astronaut, Dr. Franklin R. Chang-Diaz. (Jan. 19) Brain, the first personal computer virus, begins to spread. (Jan. 28) The space shuttle Challenger disintegrates seventy-three seconds after its launch, killing its crew of seven astronauts, including schoolteacher Christa McAuliffe. (Feb. 9) Halley’s comet reaches its closest point to the Sun during its second visit to the solar system in the twentieth century. (Feb. 19) The Soviet Union launches the Mir space station. (Dec. 23) The aircraft Voyager completes the first nonstop circumnavigation of the Earth by air without refueling in nine days, three minutes, and forty-four seconds.
Environment and health: About twenty-four million Americans are regularly performing aerobics, 90 percent of them women. Geneticists begin discussing the possibility of mounting a project to sequence the human genome. (Apr. 26) A reactor at the Chernobyl nuclear plant in Ukraine explodes, killing thirty-one people; thousands of other people are exposed to excessive amounts of radiation, and radioactivity renders large areas of Ukraine and Belarus uninhabitable.
Arts and literature: Stephen King’s It, Tom Clancy’s Red Storm Rising, and James Clavell’s Whirlwind are the year’s three top-selling fiction books. (Oct.) Wole Soyinka of Nigeria is awarded the Nobel Prize in Literature. (Oct.) World’s Fair, a novel by E. L. Doctorow, and Arctic Dreams, a nonfiction book by Barry Lopez, receive National Book Awards.
Popular culture: Top Gun, starring Tom Cruise, is the top-grossing film of the year. (Apr. 21) During a highly publicized television show, journalist Geraldo Rivera opens gangster Al Capone’s secret vault but finds only a bottle of moonshine. (Sept. 19) The film Blue Velvet is released, establishing David Lynch as a major American director. (Sept. 27) A tour bus carrying heavy metal band Metallica crashes in Sweden, killing their bassist, Cliff Burton.
Sports: (Apr. 29) Boston Red Sox pitcher Roger Clemens becomes the first pitcher in history to strike out twenty batters during a nine-inning game, defeating the Seattle Mariners. (May 24) The Montreal Canadiens defeat the Calgary Flames in five games to win the Stanley Cup. (Jul. 27) American cyclist Greg LeMond wins the Tour de France. (Nov. 22) Mike Tyson earns his first world boxing title by defeating Trevor Berbick in Las Vegas.
Crime: (Aug. 20) Patrick Sherrill, an employee of the U.S. Postal Service in Edmond, Oklahoma, kills fourteen of his coworkers before committing suicide. (Dec. 20) Three African Americans are assaulted by a group of white teenagers in the Howard Beach neighborhood of Queens, New York; one of the victims, Michael Griffith, is run over and killed by a motorist as he tries to flee the attackers.
1987

International events: (Apr. 27) The U.S. Department of Justice declares Austrian president Kurt Waldheim to be an "undesirable alien." (Jun. 12) During a visit to Berlin, President Reagan challenges Soviet leader Mikhail Gorbachev to tear down the Berlin Wall. (Dec. 8) The first Intifada, a Palestinian uprising against Israeli rule, begins in the Gaza Strip and West Bank. (Dec. 8) President Reagan and Soviet leader Gorbachev sign the Intermediate-Range Nuclear Forces (INF) Treaty.

Government and politics: (Feb. 26) The Tower Commission, which has been investigating the Iran-Contra affair, criticizes President Reagan for failing to control his national security staff. (Mar. 4) In an address to the nation on the Iran-Contra affair, President Reagan acknowledges that his dealings with Iran "deteriorated" into an arms-for-hostages deal. (May 8) Allegations that Senator Gary Hart had an extramarital affair with Donna Rice force Hart to drop out of the race for the Democratic presidential nomination. (Sept. 17) Televangelist Pat Robertson announces his candidacy for the 1988 Republican presidential nomination. (Oct. 23) By a vote of 58-42, the U.S. Senate rejects President Reagan's nomination of former Solicitor General Robert Bork to the Supreme Court. (Nov. 18) U.S. Senate and House committees release reports charging President Reagan with "ultimate responsibility" for the Iran-Contra affair.

Military and war: (May 17) While patrolling the Persian Gulf, the USS Stark is struck by two missiles from an Iraqi Mirage fighter; thirty-seven sailors are killed and twenty-one others are injured in the explosion. (Oct. 19) U.S. warships destroy two Iranian oil platforms in the Persian Gulf.

Society: (Mar.) ACT UP, an activist organization demanding increased resources to fight AIDS, is founded in New York City. (Mar. 19) Televangelist Jim Bakker, head of PTL Ministries, resigns after admitting that he had an affair with church secretary Jessica Hahn. (May 5) The Assemblies of God defrocks Bakker. (Jun. 19) In Edwards v. Aguillard, the Supreme Court declares unconstitutional a Louisiana law requiring that creation science be taught in all public schools that teach evolution. (Oct. 14-16) Jessica McClure, an eighteen-month-old child, falls down a well in Midland, Texas, and is rescued fifty-eight hours later, a real-life drama watched by millions of American television viewers.

Business and economics: (Jan. 31) The last Ohrbach's department store closes in New York City after sixty-four years in business. (Mar. 2) Chrysler Corporation acquires American Motors Corporation. (Jul. 17) The Dow Jones Industrial Average closes above the 2,500 mark for the first time, at 2,510.04. (Oct. 19) On Black Monday, the Dow Jones Industrial Average falls 508 points, or almost 23 percent, while stock markets in other countries experience similar declines.
Transportation and communications: (Aug. 4) The FCC rescinds the Fairness Doctrine, which required radio and television stations to "fairly" present controversial issues. (Aug. 16) Northwest Airlines Flight 255 crashes as it takes off from Detroit Metropolitan Airport, killing all but one of the 155 people aboard, as well as two people on the ground.

Science and technology: (Feb. 23) Supernova 1987A is observed—the first supernova visible to the naked eye since 1604. (Sept. 7-21) The world's first conference on artificial life is held at Los Alamos National Laboratory in New Mexico.

Environment and health: (May 11) The first heart-lung transplant is performed in Baltimore. (Aug. 4) The World Commission on Environment and Development, also known as the Brundtland Commission, publishes a report, Our Common Future, which seeks to discuss the environment and development as a single issue. (Dec. 29) Prozac, an antidepressant, becomes available in the United States.

Arts and literature: Beloved, a novel by Toni Morrison, is nominated for both the National Book Award and the National Book Critics Circle Award; it does not win either award but receives the 1988 Pulitzer Prize for fiction. The Bonfire of the Vanities, Tom Wolfe's novel about New York City in the 1980's, is published in book form after being serialized in Rolling Stone. (Mar. 12) Les Misérables opens on Broadway; it later wins eight 1987 Tony Awards, including Best Musical.

Popular culture: (Jan. 3) Aretha Franklin becomes the first woman inducted into the Rock and Roll Hall of Fame. (Apr. 19) The Simpsons, an animated dysfunctional family, make their first appearance on The Tracey Ullman Show. (Jul. 1) The first Edgefest, an annual rock festival that primarily promotes Canadian music, is staged at Molson Park in Barrie, Ontario.

Sports: (Jan. 25) The New York Giants defeat the Denver Broncos, 39-20, in Super Bowl XXI, winning the NFL championship for the first time since 1956. (Mar. 29) WrestleMania III is held at the Pontiac Silverdome in Pontiac, Michigan, setting the North American indoor attendance record at 93,173. (Apr. 30) NASCAR driver Bill Elliott sets the record for the fastest qualifying lap at Talladega at nearly 213 miles (343 kilometers) per hour. (Oct. 25) Winning only eighty-five games in the regular season, the Minnesota Twins surprise baseball fans by defeating the St. Louis Cardinals to win the World Series.

Crime: (Jan. 13) Anthony "Fat Tony" Salerno and Carmine Persico, two bosses of the New York City Mafia, are sentenced to one hundred years in prison for racketeering. (Jul. 4) A court in Lyon, France, sentences Klaus Barbie, the former head of the Gestapo in Lyon, to life imprisonment for crimes against humanity.
1988

International events: (Jan. 1) Soviet leader Mikhail Gorbachev initiates perestroika, a program of economic restructuring. (May 15) The Soviet Union begins withdrawing its forces from Afghanistan after more than eight years of war. (Aug. 20) The Iran-Iraq war ends; an estimated one million people were killed in the eight-year conflict. (Nov. 15) An independent state of Palestine is proclaimed at the Palestinian National Council meeting in Algiers. (Dec. 2) Benazir Bhutto is sworn in as prime minister of Pakistan, the first woman to head the government of an Islamic country.

Government and politics: (Feb. 3) The House of Representatives denies President Reagan's request for $36.25 million to support the Nicaraguan Contras. (Feb. 18) Anthony Kennedy is sworn in as an associate justice of the Supreme Court. (Jul. 20) The Democratic National Convention in Atlanta nominates Michael Dukakis for president and Senator Lloyd Bentsen for vice president. (Aug. 18) The Republican National Convention in New Orleans nominates Vice President George H. W. Bush for president and Senator Dan Quayle for vice president. (Oct. 5) During a vice presidential debate, Quayle maintains that he has as much government experience as John F. Kennedy did when he ran for president in 1960; Bentsen, his Democratic opponent, elicits a positive audience response when he replies, "Senator, I knew Jack Kennedy. I served with Jack Kennedy. Jack Kennedy was a friend of mine. Senator, you're no Jack Kennedy." (Nov. 8) Bush is elected president of the United States. (Nov. 21) Brian Mulroney and the Progressive Conservative Party of Canada win a second majority government.

Military and war: (Mar. 8) Two U.S. Army helicopters collide at Fort Campbell, Kentucky, killing seventeen servicemen. (Jul. 3) Iran Air Flight 655 is shot down by missiles launched from the USS Vincennes, killing all 290 people aboard. (Nov. 22) The initial prototype of the B-2 stealth bomber is unveiled in Palmdale, California.

Society: (Jan. 1) The Evangelical Lutheran Church in America is founded, creating the largest Lutheran denomination in the United States. (Feb. 21) During his program, televangelist Jimmy Swaggart admits to committing an unspecified sin; his sin is later revealed to be an affair with a prostitute. (Mar. 6) Students at Gallaudet University, a school for the deaf, go on strike to protest the appointment of a nondeaf university president.

Business and economics: (Jul. 14) Volkswagen closes its Westmoreland County, Pennsylvania, plant—the first factory built by a foreign automaker in the United States—after ten years of operation. (Sept. 5) The Robert M. Bass Group, with two billion dollars of federal aid, agrees to buy American Savings and Loan Association, the largest thrift in the United States. (Oct. 30) Philip Morris purchases Kraft Foods for $13.1 billion. (Nov. 30) Kohlberg Kravis Roberts & Co. buys RJR Nabisco for $25.07 billion.

Transportation and communications: (Dec. 21) Pan Am Flight 103 is blown up by Libyan terrorists over Lockerbie, Scotland, killing 270 people.

Science and technology: Microsoft passes Lotus to become the world's largest software company. Microsoft begins developing its Windows NT (new technology) operating system. (Sept. 29) The National Aeronautics and Space Administration (NASA), which had grounded flights after the Challenger disaster, resumes space shuttle flights by launching the space shuttle Discovery.

Environment and health: Congress bans smoking on domestic air flights that are less than two hours long. The FDA approves the marketing of Viaspan, an isotonic solution used to preserve donated livers in a viable state before transplantation. (May 16) U.S. Surgeon General C. Everett Koop states in a report that the addictive properties of nicotine are similar to those of heroin and cocaine. (Jul. 6) Medical waste, including hypodermic needles and syringes possibly infected with the AIDS virus, washes ashore on Long Island, the first such report in the New York area; subsequent medical waste discoveries on beaches in Coney Island and in Monmouth County, New Jersey, force the closure of numerous New York-area beaches.

Arts and literature: The Lyre of Orpheus, the third novel in Robertson Davies' Cornish Trilogy, is published amid favorable reviews from critics. (Jun. 11) A concert at London's Wembley Stadium, featuring stars from the fields of music, comedy, and film, celebrates the seventieth birthday of imprisoned African National Congress leader Nelson Mandela.

Popular culture: For the first time ever, compact discs (CDs) outsell vinyl recordings. Oprah Winfrey has the highest-rated television talk show in the United States. Who Framed Roger Rabbit, celebrated for its seamless blend of live actors and animated characters, is the top-grossing film of the year. (Apr. 11) The Last Emperor, directed by Bernardo Bertolucci, wins nine Academy Awards.

Sports: (Jan. 29) The Midwest Classic Conference, a college athletic organization, is founded. (Feb. 13-28) The 1988 Winter Olympics are held in Calgary, Alberta, Canada. (Aug. 9) The Chicago Cubs play their first-ever night game at home in Wrigley Field, defeating the New York Mets, 6-4. (Sept. 17-Oct. 2) The Summer Olympic Games are held in Seoul, South Korea. (Oct. 15) An injured Kirk Gibson hits a dramatic home run to win the first game of the World Series for the Los Angeles Dodgers, defeating the Oakland A's by a score of 5-4; the Dodgers go on to win the series in five games.

Crime: (Mar. 16) Lieutenant Colonel Oliver North and Vice Admiral John Poindexter are indicted on charges of conspiracy to defraud the United States because of their roles in the Iran-Contra affair. (Nov. 11) Police in Sacramento, California, find a body buried in the lawn at the boardinghouse of sixty-year-old Dorothea Puente; six more bodies are eventually found, and Puente is convicted of three murders and sentenced to life in prison.
1989

International events: (Jan. 18) The Communist Party of Poland votes to legalize Solidarity. (Apr. 15) Students from Beijing, Shanghai, Xian, and Nanjing, China, begin protesting in Tiananmen Square. (May 20) The Chinese government declares martial law in Beijing in response to the Tiananmen Square protests. (May 30) Student protesters in Tiananmen Square unveil a statue, the Goddess of Democracy. (Jun. 4) The final standoff between student protesters and the military takes place in Tiananmen Square. (Nov. 9) The Berlin Wall falls as East Germany opens checkpoints, allowing its citizens to travel freely to West Germany; celebrating Germans begin tearing down the wall. (Nov. 17) The Velvet Revolution begins in Czechoslovakia as police violently suppress a peaceful student demonstration in Prague. (Nov. 28) With other communist regimes falling all around it and with growing street protests, the Communist Party of Czechoslovakia announces that it will relinquish its monopoly on political power. (Dec. 29) Václav Havel is elected president of Czechoslovakia, the country's first noncommunist leader in more than forty years.

Government and politics: (Jan. 20) George H. W. Bush becomes the forty-first president of the United States. (Feb. 10) Ron Brown is elected chairman of the Democratic National Committee, becoming the first African American to lead a major political party. (Nov. 7) David Dinkins is elected the first African American mayor of New York City, while Douglas Wilder of Virginia becomes the first elected African American governor.

Military and war: (Feb. 23) The Senate Armed Services Committee rejects President Bush's nomination of John Tower for secretary of defense.

Society: (Mar. 14) President Bush bans the importation of assault weapons into the United States. (Jun. 1) The SkyDome stadium, now known as Rogers Centre, opens in Toronto. (Sept. 5) During his first televised news conference, President Bush holds up a bag of cocaine purchased at Lafayette Park, across the street from the White House.

Business and economics: (Mar. 9) A strike forces Eastern Air Lines into bankruptcy. (Aug. 7) Federal Express buys the Flying Tigers cargo airline, founded by veterans of the Flying Tigers, a volunteer group of pilots who fought in World War II. (Oct. 13) In what is later called the Friday the Thirteenth Minicrash, the Dow Jones Industrial Average plummets 190.58 points to close at 2,569.26, most likely as a result of the collapsing junk bond market.

Transportation and communications: (Mar. 1) The United States ratifies the Berne Convention, an international treaty on copyrights. (Mar. 4) Time, Inc., and Warner Communications announce plans to merge and create Time Warner.

Science and technology: (Mar. 23) Stanley Pons and Martin Fleischmann announce that they have achieved cold fusion at the University of Utah. (Jul. 26) A federal grand jury indicts Cornell University student Robert Tappan Morris, Jr., for releasing a computer worm, making him the first person to be prosecuted under the 1986 Computer Fraud and Abuse Act. (Aug. 25) Voyager 2 flies past the planet Neptune and its moon Triton.

Environment and health: (Feb. 14) Union Carbide agrees to pay the Indian government $470 million for the damages caused in the 1984 chemical leak disaster in Bhopal. (Mar. 24) The Exxon Valdez oil tanker runs aground, spilling more than 250,000 barrels, or about 11 million gallons, of oil into Prince William Sound in Alaska.

Arts and literature: (Feb. 14) Iranian leader Ayatollah Khomeini encourages Muslims to kill author Salman Rushdie for writing his novel The Satanic Verses; a three-million-dollar bounty is later offered for Rushdie's murder. (Jun. 12) The Corcoran Gallery of Art in Washington, D.C., cancels a planned exhibition of erotic photographs by Robert Mapplethorpe.

Popular culture: (Apr. 16) The Dilbert comic strip is syndicated for the first time. (Jul. 5) The sitcom Seinfeld premieres on television. (Nov. 15) Disney's The Little Mermaid is released in theaters. (Dec. 17) The first full-length episode of The Simpsons airs on FOX.

Sports: (Apr. 2) Hulk Hogan defeats Randy Savage to become the World Wrestling Federation champion. (May 25) The Calgary Flames win their first Stanley Cup with a 4-2 victory over the Montreal Canadiens. (Aug. 24) Baseball player Pete Rose consents to a lifetime ban from the sport following allegations of illegal gambling; he also is barred from induction into the Baseball Hall of Fame.

Crime: (Jan. 24) Serial killer Ted Bundy is executed in Florida. (Apr. 19) A Central Park jogger is brutally attacked during an evening run in the New York City park. (Aug. 20) Lyle and Erik Menendez murder their wealthy parents in the den of the family's Beverly Hills home. (Dec. 6) In the worst single-day massacre in Canadian history, a twenty-five-year-old man who hates women goes to the École Polytechnique in Montreal and kills fourteen women, injures thirteen others, and then shoots himself.

Rebecca Kuzins
■ Bibliography

This bibliography lists books containing substantial material about a wide variety of topics pertaining to the 1980's. Additional works, and especially works on narrower subjects, can be found in the "Further Readings" notes at the end of every essay in The Eighties in America. Books are listed under the following seven categories:

1. General Works
2. Politics and Politicians
3. Race
4. Supreme Court
5. Foreign Policy and Events
6. Culture and Entertainment
7. Homosexuality

1. General Works
Carroll, Peter. It Seemed Like Nothing Happened: America in the 1970's. New Brunswick, N.J.: Rutgers University Press, 1990. Carroll examines the 1970's and explains why the decade was more significant than it appeared at the time. This history of the previous decade provides background information for better understanding the events of the 1980's.
Johnson, Haynes. Sleepwalking Through History: America in the Reagan Years. New York: W. W. Norton, 1991. Johnson argues that the United States in the 1980's was lulled to sleep by Ronald Reagan, while the country's ills, particularly the gap between the rich and the poor, intensified. He condemns Reagan for weakening America's constitutional system and for lowering public concern about the nation's problems.

2. Politics and Politicians
Collins, Robert M. Transforming America: Politics and Culture During the Reagan Years. New York: Columbia University Press, 2006. While America had a growing pride in itself during Ronald Reagan's presidency, the country also became more divided. Collins examines the sources of both the pride and the division, focusing his explanation on Reagan. He also discusses other issues, including what he views as a divide between religious and secular forces. Includes photographs and a short bibliography.
Cook, Ramsay. The Teeth of Time: Remembering Pierre Elliott Trudeau. Montreal: McGill-Queen's University Press, 2006. Cook, a longtime friend and political supporter of Trudeau, reflects on the private life and public career of the former Canadian prime minister. Cook examines the political issues, including constitutional reform and nationalism, that Trudeau strove to address throughout his lifetime.
Craig, Barbara Hinkson, and David M. O'Brien. Abortion and American Politics. Chatham, N.J.: Chatham House, 1993. The authors chronicle the political controversy surrounding abortion rights, describing the interaction between states, the federal government, interest groups, the Supreme Court, Congress, and the president. The book covers the period between the Supreme Court rulings in Roe v. Wade (1973) and Planned Parenthood v. Casey (1992).
Shilts, Randy. And the Band Played On: People, Politics, and the AIDS Epidemic. New York: Penguin, 1988. This work examines the early history of the AIDS epidemic, from the disease's origin until 1984, when the United States began making serious efforts to combat the illness. Shilts argues that the conservative Reagan administration and the American public, including gay leaders, tried to sweep the epidemic under the rug, which resulted in a delay in treating the disease.
Tygiel, Jules. Ronald Reagan and the Triumph of American Conservatism. New York: Longman, 2006. This book explores how Ronald Reagan symbolized the accomplishments of the conservative political movement. Tygiel discusses Reagan's achievements, his failings, and the issues he failed to address.
3. Race
Carter, Dan T. From George Wallace to Newt Gingrich: Race in the Conservative Counterrevolution, 1963-1994. Baton Rouge: Louisiana State University Press, 1996. This study of the role of racial issues in American conservative politics includes valuable information about presidents Ronald Reagan and George H. W. Bush's responses to these issues.
Edsall, Thomas Byrne, with Mary D. Edsall. Chain Reaction: The Impact of Race, Rights, and Taxes on American Politics. New York: W. W. Norton, 1992. The Edsalls argue that the Democrats had become the minority political party by the early 1990's because of three factors: race, civil rights, and taxes. They focus on how these factors turned many working-class and middle-class white people against the Democrats, providing a detailed analysis of the presidential elections from 1964 to 1988.
Hacker, Andrew. Two Nations: Black and White, Separate, Hostile, Unequal. New York: Charles Scribner's Sons, 2003. Hacker does not buy into the argument that race no longer matters in America; he argues persuasively that this issue continues to have an impact upon the country. He cites statistics to demonstrate that racism is a major reason for the disparities in income and crime and other differences between black and white Americans.
Laham, Nicholas. The Reagan Presidency and the Politics of Race: In Pursuit of Colorblind Justice and Limited Government. Westport, Conn.: Praeger, 1998. Laham looks at racial issues during the Reagan presidency. He argues that Ronald Reagan's desire to reduce the enforcement of civil rights laws was motivated not by racism but by the president's desire to limit the role of government.
Nagel, Joane. American Indian Ethnic Renewal: Red Power and the Resurgence of Identity and Culture. New York: Oxford University Press, 1997. Nagel argues that Americans are now more willing to identify themselves as Native Americans or American Indians than they were in the past. She attributes this, in part, to the Civil Rights movement of the 1960's, but she identifies other reasons for the resurgence in Native American culture in the last half of the twentieth century.
Nightingale, Carl Husemoller. On the Edge: A History of Poor Black Children and Their American Dreams. New York: Basic Books, 1993. Nightingale argues that both left-wing and right-wing politicians misunderstand the causes of black poverty. He indicts the role of the media and Americans' conspicuous consumption as the reasons for this indigence.
Bibliography
■
1157
4. Supreme Court
Davis, Derek. Original Intent: Chief Justice Rehnquist and the Course of American Church-State Relations. Buffalo, N.Y.: Prometheus Books, 1991. Davis examines the Rehnquist Court during its initial years, focusing on its rulings regarding church-state relations. He argues that William H. Rehnquist endangered religious freedom by allowing state aid to religion. The book also includes a brief examination of the other justices who served on the Court in the early 1990's.
O'Connor, Sandra Day, and H. Alan Day. Lazy B: Growing Up on a Cattle Ranch in the American Southwest. New York: Random House, 2003. O'Connor, whom Ronald Reagan appointed to the Supreme Court, and Day, her brother, recall their youth on the Lazy B cattle ranch.
Schultz, David A., and Christopher E. Smith. The Jurisprudential Vision of Justice Antonin Scalia. Lanham, Md.: Rowman & Littlefield, 1996. Schultz and Smith examine the first ten years of Scalia's service on the Supreme Court. They argue that no one consistent vision has motivated Scalia and that Scalia's extreme right-wing nature has prevented him from forming coalitions or remaking the law as much as he could have. The authors do not accept Scalia's word that his opinions follow the direct writing of the Constitution, or what is known as textualism; they critique Scalia's opinions to demonstrate how they often diverge from a strictly textual interpretation.
Schwartz, Bernard. The Ascent of Pragmatism: The Burger Court in Action. Reading, Mass.: Addison-Wesley, 1990. Schwartz, who wrote a well-received book on Chief Justice Earl Warren, argues that the Burger Court was more a continuation of the Warren Court than a significant departure from it. He contends that Chief Justice Warren E. Burger was not the effective leader of the Court, as one might expect a chief justice to be, but that the Court was led by the more centrist justices. In addition to examining the Court as a whole, Schwartz also looks at the contribution of each justice.
Tushnet, Mark V. A Court Divided: The Rehnquist Court and the Future of Constitutional Law. New York: W. W. Norton, 2005. Tushnet describes the divisions within the Rehnquist Court, examining the views of the individual justices. He suggests that the centrists on the Court, especially Anthony Kennedy and Sandra Day O'Connor, played a significant role. While more focused on the 1990's, the book also devotes significant attention to the 1980's.
_______. Making Constitutional Law: Thurgood Marshall and the Supreme Court, 1961-1991. New York: Oxford University Press, 1997. Thurgood Marshall was one of the most liberal members of the Supreme Court in the 1980's, and Tushnet locates the source of Marshall's liberalism, describing his views on a variety of constitutional issues. The book also provides an overview of the Court and the justices who served with Marshall.
Van Sickel, Robert W. Not a Particularly Different Voice: The Jurisprudence of Sandra Day O'Connor. New York: Peter Lang, 1998. In 1981, O'Connor became the first woman appointed to the Supreme Court. This book argues that rather than providing a liberal break with tradition because of her gender, O'Connor continued to expound the Court's conservative and centrist philosophies. Van Sickel suggests that O'Connor's legal views and opinions were shaped not by her gender but by her tendency to follow precedent and to avoid constitutional questions wherever possible.
Yarbrough, Tinsley. The Rehnquist Court and the Constitution. New York: Oxford University Press, 2001. While the course of the Rehnquist Court was not always consistent, it had a huge influence upon America, beginning in the 1980's. Yarbrough tries to explain how and why the Court reached its decisions, noting that its opinions generally continued past precedent. However, the Court broke new ground on economic issues, striking down more regulations on business and commerce than its predecessors had.
5. Foreign Policy and Events
Andrew, Arthur. The Rise and Fall of a Middle Power: Canadian Diplomacy from King to Mulroney. Toronto: Lorimer, 1993. Andrew, a former Canadian ambassador, argues that Canadian power has been declining since its peak in the mid-1960's. He maintains that Canada could choose to reassert its power, but as of the early 1990's the country had become dependent on and subservient to the wishes of other nations, including the United States.
Bell, Coral. The Reagan Paradox: American Foreign Policy in the 1980's. New Brunswick, N.J.: Rutgers University Press, 1990. Bell describes the gap between Ronald Reagan's spoken goals for international relations and how he actually conducted foreign policy.
Farber, David. Taken Hostage: The Iran Hostage Crisis and America's First Encounter with Radical Islam. Princeton, N.J.: Princeton University Press, 2006. In his chronicle of the Iranian hostage crisis, Farber examines the rise of radical Islam and the impact of the crisis upon the United States. The effect of the crisis, Farber argues, cannot be explained without describing the general mood of melancholy that affected America in the 1970's.
Gaddis, John Lewis. The United States and the End of the Cold War: Implications, Reconsiderations, Provocations. New York: Oxford University Press, 1994. Gaddis examines the reasons for the end of the Cold War and reconsiders some of the significant figures in that struggle. He also describes the post-Cold War world, arguing that the end of the war did not mean the end of world conflict.
Garthoff, Raymond L. The Great Transition: American-Soviet Relations and the End of the Cold War. Washington, D.C.: Brookings Institution Press, 1994. This book analyzes U.S.-Soviet relations under presidents Ronald Reagan and George H. W. Bush. Garthoff argues that the United States was unnecessarily provocative toward the Soviet Union, and he downplays the United States' role in "winning" the Cold War.
Hahn, Peter L. Crisis and Crossfire: The United States and the Middle East Since 1945. Washington, D.C.: Potomac Books, 2005. This work provides a short introduction to the United States' policy in the Middle East during the last sixty years. In addition to discussing America's relations with Iraq, the book also examines the Israeli-Palestinian conflict and America's dependence upon the region's oil.
Karsh, Efraim. The Iran-Iraq War, 1980-1988. Oxford, England: Osprey, 2002. This brief history of the war describes the major battles and the tactics that Iran employed to repel Iraq. Includes a chronology, time line, and photographs.
Karsh, Efraim, and Inari Rautsi. Saddam Hussein: A Political Biography. New York: Grove Press, 2003. This biography of Hussein focuses on how he consolidated and maintained political power in Iraq, concluding with his role in the first Persian Gulf War. It also details his reasons for starting a war with Iran and invading Kuwait.
LeoGrande, William M. Our Own Backyard: The United States in Central America, 1977-1992. Chapel Hill: University of North Carolina Press, 2000. LeoGrande focuses on El Salvador and Nicaragua in his discussion of Central America, examining how these countries were shaped by the Cold War. He also discusses the United States' policy in Central America and how the Vietnam War affected America's involvement in that region.
North, Oliver. Under Fire: An American Story. New York: HarperCollins, 1992. Oliver North was heavily involved in the Iran-Contra affair, and in this autobiography he defends that operation and his participation in it.
Sick, Gary. All Fall Down: America's Tragic Encounter with Iran. Lincoln, Nebr.: iUniverse.com, 2001. Sick, a former aide who worked for the National Security Council and focused on Iran during the Carter administration, presents this blow-by-blow account of the Iran hostage crisis. He places the incident in context by reviewing American policy toward Iran before the crisis, including the activities of the Nixon administration.
_______. October Surprise: America's Hostages in Iran and the Election of Ronald Reagan. New York: Random House, 1991. Sick argues that Reagan supporters maneuvered the Iranians into waiting until after the November, 1980, elections to release the hostages, in return for arms and other concessions after Reagan was elected. The claims are controversial, but Sick's service on the National Security Council during the Ford, Carter, and Reagan administrations bolsters his claims.
Thornton, Richard C. The Reagan Revolution, I: The Politics of U.S. Foreign Policy. New York: Simon & Schuster, 1987. Thornton describes foreign policy in the early years of the Reagan administration, focusing on Ronald Reagan's opposition to the Soviet Union.
Walsh, Lawrence. Firewall: The Iran-Contra Conspiracy and Cover-Up. New York: W. W. Norton, 1998. Walsh, the independent prosecutor in the Iran-Contra investigation, maintains that President Ronald Reagan knew about the affair and that the cover-up protected Reagan from any legal ramifications.
Woodward, Bob. Veil: The Secret Wars of the CIA, 1981-1987. New York: Simon & Schuster, 2005. This work, by one of the reporters who broke the Watergate story, focuses on the Central Intelligence Agency (CIA) during the Reagan administration. It presents a picture of the CIA as dangerously active under the leadership of William Casey, and this picture embroiled the book in controversy. Woodward also maintains that the CIA was actively involved in the Iran-Contra affair.
6. Culture and Entertainment
Cullen, Jim, and Daniel Cullen. Born in the U.S.A.: Bruce Springsteen and the American Tradition. New York: HarperCollins, 1997. In the 1980's, many people used Bruce Springsteen's song "Born in the U.S.A." as a rallying cry for Americans. Cullen examines the roots of the song and many others written by Springsteen and places Springsteen within American musical and literary traditions.
Freiberger, Paul, and Michael Swaine. Fire in the Valley: The Making of the Personal Computer. Berkeley, Calif.: McGraw-Hill, 1999. This work chronicles the early days of Silicon Valley, beginning in the 1950's with the creation of large computers such as UNIVAC, and discusses the birth of the personal computer in the 1970's. Freiberger and Swaine focus on important figures in the computer's development, including Bill Gates and Douglas Engelbart.
Fuller, Linda K. The Cosby Show: Audiences, Impact, and Implications. Westport, Conn.: Greenwood Press, 1992. The Cosby Show was one of the most popular television programs of the 1980's, and Fuller examines the reasons for this popularity, along with the show's effect on the television marketplace and the nationwide image of African Americans. She also surveys people in other countries in order to demonstrate the show's worldwide impact.
Kallen, Stuart A. A Cultural History of the United States Through the Decades: The 1980's. San Diego, Calif.: Lucent Books, 1998. This overview of the United States during the 1980's not only examines popular culture but also discusses the Reagan Revolution and the Cold War. It features information about AIDS, music, women's rights, technology, science, and the start of the computer revolution.
Mansour, David. From Abba to Zoom: A Pop Culture Encyclopedia of the Late Twentieth Century. Kansas City, Mo.: Andrews McMeel, 2005. The encyclopedia contains thousands of entries regarding television, gadgets, fads, and music of the 1980's.
Rettenmund, Matthew. Totally Awesome 80's: A Lexicon of the Music, Videos, Movies, TV Shows, Stars, and Trends of That Decadent Decade. New York: St. Martin's Griffin, 1996. This work is more of a listing than an analysis, as Rettenmund provides a list of the best in each category included in his title. He also discusses the changing vocabulary and other trends, along with the popular culture figures of the period.
Wills, Garry. Under God: Religion and American Politics. New York: Simon & Schuster, 1990. Wills argues that America has always been a Christian nation and that the election of 1988 reflected this fact. He covers a wide variety of topics in this book, moving from the Scopes trial through the 1980's. While acknowledging the importance of religion, Wills also holds that the separation of church and state is good for both the state and the church.
7. Homosexuality
Rutledge, Leigh. The Gay Decades: From Stonewall to the Present. New York: Plume, 1992. As the title suggests, this book covers gay history in the United States from the 1969 Stonewall riot until the late 1980's. Rutledge goes beyond simply focusing on gays and lesbians to also discuss public figures and others who have shaped gay culture. As the author admits, this work focuses more on gay men than on lesbians.
Shawver, Lois. And the Flag Was Still There: Straight People, Gay People, and Sexuality in the U.S. Military. New York: Haworth Press, 1995. Although American homosexuals argued for greater civil rights in the 1980's and obtained some changes in the 1990's, the issue of gays in the military remains unresolved. Shawver examines the issue, arguing that the United States should drop its ban on homosexuals in the military. Shawver played a significant role in persuading Canada to drop its ban, and she brings that experience to bear in her book.

Scott A. Merriman
■ Web Sites

In selecting the following Web sites, efforts have been made to identify sites of broadest interest to readers and those most useful in providing additional links. Attention has also been given to representative examples of more specialized sites, such as pages on individual personages and events.
AIDS

In Their Own Words: NIH Researchers Recall the Early Years of AIDS
http://aidshistory.nih.gov/home.html
Created by the National Institutes of Health (NIH), this Web site recounts the AIDS epidemic of the 1980's from the government's perspective. It includes documents and images about the disease, along with a time line, oral history transcripts, and links to related sites.

RyanWhite.com
http://www.ryanwhite.com/index.html
Dedicated to Ryan White, the site includes an extensive discussion about the boy who was diagnosed with AIDS at the age of thirteen and who gained international recognition for his subsequent fight to remain in school. It features supplemental material about his mother's life and links to information about AIDS research and prevention.
Crises and Disasters

Avoiding Disaster: The Importance of Having a Disaster Plan
http://iml.jou.ufl.edu/projects/spring01/Hogue/index.html

The Exxon Valdez Oil Spill
http://iml.jou.ufl.edu/projects/spring01/Hogue/exxon.html

Johnson & Johnson's Tylenol
http://iml.jou.ufl.edu/projects/spring01/Hogue/tylenol.html

Designed as part of a 2001 University of Florida research project, these sites summarize the effects of several major corporate crises, including the Exxon Valdez oil spill and the Tylenol tampering scare, both of which occurred in the 1980's. These sites analyze how the corporations managed these crises and discuss the long-term public impact of these events.
BBC News: One Night in Bhopal
http://news.bbc.co.uk/1/hi/programmes/bhopal/default.stm
This page from the British Broadcasting Corporation (BBC) News site examines the 1984 Bhopal chemical spill—the worst industrial disaster in history. An estimated three thousand people died from a chemical leak at a factory owned by Union Carbide, an American company. The site includes video and audio coverage and reports the stories of survivors, some 500,000 of whom still suffer from aftereffects.
Government and Law

American Experience—Jimmy Carter
http://www.pbs.org/wgbh/amex/carter/
Created to accompany an episode of the Public Broadcasting Service's (PBS) program American Experience, this site provides information about Jimmy Carter's life before, during, and after his presidency, as well as a discussion of the episode and a teacher's guide. It includes a time line, photos of Carter and his family, and an in-depth examination of some of the issues of his presidency, including his attempts to bring peace to the Middle East and to end the Iranian hostage crisis.

Canadian Constitutional Documents: A Legal History
http://www.solon.org/Constitutions/Canada/English/index.html
The 1980's saw a great deal of constitutional change in Canada, not the least of which was the Canada Act of 1982, in which Britain relinquished the power to change Canada's laws, including its constitution. This site contains the text of the act, in both English and French, describes the previous legislation that the act amended, and summarizes the act's impact upon Canada.

The Canadian Encyclopedia
http://www.thecanadianencyclopedia.com/index.cfm?PgNm=Homepage&Params=A1
While this Web site is in no way limited to the 1980's, it provides a wealth of information about Canada during that decade. It features articles written by a variety of individuals, including academics, about Canadian history, and a search engine to enable users to retrieve information.

Famous Trials
http://www.umkc.edu/famoustrials
This Web site covers fifty famous trials, ranging in time from 339 b.c.e. to 2006. It includes two American trials from the 1980's: that of John Hinckley, Jr., who attempted to assassinate President Ronald Reagan, and that of the McMartin Preschool teachers charged with sexually abusing students.

From Cheers to Jeers: The Mulroney Years
http://archives.cbc.ca/IDD-1-73-1469/politics_economy/brian_mulroney/
This page is part of the Prime Ministers' Gallery created by the Canadian Broadcasting Corporation (CBC). It focuses on former Prime Minister Brian Mulroney, including video and audio clips about him, as well as teacher lesson plans. The page chronicles how Mulroney was at first highly regarded but later left office in disgrace.

Ronald Reagan Presidential Library
http://www.reagan.utexas.edu
This Web site is a good place for both serious researchers and more casual learners to retrieve information about Reagan. It explains how one can gain access to Reagan's presidential papers and features biographies of Reagan and his wife Nancy.

Ten Greatest Canadians—Pierre Elliott Trudeau
http://www.cbc.ca/greatest/top_ten/nominee/trudeaupierre.html
Trudeau is frequently called one of the greatest prime ministers, and the Canadian Broadcasting Corporation (CBC) named him one of the ten greatest Canadians for his charisma, flair, and impact upon Canada. This page from a CBC Web site discusses his accomplishments, failures, and brilliant return to politics in the early 1980's. It includes a time line and links to multimedia materials.
United States Senate Committee on the Judiciary Hearing 99-1067: Hearings Before the Senate Committee on the Judiciary on the Nomination of Justice William Hubbs Rehnquist to be Chief Justice of the United States, July 29, 30, 31, and August 1, 1986
http://www.gpoaccess.gov/congress/senate/judiciary/sh99-1067/browse.html
The complete text of the 1986 hearings in which the Senate Judiciary Committee considered President Ronald Reagan's promotion of William H. Rehnquist from associate to chief justice of the Supreme Court. This site is useful for students researching legal history in the 1980's, as Rehnquist's role as chief justice helped move the Court in a more conservative direction.

Walter Mondale
http://www.spartacus.schoolnet.co.uk/USAmondale.htm
Created by Spartacus Educational, a British organization that designs Web sites for history instruction, this page focuses on Mondale, the unsuccessful presidential candidate in 1984, who also served as Jimmy Carter's vice president. Unlike some treatments of Mondale, this biography covers his life before he became vice president and after his 1984 presidential run.
Military

The Invasion of Grenada
http://www.historyguy.com/Grenada.html
This Web site looks at the United States' invasion of Grenada in 1983. It examines the background, causes, consequences, and casualties of the conflict, along with links to related sites.

Lessons Learned: Iran-Iraq War
http://www.fas.org/man/dod-101/ops/war/docs/3203/
This document, compiled by the U.S. Marine Corps in 1990, analyzes the Iran-Iraq war, which was fought from 1980 to 1988 and in which the United States backed the Iraqi government of Saddam Hussein. The instability from this war prompted Hussein to later invade Kuwait and significantly shaped subsequent events in the region. In addition to an overview of the conflict, there are sections on strategy, tactics and operations, and chemical weapons, among other features.
Marine Attack in Lebanon
http://www.nytimes.com/learning/general/onthisday/991023onthisday_big.html#headlines
A reproduction of the front page of The New York Times for October 24, 1983, containing a report of the previous day's bombing of an American Marine Corps barracks in Beirut, Lebanon, where American forces were on a peacekeeping mission. The attack killed 241 American service personnel.

"The Panama Invasion Revisited: Lessons for the Use of Force in the Post Cold War Era"
http://www.mtholyoke.edu/acad/intrel/gilboa.htm
This Web site, hosted by Mount Holyoke College, contains an article from the Political Science Quarterly examining the United States' 1989 invasion of Panama. The article focuses on then-Panamanian leader Manuel Noriega, who was captured in the invasion, brought to the United States, and tried on drug-trafficking charges.

Saddam Hussein
http://topics.nytimes.com/top/reference/timestopics/people/h/saddam_hussein/index.html?inline=nyt-per
This Web site from The New York Times surveys the life, reign, and death of Saddam Hussein, former dictator of Iraq. In the 1980's, Hussein was supported by the United States, which later went to war to overthrow his regime. The Web site chronicles Saddam's rise, leadership of Iraq, and fall from power and includes a time line, links to related articles, and photographs.
Pop Culture and Entertainment

AllMusic
http://www.allmusic.com
A compendium of material on all types of music, this site contains detailed information on many musicians and bands from the 1980's. It includes biographies, discographies, and lists of musicians who influenced each performer and band.

The Amazing 80's: 1980-1989
http://library.thinkquest.org/J0111064/80home.htm
A generic site on the 1980's, featuring information on entertainment, events, fads and fashions, inventions, and sports heroes. Though each section contains only a short list of items or individuals, this is a good beginning for further research.
BBC-Cult—I Love the 80's
http://www.bbc.co.uk/cult/ilove/years/80sindex.shtml
The British Broadcasting Corporation (BBC) hosts this site dedicated to the 1980's. It is organized by year, with each year featuring information about television, games, and toys. For those who think they really know the 1980's, a quiz is included, and there also are trivia questions and a photo gallery.

Big Hair Metal and Glam Rock
http://www.bighairmetal.com/
One of the biggest rock music fads of the 1980's was heavy metal, and bands performing this type of music were famous for having "big hair." This site features pages of information about Van Halen, Mötley Crüe, Pat Benatar, and other big hair metal bands and performers.

The Eighties Club: The Politics and Pop Culture of the 1980's
http://eightiesclub.tripod.com/index.htm
This is an in-depth look at the 1980's, with some material on the politics and headlines of the period. It includes a good deal of information on Ronald Reagan, with reprints of many of his speeches and numerous essays about him. It also contains sections on films and music, an encyclopedia of the 1980's, and a time line.

80's Memories, Fads, and Events
http://www.tripletsandus.com/80's/memories.htm
This site combines information, nostalgia, and fun. It offers users a chance to play vintage 1980's video games online, including Frogger and Space Invaders, discusses why Generation X is a bad name for the generation that grew up in the 1980's, and includes a list of slang and fads from the decade.

80's Movies Rewind
http://www.fast-rewind.com/
This Web site covers hundreds of movies from the 1980's, ranging from well-known films to cult classics. It is primarily created by film fans, is frequently updated, and includes links to other sites.

The 80's Server
http://www.80s.com/default.html
Unlike other sites that deal with 1980's pop culture, The 80's Server provides links to sites that sell period memorabilia, including music. It also links to several places where users can play games related to the 1980's, as well as other sites about music, films, television shows, and sporting events of the era.
Film History of the 1980's
www.filmsite.org/80sintro.html
As the title suggests, this site contains a list of popular films, including a section on Academy Award winners, from the 1980's. Though the list is not comprehensive, it offers an idea of what was popular at the time. Some films have links to reviews.

Wide World of Sports Highlights—1980's
http://espn.go.com/abcsports/wwos/milestones/1980s.html
This Web site lists a large number of milestones and highlights in 1980's sports that were covered on the television program Wide World of Sports. The sports covered range from hockey to women's volleyball, skiing, and gymnastics.

Religion

Defining Evangelicalism
www.wheaton.edu/isae/defining_evangelicalism.html
A history of the evangelical movement in the United States, including an explanation of how the term "evangelical movement" was used in the 1980's. It also contains a discussion of the movement's interaction with politics beginning in the 1980's.

The Political Mobilization of the New Christian Right
http://are.as.wvu.edu/lebeau1.htm
Authored by Bryan LeBeau, chair of Creighton University's history department, this Web site discusses the right-wing religious movement, with analysis of the conservative Christian political mobilization that began in the 1980's. It also discusses the scandals that reduced the movement's power.

Soviet Union

Cold War
http://www.cnn.com/SPECIALS/cold.war/
This Web site accompanied the Cable News Network's (CNN) special series on the Cold War. The site provides historical perspective about the war's origins and its end in the late 1980's. Its list of declassified documents will be of particular interest to the more serious student of history.

Cold War International History Project
http://wilsoncenter.org/index.cfm?topic_id=1409&fuseaction=topics.home
This project, administered by the Woodrow Wilson International Center for Scholars, provides source documents, particularly those from the Soviets, about the Cold War. It also features links to information about relevant events, documents, publications, and other information about the conflict.

Perestroika and the Soviet Military: Implications for U.S. Policy
http://www.cato.org/pubs/pas/pa133.html
Perestroika was the restructuring undertaken by the Soviet Union during the 1980's in a last-ditch attempt to save the nation's economy. This article, written in 1990 by a retired American military official before the collapse of the Soviet Union, examines perestroika's impact on American foreign policy; it also suggests what the United States should do to maximize its position in the world.

Scott A. Merriman
■ Glossary

This list is a representative collection of words and phrases that were either first used or gained prominence during the 1980's in the United States. (n. = noun; adj. = adjective; adv. = adverb; v. = verb; exp. = expression)

AIDS, n. The acronym for the disease acquired immunodeficiency syndrome.
airhead, n. An ignorant or foolish individual.
amped, adj. Excited.
anorexia nervosa, n. A serious eating disorder that primarily afflicts teenage girls and young women and involves self-starvation.
bag, v. To take or steal something.
barf bag, n. A very offensive person.
Barney, n. A male who is considered unattractive.
Beemer, n. A BMW automobile.
bimbette, n. A female who is considered either stupid or sexually promiscuous.
Bite me, exp. An expression meaning "Kiss my ass."
Black Monday, n. The day in October, 1987, when the stock market plummeted more than five hundred points.
bling, n. Luxurious items such as jewelry.
blood, n. Friendship.
blow off, v. To let pass or to skip.
bodacious, adj. Attractive.
boho, n. Someone who is considered to be an artist or a bohemian.
bohunk, n. A very dumb athlete.
boink, v. To have sex.
bone, v. To engage in sexual activity.
boogie, v. To leave.
boom box, n. A portable stereo player.
boy toy, n. A good-looking male who is under the control of an older woman.
bring-down, n. Sad news.
bro, n. A male relative or friend with whom someone has established a close bond.
bulimia, n. A serious eating disorder that afflicts primarily teenage girls and young women and involves the gorging and purging of food.
bunk, adj. Nonsense or insincere bluster.
bust rocks, v. To exert oneself.
butt ugly, adj. Physically unattractive.
Cabbage Patch Kids, n. An extremely popular brand of doll created by Xavier Roberts and manufactured by Coleco.
camcorder, n. A portable video recording device.
cap, n. A bullet.
Care Bears, n. A popular set of characters created by American Greetings that appeared on greeting cards, on television, in feature films, and as stuffed teddy bears.
CD, n. The acronym for compact disc.
CD-ROM, n. The acronym for compact disc read-only memory.
cheese, n. Money.
cheesy, adj. Shabby or clichéd.
choice, adj. Very good.
chopper, n. A motorcycle that has been customized.
clock, v. To punch.
clydesdale, n. A large all-American male.
CNN, n. The acronym for Cable News Network.
cordless telephone, n. A portable telephone that works through the use of radio waves.
couch potato, n. Someone who spends a great deal of time watching television.
crack, n. A very addictive form of cocaine.
crackass, adj. Something that is considered to be cheap.
crib, n. The place where someone lives.
crusty, adj. Filthy.
cyberpunk, n. A popular science-fiction genre that primarily depicts alienated characters living in a world of high technology.
dead presidents, n. Money.
deep shit, n. A lot of trouble.
deke, v. To fake a person out.
dickhead, n. Someone who is considered to be a fool.
dinero, n. Money.
dink, n. The acronym for dual income, no kids.
dipstick, n. Someone who is considered to be a fool.
dis, v. To say something offensive or disrespectful to someone.
disposable cameras, n. Inexpensive cameras made to be used only once.
do lunch, v. To make an appointment to have lunch.
Donkey Kong, n. A popular video game released by Nintendo.
Don't have a cow, exp. To ask someone to stay calm.
dork, n. Someone who does socially inappropriate things.
douchebag, n. A reprehensible person.
downsize, v. To lay off employees.
duh, exp. That is dumb.
dump on, v. To criticize someone.
dweeb, n. A person who is socially inept.
earthbound, adj. Traditional or old-fashioned.
Eat my shorts, exp. A retort to criticism.
ecstasy, n. The street name for the psychedelic drug methylenedioxymethamphetamine, or MDMA.
ESPN, n. The acronym for Entertainment and Sports Programming Network.
evil empire, n. A term used by President Ronald Reagan to describe the Soviet Union.
family values, n. The so-called moral compass of a society, often used to promote a conservative or traditional ideology.
Farm Aid, n. A charity concert held for the benefit of American farmers.
Five-O, n. A police officer.
for sure or fer sure, exp. Of course.
fresh, adj. Something very new.
frosted, adj. Mad or angry.
Gag me with a spoon, exp. That is disgusting.
gangbanger, n. An active member of a street gang.
gear, n. Clothes.
geek, n. A person who is not cool or socially popular.
Generation X, n. A generation that began to have influence on popular culture during the 1980's.
genetic fingerprinting, n. The ability to distinguish one individual from another through the scientific testing of the individuals' deoxyribonucleic acid (DNA).
Get bent, exp. Go away.
Get with the program, exp. To do what is considered proper.
glasnost, n. A Russian term employed by Soviet leader Mikhail Gorbachev that means "openness."
go postal, v. To go crazy.
Great Communicator, n. Used as a term of respect for the communication skills of President Ronald Reagan.
grody to the max, adj. Very disgusting.
hacker, n. A person who is very good at using computers.
hair band, n. A hard rock band in which the members of the band have long frizzy hair.
hardcore, n. A loud, rebellious, and aggressive form of punk music.
have a cow, v. To get upset.
hellacious, adj. Stupendous.
high-five, v. To slap the palm of someone else with your palm when it is stretched above the head.
HIV, n. The acronym for human immunodeficiency virus.
homeboy, n. A good friend.
hoops, n. Basketball.
hoser, n. A person who is considered irritating or without any redeeming qualities.
house, n. A type of dance music played at warehouse parties.
I'm so sure, exp. A statement of sarcastic disbelief.
in a New York minute, adv. Happening with speed.
infomercial, n. A television informational program that is really no more than a lengthy paid advertisement for a product.
issues, n. Problems.
JAP, n. The acronym for Jewish American Princess.
Jarvik-7, n. An artificial heart designed by Dr. Robert K. Jarvik.
joanie, n. A girl who is considered to be awkward or boring.
Just Say No, exp. A slogan initiated by First Lady Nancy Reagan to encourage individuals to avoid getting involved with illegal drugs.
kegger, n. A beer party.
kickass, adj. Outstanding.
killer, adj. Outstanding.
Kiss my grits, exp. Another way of saying "kiss my ass."
kryptonite, n. A definite weakness.
lame, adj. Of disappointing quality.
laptop, n. A portable personal computer.
like, exp. Used as an interjection, "like" serves as a meaningless word when repeated many times in casual speech.
liposuction, n. A surgical procedure by which fat is sucked out of a patient.
Live Aid, n. A charity rock concert held for the purpose of raising money for famine-ravaged Ethiopia.
Macintosh, n. A personal computer manufactured by Apple.
major, adj. Extremely good.
Make me barf, exp. An expression of displeasure or disgust.
Make my day, exp. A dare inviting someone to go ahead and act, with the understanding that they will then suffer the consequences.
mall chick, n. A girl who likes to spend many hours at the mall.
max out, v. To take something to its limit.
Me Generation, n. A generation preoccupied with the self.
Moonies, n. The religious followers of the Reverend Sun Myung Moon.
moonwalk, n. A backsliding dance move made popular by Michael Jackson.
MS-DOS, n. The acronym for Microsoft disk operating system.
MTV, n. The acronym for Music Television.
neoexpressionism, n. An art movement that promoted bold figurative painting.
new wave, n. A music genre that emphasized a pop sound through the use of synthesizers.
no way, exp. Definitely not going to happen.
nuke, v. To heat something in a microwave oven.
NutraSweet, n. A popular artificial sweetener.
oink, n. The acronym for one income, no kids.
out to lunch, adj. Crazy.
Pac-Man, n. A popular video game.
PC, n. The acronym for both personal computer and political correctness.
perestroika, n. A Russian term associated with Soviet leader Mikhail Gorbachev that means "restructuring."
perp, n. A criminal.
PMRC, n. The acronym for Parents Music Resource Center.
poser, n. Someone pretending to be something they are not.
pound, v. To drink rapidly or to excess.
preppie, n. Someone who dresses and acts in the style of preparatory school students.
Prozac, n. An antidepressant drug prescribed for a variety of psychological disorders.
ralph, v. To vomit.
Reaganomics, n. The term used to describe President Ronald Reagan's free-market economic policies.
real, adj. Excellent.
Rogaine, n. The brand name for minoxidil, a topical drug used to treat male pattern baldness.
rollerblades, n. A style of roller skates with the wheels in a single row, also known as in-line skates.
RU-486, n. The commonly used name for the synthetic compound mifepristone, used for the medical termination of early intrauterine pregnancy.
Rubik's cube, n. A popular mechanical puzzle.
rule, v. To be amazing.
rush, n. A stimulating experience.
Rust Belt, n. A term for the manufacturing region of the United States, so called because so many of its factories went out of business.
safe sex, n. Practices employed by persons engaging in sexual activity to protect themselves against sexually transmitted diseases (STDs).
scope, v. To examine closely.
scumbag, n. A disgusting person.
Shit happens, exp. Bad things and problems are an inevitable part of life.
Shop 'til you drop, exp. An exhortation to shop until one is physically exhausted.
ska, n. A Jamaican musical style that predated and strongly influenced reggae.
slam dancing, n. A dance form in which participants deliberately crash into one another.
space cadet, n. Someone who is out of touch with reality.
space shuttle, n. A reusable spacecraft used to transport people into space.
Speedos, n. Men's swimming trunks that fit snugly.
stagflation, n. The combination of stagnant economic growth and rising prices.
Star Wars, n. The common name for the Strategic Defense Initiative (SDI) proposed by President Ronald Reagan.
stoked, adj. Excited.
suck face, v. To French kiss.
sucks, v. To be bad or inferior.
sweet, adj. Very nice.
ta-tas, n. Female breasts.
to the max, adv. To the limit.
toasted, adj. Drunk.
toss your cookies, v. To vomit.
totally awesome, adj. Amazing.
Trivial Pursuit, n. A popular board game in which participants must answer popular culture and general knowledge questions.
tubular, adj. Excellent.
USFL, n. The acronym for United States Football League.
UVs, n. An abbreviation for the ultraviolet rays of the sun.
Valley girl, n. A rich teenage girl from the San Fernando Valley section of Los Angeles.
veg, v. To take it easy.
Walkman, n. A popular portable audio player manufactured by Sony.
wannabe, n. A person who has ambitions of being like someone else.
warped, adj. Disturbing or very weird.
whack, v. To kill.
Whatever, exp. It is of no consequence.
Where's the beef?, exp. A hamburger advertising slogan that was also used in the political arena to ask, "Where's the substance?"
wicked, adj. Fantastic.
wig out, v. To lose control of one's emotions.
wigged, adj. Insane.
Windows, n. A hugely popular computer operating system created by Microsoft.
yes way, exp. An affirmative retort to "no way."
Yo, exp. Hello.
You're toast, exp. You're in a lot of trouble.
yuppie, n. The acronym for young urban professional or young upwardly mobile professional.
zapper, n. A television remote control.
zeek, n. An extremely uncool person.
zip it, v. To shut up.
Jeffry Jensen
■ List of Entries by Category
Subject Headings Used in List
African Americans . . . 1169
Art & Architecture . . . 1169
Asian Americans . . . 1169
Business . . . 1170
Canada . . . 1170
Court Cases & the Law . . . 1170
Crime & Punishment . . . 1170
Disasters . . . 1171
Economics . . . 1171
Education . . . 1171
Environmental Issues . . . 1171
Film . . . 1171
Health & Medicine . . . 1172
International Relations . . . 1172
Journalism . . . 1172
Latinos . . . 1172
Legislation . . . 1172
Literature . . . 1173
Military & War . . . 1173
Music . . . 1173
Native Americans . . . 1173
People . . . 1173
Politics & Government . . . 1175
Popular Culture . . . 1175
Religion & Spirituality . . . 1176
Science & Technology . . . 1176
Sexuality . . . 1176
Social Issues . . . 1176
Sports . . . 1177
Television . . . 1177
Terrorism . . . 1178
Theater & Dance . . . 1178
Transportation . . . 1178
Women's Issues . . . 1178

African Americans
Affirmative action
Africa and the United States
African Americans
Atlanta child murders
Basquiat, Jean-Michel
Beloved
Brawley, Tawana
Break dancing
Central Park jogger case
Color Purple, The
Cosby Show, The
Crack epidemic
Do the Right Thing
Gangs
Griffith-Joyner, Florence
Hawkins, Yusef
Hip-hop and rap
Holmes, Larry
Horton, William
Houston, Whitney
Howard Beach incident
Jackson, Bo
Jackson, Jesse
Jackson, Michael
Jazz
Johnson, Magic
Kincaid, Jamaica
Leonard, Sugar Ray
Lewis, Carl
Martin Luther King Day
Miami Riot of 1980
MOVE
Mr. T
Multiculturalism in education
Murphy, Eddie
Nation of Yahweh
Naylor, Gloria
Prince
Public Enemy
Racial discrimination
Rice, Jerry
Richie, Lionel
Run-D.M.C.
Taylor, Lawrence
Thomas, Isiah
Turner, Tina
Tyson, Mike
Washington, Harold
Welfare
Williams, Vanessa L.
Wilson, August
Winfrey, Oprah
World music

Art & Architecture
Architecture
Art movements
Basquiat, Jean-Michel
Deconstructivist architecture
Gehry, Frank
Gentrification
Neoexpressionism in painting
Pei, I. M.
Performance art
Photography
Rock and Roll Hall of Fame
Schnabel, Julian
SkyDome
Vietnam Veterans Memorial
Xanadu Houses

Asian Americans
Air India Flight 182 bombing
Asian Americans
Boat people
Hwang, David Henry
Immigration to Canada
Immigration to the United States
Japan and North America
Joy Luck Club, The
Martial arts
Minorities in Canada
Multiculturalism in education
Pei, I. M.
Racial discrimination
Stockton massacre
Vietnam Veterans Memorial
World music

Business
Advertising
Affirmative action
Age discrimination
Air traffic controllers' strike
Apple Computer
AT&T breakup
Baseball strike of 1981
Book publishing
Business and the economy in Canada
Business and the economy in the United States
Chrysler Corporation federal rescue
De Lorean, John
Fax machines
401(k) plans
Glass ceiling
Globalization
Iacocca, Lee
Income and wages in Canada
Income and wages in the United States
Meritor Savings Bank v. Vinson
Microsoft
Mommy track
Power dressing
Savings and loan (S&L) crisis
Scandals
Sexual harassment
Tamper-proof packaging
Turner, Ted
Ueberroth, Peter
Unions
Voicemail
Wall Street
Women in the workforce
Yuppies

Canada
Aboriginal rights in Canada
Adams, Bryan
Agriculture in Canada
Air India Flight 182 bombing
Bourassa, Robert
Business and the economy in Canada
Canada Act of 1982
Canada and the British Commonwealth
Canada and the United States
Canada Health Act of 1984
Canada-United States Free Trade Agreement
Canadian Caper
Canadian Charter of Rights and Freedoms
Chrétien, Jean
Davies, Robertson
Demographics of Canada
École Polytechnique massacre
Education in Canada
Elections in Canada
Europe and North America
Film in Canada
Foreign policy of Canada
Fox, Michael J.
Garneau, Marc
Gibson, William
Gimli Glider
Gretzky, Wayne
Handmaid's Tale, The
Harp seal hunting
Health care in Canada
Hockey
Immigration to Canada
Income and wages in Canada
Inflation in Canada
Japan and North America
Jennings, Peter
Lemieux, Mario
Lévesque, René
Literature in Canada
Meech Lake Accord
Middle East and North America
Minorities in Canada
Mulroney, Brian
National Anthem Act of 1980
National Energy Program (NEP)
Ocean Ranger oil rig disaster
Olson, Clifford
Quebec English sign ban
Quebec referendum of 1980
Religion and spirituality in Canada
Richler, Mordecai
Sauvé, Jeanne
Schreyer, Edward
Shamrock Summit
SkyDome
Soviet Union and North America
Toronto bathhouse raids of 1981
Trudeau, Pierre
Turner, John
Unemployment in Canada
Vancouver Expo '86

Court Cases & the Law
Bork, Robert H.
Bowers v. Hardwick
Hustler Magazine v. Falwell
Meritor Savings Bank v. Vinson
O'Connor, Sandra Day
People's Court, The
Rehnquist, William H.
Roberts v. United States Jaycees
Supreme Court decisions
Thompson v. Oklahoma
Tort reform movement
Webster v. Reproductive Health Services

Crime & Punishment
Air India Flight 182 bombing
America's Most Wanted
Atlanta child murders
Berg, Alan
Bonin, William
Brawley, Tawana
Central Park jogger case
Crime
Domestic violence
Dupont Plaza Hotel fire
École Polytechnique massacre
Gangs
Goetz, Bernhard
Goldmark murders
Hawkins, Yusef
Horton, William
Howard Beach incident
Lennon, John
Lucas, Henry Lee
McMartin Preschool trials
Missing and runaway children
Nation of Yahweh
New Mexico State Penitentiary Riot
Night Stalker case
Olson, Clifford
Organized crime
Pan Am Flight 103 bombing
People's Court, The
Post office shootings
Rape
Reagan assassination attempt
Rose, Pete
San Ysidro McDonald's massacre
Scandals
Sexual harassment
Stockton massacre
Terrorism
Tort reform movement
Tylenol murders
U.S. Senate bombing

Disasters
AIDS epidemic
Cerritos plane crash
Challenger disaster
Cold Sunday
Day After, The
El Niño
Exxon Valdez oil spill
Heat wave of 1980
Hurricane Hugo
Loma Prieta earthquake
MGM Grand Hotel fire
Mount St. Helens eruption
Natural disasters
Nuclear winter scenario
Ocean Ranger oil rig disaster
Sioux City plane crash
Times Beach dioxin scare
Twilight Zone accident
Yellowstone National Park fires

Economics
Black Monday stock market crash
Business and the economy in Canada
Business and the economy in the United States
Canada-United States Free Trade Agreement
Consumerism
Economic Recovery Tax Act of 1981
Farm Aid
Farm crisis
Food Security Act of 1985
401(k) plans
Gentrification
Globalization
Home shopping channels
Income and wages in Canada
Income and wages in the United States
Inflation in Canada
Inflation in the United States
Junk bonds
Military spending
National Energy Program (NEP)
Reaganomics
Recessions
Savings and loan (S&L) crisis
Social Security reform
Tax Reform Act of 1986
Unemployment in Canada
Unemployment in the United States
Welfare

Education
Affirmative action
Bennett, William
Closing of the American Mind, The
Drug Abuse Resistance Education (D.A.R.E.)
École Polytechnique massacre
Education in Canada
Education in the United States
Information age
Just Say No campaign
McMartin Preschool trials
Magnet schools
Mainstreaming in education
Multiculturalism in education
Nation at Risk, A
National Education Summit of 1989
Political correctness
School vouchers debate
Standards and accountability in education
Stockton massacre

Environmental Issues
Agriculture in Canada
Agriculture in the United States
Air pollution
Biopesticides
Cancer research
Cold Sunday
El Niño
Environmental movement
Exxon Valdez oil spill
Food Security Act of 1985
Harp seal hunting
Heat wave of 1980
Malathion spraying
Mount St. Helens eruption
Nuclear Waste Policy Act of 1983
Nuclear winter scenario
Ozone hole
Radon
Spotted owl controversy
Superfund program
Times Beach dioxin scare
Water pollution
Watt, James G.
Yellowstone National Park fires

Film
Academy Awards
Action films
Airplane!
Aliens
Back to the Future
Big Chill, The
Blade Runner
Blue Velvet
Brat Pack in acting
Breakfast Club, The
Bridges, Jeff
Broderick, Matthew
Cher
Close, Glenn
Colorization of black-and-white films
Comedians
Costner, Kevin
Cruise, Tom
Do the Right Thing
Douglas, Michael
Empire Strikes Back, The
Epic films
E.T.: The Extra-Terrestrial
Fast Times at Ridgemont High
Fatal Attraction
Film in Canada
Film in the United States
Flashdance
Ford, Harrison
Fox, Michael J.
Full Metal Jacket
Gere, Richard
Ghostbusters
Gibson, Mel
Hannah, Daryl
Heaven's Gate
Hoffman, Dustin
Horror films
Hughes, John
Hurt, William
Kiss of the Spider Woman
Last Temptation of Christ, The
Little Mermaid, The
Martin, Steve
Multiplex theaters
Murphy, Eddie
Murray, Bill
Nicholson, Jack
On Golden Pond
Ordinary People
PG-13 rating
Platoon
Raging Bull
Raiders of the Lost Ark
RoboCop
Schwarzenegger, Arnold
Science-fiction films
Scorsese, Martin
Sequels
sex, lies, and videotape
Shields, Brooke
Special effects
Spielberg, Steven
Stone, Oliver
Streep, Meryl
Teen films
Terminator, The
Terms of Endearment
This Is Spinal Tap
Tron
Turner, Kathleen
Twilight Zone accident
Wall Street
Weaver, Sigourney
When Harry Met Sally . . .
Who Framed Roger Rabbit
Williams, Robin
Winfrey, Oprah

Health & Medicine
Abortion
Aerobics
AIDS epidemic
Air pollution
Alternative medicine
Artificial heart
Aspartame
Baby Fae heart transplantation
Biopesticides
Caffeine
Canada Health Act of 1984
Cancer research
Diets
Environmental movement
Fetal medicine
Food trends
Genetics research
Health care in Canada
Health care in the United States
Health maintenance organizations (HMOs)
Hudson, Rock
Koop, C. Everett
Malathion spraying
Marathon of Hope
Medicine
Plastic surgery
Prozac
Radon
Simmons, Richard
Smoking and tobacco
Superfund program
Tamper-proof packaging
Times Beach dioxin scare
Transplantation
Water pollution
White, Ryan

International Relations
Africa and the United States
Anderson, Terry
Berlin Wall
Canada and the British Commonwealth
Canada and the United States
Canada-United States Free Trade Agreement
Canadian Caper
China and the United States
Cold War
Europe and North America
Foreign policy of Canada
Foreign policy of the United States
Globalization
Goodwill Games of 1986
Grenada invasion
Haig, Alexander
Intermediate-Range Nuclear Forces (INF) Treaty
Iran-Contra affair
Iranian hostage crisis
Israel and the United States
Japan and North America
Kirkpatrick, Jeane
Klinghoffer, Leon
Latin America
Libya bombing
Mariel boatlift
Mexico and the United States
Middle East and North America
Olympic boycotts
Panama invasion
Reagan Doctrine
Reagan's "Evil Empire" speech
Reykjavik Summit
Shamrock Summit
Shultz, George P.
Smith, Samantha
Soviet Union and North America
United Nations
USS Stark incident
USS Vincennes incident

Journalism
Brokaw, Tom
Cable television
CNN
Craft, Christine
Jennings, Peter
Journalism
Network anchors
Pauley, Jane
Rather, Dan
Rivera, Geraldo
Tabloid television
Television
USA Today

Latinos
Basquiat, Jean-Michel
Boat people
Dupont Plaza Hotel fire
Immigration Reform and Control Act of 1986
Immigration to the United States
Latin America
Latinos
Mariel boatlift
Mexico and the United States
Multiculturalism in education
Night Stalker case
Rivera, Geraldo
San Ysidro McDonald's massacre
Valenzuela, Fernando

Legislation
Affirmative action
Canada Act of 1982
Canada Health Act of 1984
Canadian Charter of Rights and Freedoms
Chrysler Corporation federal rescue
Congress, U.S.
Disability rights movement
Economic Recovery Tax Act of 1981
Food Security Act of 1985
Goldwater-Nichols Act of 1986
Immigration Reform and Control Act of 1986
Indian Gaming Regulatory Act of 1988
McKinney Homeless Assistance Act of 1987
National Anthem Act of 1980
National Energy Program
National Minimum Drinking Age Act of 1984
Nuclear Waste Policy Act of 1983
Quebec English sign ban
Tax Reform Act of 1986
Tort reform movement

Literature
Auel, Jean M.
Beattie, Ann
Beloved
Bonfire of the Vanities, The
Book publishing
Boyle, T. Coraghessan
Brat Pack in literature
Children's literature
Clancy, Tom
Closing of the American Mind, The
Color Purple, The
Confederacy of Dunces, A
Cyberpunk literature
Davies, Robertson
Dworkin, Andrea
Erdrich, Louise
Gibson, William
Handmaid's Tale, The
Heidi Chronicles, The
Henley, Beth
Hwang, David Henry
Irving, John
Joy Luck Club, The
Keillor, Garrison
Kincaid, Jamaica
King, Stephen
Literature in Canada
Literature in the United States
Ludlum, Robert
Mamet, David
Miller, Sue
Minimalist literature
Naylor, Gloria
Oates, Joyce Carol
Poetry
Richler, Mordecai
Shepard, Sam
Steel, Danielle
Theater
Third Wave, The
Torch Song Trilogy
Tyler, Anne
White Noise
Wilson, August

Military & War
Beirut bombings
Cold War
Full Metal Jacket
Goldwater-Nichols Act of 1986
Grenada invasion
Intermediate-Range Nuclear Forces (INF) Treaty
Iran-Contra affair
Libya bombing
M*A*S*H series finale
Military ban on homosexuals
Military spending
North, Oliver
Panama invasion
Platoon
Poindexter, John
Rambo
Stealth fighter
Strategic Defense Initiative (SDI)
Tower Commission
USS Stark incident
USS Vincennes incident
Vietnam Veterans Memorial
Weinberger, Caspar
West Berlin discotheque bombing

Music
Adams, Bryan
Blondie
Bon Jovi
Boy George and Culture Club
Broadway musicals
Cats
Cher
Classical music
Compact discs (CDs)
Country music
Devo
Duran Duran
Farm Aid
Flashdance
Glass, Philip
Go-Go's, The
Grant, Amy
Guns n' Roses
Heavy metal
Hip-hop and rap
Houston, Whitney
Jackson, Michael
Jazz
Journey
Lauper, Cyndi
Lennon, John
Little Mermaid, The
Live Aid
Madonna
Mellencamp, John Cougar
Michael, George
Mötley Crüe
MTV
Music
Music videos
New Wave music
Osbourne, Ozzy
Parental advisory stickers
Phantom of the Opera, The
Pop music
Prince
Public Enemy
R.E.M.
Richie, Lionel
Rock and Roll Hall of Fame
Run-D.M.C.
Springsteen, Bruce
Star Search
Sting
Synthesizers
Talking Heads
Teen singers
This Is Spinal Tap
Turner, Tina
US Festivals
USA for Africa
U2
Van Halen
Vangelis
Women in rock music
World music
Yankovic, Weird Al

Native Americans
Aboriginal rights in Canada
Cher
Erdrich, Louise
Harp seal hunting
Indian Gaming Regulatory Act of 1988
Minorities in Canada
Multiculturalism in education
Native Americans
Turner, Tina
World music

People
Adams, Bryan
Anderson, Terry
Atwater, Lee
Auel, Jean M.
Bakker, Jim and Tammy Faye
Basquiat, Jean-Michel
Beattie, Ann
Bennett, William
Bentsen, Lloyd
Berg, Alan
Bird, Larry
Boitano, Brian
Bonin, William
Bork, Robert H.
Bourassa, Robert
Boy George and Culture Club
Boyle, T. Coraghessan
Brawley, Tawana
Brett, George
Bridges, Jeff
Broderick, Matthew
Brokaw, Tom
Bush, George H. W.
Cher
Chrétien, Jean
Claiborne, Harry E.
Clancy, Tom
Close, Glenn
Costner, Kevin
Craft, Christine
Cruise, Tom
Davies, Robertson
Decker, Mary
De Lorean, John
Douglas, Michael
Dukakis, Michael
Dworkin, Andrea
Elway, John
Erdrich, Louise
Falwell, Jerry
Ferraro, Geraldine
Flynt, Larry
Ford, Harrison
Fox, Michael J.
Gallagher
Garneau, Marc
Gehry, Frank
Gere, Richard
Gibson, Kirk
Gibson, Mel
Gibson, William
Glass, Philip
Goetz, Bernhard
Grant, Amy
Gretzky, Wayne
Griffith-Joyner, Florence
Guns n' Roses
Haig, Alexander
Hannah, Daryl
Hart, Gary
Hawkins, Yusef
Henley, Beth
Herman, Pee-Wee
Hershiser, Orel
Hoffman, Dustin
Holmes, Larry
Horton, William
Houston, Whitney
Hubbard, L. Ron
Hudson, Rock
Hughes, John
Hurt, William
Hwang, David Henry
Iacocca, Lee
Irving, John
Jackson, Bo
Jackson, Jesse
Jackson, Michael
Jennings, Peter
Johnson, Magic
Keillor, Garrison
Kincaid, Jamaica
King, Stephen
Kirkpatrick, Jeane
Klinghoffer, Leon
Koop, C. Everett
LaRouche, Lyndon
Lauper, Cyndi
Lemieux, Mario
LeMond, Greg
Lennon, John
Leonard, Sugar Ray
Letterman, David
Lévesque, René
Lewis, Carl
Louganis, Greg
Lucas, Henry Lee
Ludlum, Robert
McEnroe, John
Madonna
Mamet, David
Martin, Steve
Meese, Edwin, III
Mellencamp, John Cougar
Michael, George
Miller, Sue
Mondale, Walter
Montana, Joe
Mr. T
Mulroney, Brian
Murphy, Eddie
Murray, Bill
Navratilova, Martina
Naylor, Gloria
Nicholson, Jack
North, Oliver
Oates, Joyce Carol
O'Connor, Sandra Day
Olson, Clifford
O'Neill, Tip
Osbourne, Ozzy
Pauley, Jane
Pei, I. M.
Peller, Clara
Poindexter, John
Prince
Quayle, Dan
Rather, Dan
Reagan, Nancy
Reagan, Ronald
Regan, Donald
Rehnquist, William H.
Retton, Mary Lou
Rice, Jerry
Richie, Lionel
Richler, Mordecai
Ride, Sally
Rivera, Geraldo
Robertson, Pat
Rose, Pete
Ryan, Nolan
Sauvé, Jeanne
Schnabel, Julian
Schreyer, Edward
Schroeder, Pat
Schwarzenegger, Arnold
Scorsese, Martin
Shepard, Sam
Shields, Brooke
Shultz, George P.
Simmons, Richard
Smith, Samantha
Spielberg, Steven
Springsteen, Bruce
Steel, Danielle
Sting
Stone, Oliver
Streep, Meryl
Sununu, John H.
Swaggart, Jimmy
Taylor, Lawrence
Thomas, Isiah
Trudeau, Pierre
Turner, John
Turner, Kathleen
Turner, Ted
Turner, Tina
Tyler, Anne
Tyson, Mike
Ueberroth, Peter
Valenzuela, Fernando
Vangelis
Washington, Harold
Watson, Tom
Watt, James G.
Weaver, Sigourney
Weinberger, Caspar
White, Ryan
Williams, Robin
Williams, Vanessa L.
Wilson, August
Winfrey, Oprah
Wright, Jim
Yankovic, Weird Al

Politics & Government
Abscam
Atwater, Lee
Bennett, William
Bentsen, Lloyd
Bork, Robert H.
Bourassa, Robert
Bush, George H. W.
Canada Act of 1982
Canada-United States Free Trade Agreement
Canadian Charter of Rights and Freedoms
Chrétien, Jean
Claiborne, Harry E.
Conch Republic
Congressional page sex scandal of 1983
Conservatism in U.S. politics
Dukakis, Michael
Elections in Canada
Elections in the United States, midterm
Elections in the United States, 1980
Elections in the United States, 1984
Elections in the United States, 1988
Ferraro, Geraldine
Flag burning
Food Security Act of 1985
Gender gap in voting
Haig, Alexander
Hart, Gary
Horton, William
Iran-Contra affair
Jackson, Jesse
Just Say No campaign
Kirkpatrick, Jeane
Koop, C. Everett
LaRouche, Lyndon
Lévesque, René
Liberalism in U.S. politics
Meech Lake Accord
Meese, Edwin, III
Mondale, Walter
Moral Majority
Mulroney, Brian
National Anthem Act of 1980
National Energy Program
North, Oliver
O'Connor, Sandra Day
O'Neill, Tip
Poindexter, John
Quayle, Dan
Quebec English sign ban
Quebec referendum of 1980
Reagan, Nancy
Reagan, Ronald
Reagan assassination attempt
Reagan Democrats
Reagan Revolution
Regan, Donald
Rehnquist, William H.
Robertson, Pat
Sauvé, Jeanne
Scandals
Schreyer, Edward
Schroeder, Pat
Shultz, George P.
Social Security reform
Spotted owl controversy
Statue of Liberty restoration and centennial
Sununu, John H.
Supreme Court decisions
Tanner '88
Tower Commission
Trudeau, Pierre
Turner, John
U.S. Senate bombing
Washington, Harold
Weinberger, Caspar
Wright, Jim

Popular Culture
Aerobics
Airplane!
Auel, Jean M.
Baby Jessica rescue
Back to the Future
Bloom County
Brat Pack in acting
Brat Pack in literature
Break dancing
Breakfast Club, The
Cabbage Patch Kids
Comedians
Comic strips
Dallas
Dance, popular
Diets
Dynasty
Empire Strikes Back, The
E.T.: The Extra-Terrestrial
Fads
Fashions and clothing
Fast Times at Ridgemont High
Flashdance
Food trends
Gallagher
General Hospital
Generation X
Ghostbusters
Hairstyles
Hands Across America
Herman, Pee-Wee
Hip-hop and rap
Hobbies and recreation
Home shopping channels
Home video rentals
Infomercials
Keillor, Garrison
King, Stephen
Knoxville World's Fair
Leg warmers
Louisiana World Exposition
Married . . . with Children
Martial arts
Max Headroom
Miami Vice
Mommy track
Mr. T
MTV
Mullet
Music
New Coke
Pac-Man
Peller, Clara
Photography
Pop music
Power dressing
Preppies
Raiders of the Lost Ark
Rambo
Simmons, Richard
Slang and slogans
Star Search
Starbucks
Steel, Danielle
Tabloid television
Teen singers
Television
Toys and games
Trivial Pursuit
Valley girls
Vancouver Expo '86
Video games and arcades
Wave, the
When Harry Met Sally . . .
World Wrestling Federation
Yankovic, Weird Al
Yuppies

Religion & Spirituality
Bakker, Jim and Tammy Faye
Evangelical Lutheran Church in America
Falwell, Jerry
Goldmark murders
Grant, Amy
Heritage USA
Hubbard, L. Ron
Hustler Magazine v. Falwell
Jewish Americans
Last Temptation of Christ, The
Moral Majority
Nation of Yahweh
Religion and spirituality in Canada
Religion and spirituality in the United States
Robertson, Pat
Swaggart, Jimmy
Televangelism

Science & Technology
Apple Computer
Archaeology
Artificial heart
Astronomy
Bioengineering
Biopesticides
CAD/CAM technology
Camcorders
Cancer research
Car alarms
Cell phones
Challenger disaster
Colorization of black-and-white films
Compact discs
Computers
Cosmos
Cyberpunk literature
Disposable cameras
DNA fingerprinting
Doppler radar
Fax machines
Fetal medicine
Garneau, Marc
Genetics research
Halley's comet
Information age
Inventions
Medicine
Microsoft
Nobel Prizes
Nuclear winter scenario
Ozone hole
Pac-Man
Ride, Sally
Robots
Science and technology
Science-fiction films
SETI Institute
Space exploration
Space shuttle program
Star Trek: The Next Generation
Superconductors
Synthesizers
Tamper-proof packaging
Third Wave, The
Titanic wreck discovery
Tron
Video games and arcades
Virtual reality
Voicemail
Voyager global flight
Who Framed Roger Rabbit

Sexuality
ACT UP
AIDS epidemic
Androgyny
Bowers v. Hardwick
Congressional page sex scandal of 1983
Fatal Attraction
Flynt, Larry
Handmaid's Tale, The
Homosexuality and gay rights
Hudson, Rock
Kiss of the Spider Woman
Louganis, Greg
Military ban on homosexuals
Pornography
Rape
Sexual harassment
Shields, Brooke
Swaggart, Jimmy
Torch Song Trilogy
Toronto bathhouse raids of 1981
Williams, Vanessa L.

Social Issues
Aboriginal rights in Canada
Abortion
ACT UP
Affirmative action
African Americans
Age discrimination
AIDS epidemic
AIDS Memorial Quilt
Asian Americans
Berg, Alan
Biological clock
Boat people
Bonfire of the Vanities, The
Bowers v. Hardwick
Brawley, Tawana
Central Park jogger case
Comic Relief
Consumerism
Crack epidemic
Craft, Christine
Demographics of Canada
Demographics of the United States
Disability rights movement
Do the Right Thing
Domestic violence
Drug Abuse Resistance Education (D.A.R.E.)
Dworkin, Andrea
École Polytechnique massacre
Farm Aid
Farm crisis
Feminism
Flag burning
Flynt, Larry
Gallaudet University protests
Gangs
Generation X
Gentrification
Glass ceiling
Goetz, Bernhard
Goldmark murders
Handmaid's Tale, The
Hawkins, Yusef
Homelessness
Homosexuality and gay rights
Horton, William
Howard Beach incident
Hustler Magazine v. Falwell
Immigration Reform and Control Act of 1986
Immigration to Canada
Immigration to the United States
Indian Gaming Regulatory Act of 1988
Jewish Americans
Just Say No campaign
Latinos
Live Aid
McKinney Homeless Assistance Act of 1987
Marriage and divorce
Martin Luther King Day
Meritor Savings Bank v. Vinson
Miami Riot of 1980
Military ban on homosexuals
Minorities in Canada
Missing and runaway children
Mommy track
Mothers Against Drunk Driving (MADD)
MOVE
Nation of Yahweh
National Minimum Drinking Age Act of 1984
Native Americans
Political correctness
Pornography
Prozac
Psychology
Racial discrimination
Rape
Roberts v. United States Jaycees
Sexual harassment
Skinheads and neo-Nazis
Spotted owl controversy
Third Wave, The
thirtysomething
Thompson v. Oklahoma
Toronto bathhouse raids of 1981
Tort reform movement
USA for Africa
Wall Street
Webster v. Reproductive Health Services
Welfare
Williams, Vanessa L.
Women in the workforce
Women's rights

Sports
Arena Football League
Baseball
Baseball strike of 1981
Basketball
Bird, Larry
Boitano, Brian
Boxing
Brett, George
Decker, Mary
Elway, John
Football
Gibson, Kirk
Golf
Goodwill Games of 1986
Gretzky, Wayne
Griffith-Joyner, Florence
Hershiser, Orel
Hockey
Holmes, Larry
Jackson, Bo
Johnson, Magic
Lemieux, Mario
LeMond, Greg
Leonard, Sugar Ray
Lewis, Carl
Louganis, Greg
McEnroe, John
Miracle on Ice
Montana, Joe
Navratilova, Martina
Olympic boycotts
Olympic Games of 1980
Olympic Games of 1984
Olympic Games of 1988
Play, the
Retton, Mary Lou
Rice, Jerry
Rose, Pete
Ryan, Nolan
SkyDome
Soccer
Sports
Taylor, Lawrence
Tennis
Thomas, Isiah
Turner, Ted
Tyson, Mike
Valenzuela, Fernando
Watson, Tom
Wave, the

Television
Bakker, Jim and Tammy Faye
Brokaw, Tom
Cable television
Cagney and Lacey
Cheers
Children's television
CNN
Colorization of black-and-white films
Comedians
Cosby Show, The
Craft, Christine
Dallas
Day After, The
Designing Women
Dynasty
Facts of Life, The
Falwell, Jerry
Family Ties
Fox, Michael J.
FOX network
General Hospital
Golden Girls, The
Hill Street Blues
Home shopping channels
Infomercials
Jennings, Peter
L.A. Law
Letterman, David
Magnum, P.I.
Married . . . with Children
M*A*S*H series finale
Miami Vice
Miniseries
Moonlighting
MTV
Murphy, Eddie
Murray, Bill
Music videos
Network anchors
Pauley, Jane
Rather, Dan
Rivera, Geraldo
St. Elsewhere
Sitcoms
Soap operas
Star Search
Star Trek: The Next Generation
Tabloid television
Talk shows
Tanner '88
Televangelism
Television
thirtysomething
Turner, Ted
Williams, Robin
Winfrey, Oprah
Wonder Years, The

Terrorism
Air India Flight 182 bombing
Anderson, Terry
Beirut bombings
Berg, Alan
Canadian Caper
Iranian hostage crisis
Klinghoffer, Leon
Libya bombing
Pan Am Flight 103 bombing
Terrorism
Tylenol murders
West Berlin discotheque bombing

Theater & Dance
Ballet
Break dancing
Broadway musicals
Cats
Dance, popular
Flashdance
Heidi Chronicles, The
Henley, Beth
Hip-hop and rap
Hwang, David Henry
Jackson, Michael
Literature in Canada
Literature in the United States
Madonna
Mamet, David
Performance art
Phantom of the Opera, The
Shepard, Sam
Theater
Torch Song Trilogy
Wilson, August

Transportation
Air India Flight 182 bombing
Air traffic controllers' strike
Car alarms
Cerritos plane crash
Chrysler Corporation federal rescue
De Lorean, John
Gimli Glider
Iacocca, Lee
Minivans
Pan Am Flight 103 bombing
Sioux City plane crash
Stealth fighter
Voyager global flight
Women’s Issues Abortion Affirmative action Auel, Jean M. Biological clock Cagney and Lacey Central Park jogger case Color Purple, The Craft, Christine
Designing Women Domestic violence Dworkin, Andrea École Polytechnique massacre Fatal Attraction Feminism Ferraro, Geraldine Gender gap in voting Glass ceiling Go-Go’s, The Golden Girls, The Handmaid’s Tale, The Heidi Chronicles, The Henley, Beth Madonna Marriage and divorce Meritor Savings Bank v. Vinson Mommy track O’Connor, Sandra Day Pauley, Jane Pornography Power dressing Rape Ride, Sally Roberts v. United States Jaycees Sauvé, Jeanne Sexual harassment Turner, Tina Valley girls Webster v. Reproductive Health Services Women in rock music Women in the workforce Women’s rights